Out of Order, Still Out of Reach: Navigating Assignment Sequences for Michigan Virtual World Language Courses
https://michiganvirtual.org/research/publications/navigating-assignment-sequences-for-mv-world-language-courses/

In online asynchronous courses, students can submit assignments anytime during the enrollment window, often in any order they like. While previous research has focused on the timing of assignment submissions, Cuccolo & DeBruler (2024) highlighted how submitting assignments out of order is associated with lower course performance in STEM courses. This study extends that research to World Language courses, finding that students’ final course scores decreased as deviations from the pacing guide increased.


Introduction

During the 2023-24 school year, approximately 11% of Michigan’s K-12 students took at least one virtual course. These 154,056 students accounted for over 1,019,000 enrollments, 71% of which came from grades 9-12. In addition, 68% of Michigan school districts reported at least one virtual enrollment (Freidhoff et al., 2024). Despite the number of students opting for online learning, virtual pass rates remain lower than non-virtual ones (Freidhoff, 2015). For example, the overall pass rate for virtual courses during the 2023-24 school year was 63%, compared to 74% for non-virtual courses (Freidhoff et al., 2024). Perhaps this difference can be partially attributed to the various challenges students face that are unique to online education (Johnson et al., 2023). 

Indeed, the flexibility of learning “anytime, anywhere,” particularly in asynchronous courses, requires students to possess or acquire a strong foundation of self-regulation, metacognitive, and time management skills (Johnson et al., 2023; Digital Learning Institute, n.d.). Because such courses are typically open (as opposed to gated by release conditions), students can submit assignments at any time (though typically guided by an end-of-term deadline) and in any order they like. Certain Michigan Virtual courses, such as those for core subject areas and electives, fall into this category, meaning students may submit any assignment at any time and in any order during the course term. To help students experience success in their online courses, Michigan Virtual provides pacing guides against which students can benchmark their progress. Pacing guides outline the order in which students should complete course content; in other words, they show students which assignments to complete each week to stay on track within their course. 

While a growing body of research points to the role that the timing of assignment submissions plays in students’ course performance (Carvalho et al., 2022; DeBruler, 2021; Dunlosky et al., 2013; Kim & Seo, 2015; Lim, 2016a; Michigan Virtual Learning Research Institute, 2019), limited research explores how the order of students’ assignment submissions is associated with course outcomes. The research that has examined submission order has focused on massive open online courses or university students and is thus difficult to generalize to K-12 populations (e.g., Perna et al., 2014; Lim, 2016b). However, Cuccolo & DeBruler (2024) found that submitting assignments out of order was the rule rather than the exception among students enrolled in Michigan Virtual STEM courses, with about 93% of students submitting at least one assignment out of order. Despite being common, this behavior was not beneficial: students who submitted at least one assignment out of order had final course scores that were 9.5 points lower than students who completely adhered to course pacing guides. In short, in Michigan Virtual STEM courses, the order of students’ assignment submissions is associated with their final course scores (Cuccolo & DeBruler, 2024), suggesting that assignment sequencing may be an understudied aspect of pacing with important implications for student outcomes. 

Cuccolo & DeBruler’s original study focused solely on STEM courses, as the highly scaffolded nature of those courses made them an ideal context for examining the impact of pacing guide deviations. Because the amount of scaffolding, the number and type of assignments, and the course structure (linear vs. non-linear) likely vary by content area and individual course, it stands to reason that the extent to which student assignment submission patterns relate to final course scores may differ between subject areas. Preliminary analyses of assignment sequencing patterns in five core subject areas (English Language and Literature, Life and Physical Sciences, Mathematics, Social Sciences and History, and World Languages) revealed that moving out of alignment with course pacing guides was especially common among students enrolled in World Language courses. As such, the current study aimed to obtain a deeper understanding of the prevalence and impact of this behavior in World Language courses. 

Methods

Data & Sample Overview

A subset of highly enrolled Michigan Virtual World Language courses was selected to facilitate generalizability about students’ sequencing behavior within these courses. The table in Appendix B shows the enrollment information used in the selection process and the number of enrollments included in the current study. A member of Michigan Virtual’s Technology Integration team provided enrollment data from the selected courses for Spring 2024 (the most recently completed semester at the time the data were requested and pulled). 

Analysis

Several variables were created to facilitate the analyses and answer the aforementioned research questions. First, because course pacing guides structure assignments sequentially (i.e., Assignments 1, 2, 3, 4, etc.), each student’s assignment submission was benchmarked against the one immediately preceding it—this ‘User-Driven’ variable coded in-sequence assignments with a 0 and out-of-sequence assignments with a 1. The total number of assignments submitted out of order was then calculated for each student. 

Next, to better contextualize students’ deviation from course pacing guides, the number of assignments submitted out of order was divided by the total number of assignments the student submitted and multiplied by 100 (‘Percentage of Assignments Submitted Out of Order’). For example, if a student submitted 50 assignments, 5 of which were out of order, the percentage out of order would be 10 percent. 

The ‘Magnitude’ variable represents the difference between the intended submission order of consecutively submitted assignments and was used to understand the degree to which students submitted assignments out of alignment with the course pacing guides. For example, if a student submitted assignment 5 and then 17, the magnitude value would be 12. Each student’s respective magnitude values for all submitted assignments were averaged (‘Average Magnitude’) to obtain a value that captured, on average, the extent to which they deviated from course pacing guide expectations. 

Finally, based on how students’ assignment submission patterns were categorized, students were assigned to one of two user sequence groups. Students who submitted at least one assignment out of order were assigned to the out-of-sequence group. In contrast, if students submitted all their assignments in order (aligned to pacing guide expectations), they were assigned to the in-sequence group. Appendix A contains the names and definitions of key variables referenced in the report and examples when applicable. Table 1 below provides an example of the data layout for readers. 

Assignment Name | User-Driven | Pacing Guide | Magnitude
11.1 Ein Quiz | 1 | 47 | 25
11.2 Ein Quiz | 0 | 48 | 0
5.3 Das Essen vorbereiten Arbeitsblatt | 1 | 23 | 25
6.3 Mein Selbstbericht | 1 | 27 | 4
6.1 Mahlzeit Diskussion | 1 | 25 | 2
4.3 Eine Aufnahme: Videodiskussion | 1 | 19 | 6
Table 1. Example Data Layout
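To make these variable definitions concrete, below is a minimal sketch of how they could be computed with Python and pandas. It is illustrative only: the column names are invented, the pacing positions for student 1 mirror the last five rows of Table 1, and this is not Michigan Virtual’s actual analysis pipeline.

```python
import pandas as pd

# One row per submission, in submission order; 'pacing_position' is the
# assignment's intended position in the course pacing guide.
subs = pd.DataFrame({
    "student_id":      [1, 1, 1, 1, 1, 2, 2, 2],
    "pacing_position": [48, 23, 27, 25, 19, 1, 2, 3],
})

# Difference between consecutive submissions' pacing-guide positions.
step = subs.groupby("student_id")["pacing_position"].diff()

# 'User-Driven': 0 when a submission is exactly one position after the
# previous one, 1 otherwise. Each student's first submission has no
# predecessor and is treated as in-sequence in this sketch.
subs["user_driven"] = ((step != 1) & step.notna()).astype(int)

# 'Magnitude': distance between consecutive pacing-guide positions for
# out-of-sequence submissions; 0 for in-sequence submissions.
subs["magnitude"] = step.abs().where(subs["user_driven"] == 1, 0).fillna(0)

# Per-student summaries used throughout the report.
per_student = subs.groupby("student_id").agg(
    total_out_of_order=("user_driven", "sum"),
    pct_out_of_order=("user_driven", lambda s: 100 * s.mean()),
    average_magnitude=("magnitude", "mean"),
)
per_student["sequence_group"] = per_student["total_out_of_order"].map(
    lambda n: "out-of-sequence" if n > 0 else "in-sequence"
)
print(per_student)
```

Running the sketch flags student 1’s jumps (e.g., from position 48 to 23, a magnitude of 25) while student 2, who submitted positions 1 through 3 in order, lands in the in-sequence group.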

Results

Sample Description

After removing duplicate enrollments (students who were enrolled in more than one of the courses selected for the study) and students who had completed fewer than 50% of their assignments (removed to ensure reliability, given that the analyses focused on how assignment submission patterns related to course performance), the final sample consisted of 1,873 students. On average, students in the sample had completed about 3.29 (SD = 2.99) online courses. During the semester in which the data were collected (Spring 2024), students carried a Michigan Virtual (MV) online course load of about one course (SD = 0.73). Most students (60.86%, n = 1,140) attended schools where the Non-White School Population was 25% or less. Additionally, about 39.94% of students came from Mid-Low Poverty schools, whereas only 4.91% attended high-poverty schools.

When examining the sample by school locale classification, approximately 25.79% (n = 483) of students were from large suburban areas, followed by 13.61% (n = 255) from rural fringe areas. Regarding entity type, most students (86.92%, n = 1,628) came from LEA (Local Education Agency) schools. American Sign Language 1B had the highest student enrollment of the courses included in the study. Please review the table in Appendix B for a breakdown of the number of students in each course. 

Research Questions

What does students’ assignment sequencing look like in World Language courses?

What percentage of students go out of sequence? Is going out of sequence a common behavior?

The majority of students sampled went out of sequence at least once (n = 1,817, 97.01%). This left only 56 students (2.99%) who completed their World Language course fully aligned with their course pacing guides. As such, going out of sequence appeared to be the norm, implying that students in World Language courses are more likely to deviate from their course pacing guides than adhere to them. 

What are the average and median number of assignments submitted out of sequence?

The total number of assignments submitted out of order ranged from zero (full adherence to the course pacing guide) to 90. The average number of assignments submitted out of order was approximately 31 (SD = 19.90). The median (the middle value of a data set when organized in ascending order) was 29, meaning half of the students sampled submitted fewer than 29 assignments out of order and half submitted more.

What is the average and median percentage of completed assignments submitted out of sequence? 

To put the number of assignments students submitted out of sequence in the context of the number of available course assignments, the percentage of assignments submitted out of sequence was examined. This represents the number of assignments submitted out of sequence divided by the total number of assignments completed and multiplied by 100. While the percentage of assignments submitted out of sequence ranged from zero to 97.70%, the average was 44.70%. The median was slightly higher, with half of the students submitting more than 47% of assignments out of sequence and half submitting fewer than that.

What is the average and median magnitude of assignments submitted out of sequence?

In addition to examining the number of assignments submitted out of sequence, the extent to which assignments were out of alignment with course pacing guides was analyzed. On average, students were about three and a half assignments “off” from the intended pacing guide order (SD = 2.96). This may look like, for example, a student submitting assignment eight when the course pacing guide recommended submitting assignment eleven. While some students moved through the course completely aligned with pacing guide expectations, others were as much as approximately 15 assignments “off.” Please refer to Table 2 for descriptive information about key study variables. 

Variable | Average (SD) | Minimum | Median | Maximum
Final Course Scores | 82.64 (14.59) | 23.81 | 87.73 | 100.00
Dropped Courses | 0.04 (0.26) | 0.00 | 0.00 | 6.00
Completed Courses | 3.29 (2.99) | 1.00 | 3.00 | 36.00
Current Class Load (Enrollment Load) | 1.17 (0.73) | 1.00 | 1.00 | 8.00
Average Magnitude | 3.50 (2.96) | 0.00 | 2.67 | 14.57
Percentage of Assignments Completed | 95.09 (8.49) | 55.22 | 100.00 | 100.00
Percentage of Assignments Submitted Out of Order | 44.70 (25.24) | 0.00 | 47.27 | 97.70
Total # of Assignments Submitted Out of Order | 31.09 (19.90) | 0.00 | 29.00 | 90.00
Table 2. Descriptive Statistics for Key Study Variables

What does the relationship between students’ assignment sequencing and course performance look like in World Language courses?

Correlations helped describe the relationship between students’ assignment sequencing (percentage of assignments submitted out of order, average magnitude) and students’ course performance (final course scores). Correlations show the changes in one variable relative to changes in the other. 

The negative correlation observed between final course scores and the percentage of assignments submitted out of order means the two variables moved in opposite directions: as the percentage of assignments submitted out of order increased, final course scores decreased, and vice versa. A similar relationship was observed between the extent to which assignments were submitted out of order (average magnitude) and final course scores: as students’ magnitudes increased, scores decreased, and as magnitude values decreased, final scores increased. A correlation matrix is provided in Appendix C for those interested in examining the specific correlation coefficients. 

In short, sequencing variables and final course scores moved in opposite directions. An important caveat to these findings is that correlations only describe the relationship between variables and do not isolate the cause of the relationship (i.e., it cannot be said which variable causes the observed relationship).
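For readers curious how such a matrix is produced, the following is a minimal, self-contained sketch in Python with pandas. The toy values and column names are invented for demonstration and are not the study’s data.

```python
import pandas as pd

# Toy per-student table; values and column names are illustrative only.
per_student = pd.DataFrame({
    "final_score":       [88, 92, 75, 60, 81],
    "pct_out_of_order":  [10,  5, 55, 80, 40],
    "average_magnitude": [0.5, 0.2, 4.0, 9.0, 2.5],
})

# Pairwise Pearson correlations, analogous to the matrix in Appendix C.
print(per_student.corr(method="pearson").round(2))
```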

What does student performance look like at each quartile of the percentage of assignments submitted out of order?

To understand how out-of-sequence movement related to course performance, students’ average final course scores at each quartile of the percentage of assignments submitted out of sequence were examined.

As illustrated in Table 3, students in the 1st quartile—those who completed the smallest percentage of assignments out of order—achieved the highest average final course scores (M = 88.3). In contrast, students in the 4th quartile, who submitted the largest percentage of assignments out of sequence, earned the lowest average scores (M = 78.7). This 9.6-point difference between the highest and lowest quartiles equals roughly one letter grade. Notably, the largest drop in scores occurred between the 1st and 2nd quartiles, where average scores declined by 5.8 points, suggesting that even moderate increases in out-of-order submission behavior may be linked to a meaningful decline in academic performance.

What does student performance look like at each quartile of the average magnitude of assignments submitted out of order?

A similar trend emerged when analyzing the magnitude of out-of-order assignment submissions. Students in the 1st quartile—those with the lowest magnitude values—achieved the highest average final course scores (M = 87.3). In contrast, those in the 4th quartile, representing students with the highest magnitude values, had the lowest average scores (M = 78.2). This 9.1-point gap between the first and fourth quartiles is nearly equivalent to a full letter grade. The most pronounced drop occurred between the 2nd and 3rd quartiles, where average final course scores declined by 5.1 points, indicating that even modest increases in the degree of out-of-order submission may be associated with noticeable decreases in academic performance.

Tables 3 and 4 provide additional information about the average final course scores at each quartile of the respective sequencing variables. A similar pattern was observed when looking at median final course scores at each quartile of the percentage of assignments submitted out of order and average magnitude; the reader can review these trends in Table 5.
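The quartile summaries reported in Tables 3-5 can be reproduced with a simple binning-and-aggregation step. Below is a minimal sketch, assuming a pandas DataFrame with one row per student; the toy values and column names are invented.

```python
import pandas as pd

# Toy data; in the study, each row would be one student.
df = pd.DataFrame({
    "pct_out_of_order": [5, 12, 20, 33, 41, 50, 62, 75],
    "final_score":      [95, 90, 88, 84, 82, 80, 74, 70],
})

# Bin students into quartiles of the predictor, then summarize scores per bin.
df["quartile"] = pd.qcut(df["pct_out_of_order"], q=4,
                         labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("quartile", observed=True)["final_score"].agg(["mean", "median"]))
```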

Quartile of Percentage of Assignments Completed Out of Order | 1st (Bottom 25%) | 2nd (50%) | 3rd (75%) | 4th (Top 25%)
Average Final Course Scores | 88.3 | 82.5 | 81.0 | 78.7
Difference from Previous Quartile | n/a | -5.8 | -1.5 | -2.3
Total Difference (Q1 – Q4) | -9.6 |  |  | 
Table 3. Average Final Course Scores Broken Down by Quartile of Predictor: Percentage of Assignments Completed Out of Order
Quartile of Average Magnitude | 1st (Bottom 25%) | 2nd (50%) | 3rd (75%) | 4th (Top 25%)
Average Final Course Scores | 87.3 | 85.1 | 80.0 | 78.2
Difference from Previous Quartile | n/a | -2.2 | -5.1 | -1.8
Total Difference (Q1 – Q4) | -9.1 |  |  | 
Table 4. Average Final Course Scores Broken Down by Quartile of Predictor: Average Magnitude
Quartile | 1st (Bottom 25%) | 2nd (50%) | 3rd (75%) | 4th (Top 25%)
Percentage of Assignments Completed Out of Order | 92.4 | 88.3 | 86.1 | 82.3
Average Magnitude | 92.1 | 90.7 | 84.7 | 81.4
Table 5. Median Final Course Scores Broken Down by Quartile of Predictors

Discussion

The current study highlights the prevalence of submitting assignments out of alignment with course pacing guides in Michigan Virtual self-paced World Language courses. Nearly all students in the sample (97%) deviated from course pacing guides at least once and, on average, submitted approximately 44% of their completed course assignments out of order. These findings exceed those reported in prior studies. For example, Cuccolo & DeBruler (2024) found that 93% of students in Michigan Virtual online STEM courses submitted at least one assignment out of order, with an average of approximately 38% of assignments submitted out of sequence. Additionally, a brief visual inspection of the World Language course data supports anecdotal evidence from instructors: a substantial number of students fail to submit assignments requiring video recordings or scheduled meetings with instructors, assignment types that are especially critical in these particular courses for formative skill assessment and feedback. Taken together, these findings suggest that students in World Language courses may be particularly prone to deviating from the recommended assignment sequence, underscoring the importance of monitoring students’ assignment submission patterns in this context. 

Consistent with Cuccolo & DeBruler’s (2024) findings, this study also observed a negative relationship between moving out of alignment with course pacing guides and final course scores. Specifically, final course scores steadily declined as students submitted a greater percentage of assignments out of order and strayed further from the intended assignment sequence (i.e., higher magnitude). The largest declines in scores occurred between the 1st and 2nd quartiles for the percentage of assignments submitted out of order, and between the 2nd and 3rd quartiles for average magnitude. Practically speaking, this suggests critical thresholds: student grades may decline noticeably once students begin to submit more than about a quarter of their assignments out of order, or once they are more than about one assignment “off” from pacing guide recommendations. As such, these thresholds may serve as useful benchmarks and noteworthy intervention points for teachers and mentors. 

Overall, the findings suggest that both the frequency and degree of out-of-order assignment completion are meaningfully associated with student performance. Students who submitted fewer assignments out of order and stayed closer to the intended sequence (i.e., lower magnitude) earned notably higher final course scores. As students increased the proportion or the extent to which they deviated from the expected assignment order, their average course scores declined by nearly a full letter grade between the lowest and highest quartiles for both predictors. These patterns highlight the potential academic consequences of straying from the designed learning sequence in online courses. As such, teachers and mentors should be mindful of intervening early when monitoring student performance. 

While the current study’s findings provide actionable insights into how assignment submission patterns are associated with course scores, it is essential to note that this study was correlational by design. In other words, the causal agent in the observed relationship between assignment sequencing and final course score is unclear, and other unstudied factors may contribute to the observed relationship. For example, moving out of alignment with course pacing guides may be part of a broader pattern of student behaviors related to performance. Self-regulatory and metacognitive skills, for instance, have been positively associated with student achievement (Xu et al., 2023). These skills encompass students’ ability to engage with a task cognitively, reflect on and evaluate their learning, manage resources, and direct their efforts (Xu et al., 2023). They are increasingly important in an online learning environment, where students have greater autonomy over when, where, and how they engage with course content (Xu et al., 2023). Without a strong foundation in these skills, students may struggle to engage deeply with assignments and feedback, scaffold knowledge, and manage their time appropriately. Thus, self-regulated learning may be a useful lens for better understanding the relationship between assignment submission patterns and student performance. 

To promote adherence to course pacing guides and foster effective course navigation, teachers and mentors should incorporate transparent and proactive communication practices into their routines. In particular, setting course expectations early by helping students understand course structure, workload, pacing, and tips for success can help students prepare for the demands of self-paced online learning (Cuccolo & Green, 2024). Monitoring the gradebook and benchmarking student progress against course pacing guides is also recommended. The thresholds identified in this study (e.g., exceeding about a quarter of out-of-sequence submissions or being more than one assignment “off” from the pacing guide’s intended submission order) can serve as timely indicators for intervention. Personalized feedback remains a highly effective strategy for engaging students in online learning, as it does double duty as both a relationship-building strategy and a way to monitor and motivate academic progress (DeBruler & Harrington, 2024). Regular communication between mentors, teachers, and their students helps bridge information gaps and close feedback loops. Mentors can also help contextualize student behavior for teachers (Cuccolo & DeBruler, 2023). While it may be unrealistic to believe students will complete all assignments in order when enrolled in a self-paced course, teachers and mentors can help encourage pacing behaviors that set students up for success. 

Appendix

Appendix A. Study Glossary
Variable | Definition | Example
User-Driven | Indicates whether a student’s assignment was submitted in or out of alignment with pacing guide expectations. If a student’s current assignment was one greater than the previously submitted assignment, it was considered in-sequence (given a value of “0”); otherwise, it was considered out-of-sequence (given a value of “1”). | If a student submits Assignment 5 and then Assignment 17, Assignment 17 is labeled out-of-sequence because 17 is not one greater than 5.
Total # of Assignments Submitted Out of Order | The total number of assignments a student submitted out of order; the sum of the “1s” in the “User-Driven” column for a student. | If this value is 18, the student submitted 18 assignments out of their intended pacing guide order.
Percentage of Assignments Submitted Out of Order | The total number of assignments the student submitted out of order divided by the total number of assignments the student completed, multiplied by 100. | If a student submitted 50 assignments, 5 of which were out of order, the percentage out of order would be 10%.
Magnitude | The difference between the intended pacing guide order and the actual submission order of consecutive assignment submissions; captures the extent to which students deviated from the pacing guide. | If a student submitted Assignment 5 and then Assignment 17, the magnitude value would be 12.
Average Magnitude | The average of all of a student’s ‘Magnitude’ values. | If a student had magnitude values of 12, 3, 8, and 2, the average magnitude would be 6.25.
Percentage of Assignments Completed | The number of completed assignments divided by the total number of assignments in the course, multiplied by 100. | A value of 70 would indicate a student completed 70% of all available course assignments.
Dropped Courses | The number of courses the student dropped. | A value of two would mean the student dropped two virtual courses.
Completed Courses | The number of courses the student completed. | A value of four would mean the student completed four virtual courses.
Current Class Load | The number of classes the student was taking in the target semester. | A value of two would mean the student took two virtual classes during the Spring 2024 semester.
Appendix B. Enrollment Information For The Current Study
Course Name | Students per Course (n) | Students per Course (%)
American Sign Language 1B | 740 | 39.51%
American Sign Language 2B | 260 | 13.88%
Spanish 2B | 129 | 6.89%
American Sign Language 1A | 113 | 6.03%
Spanish 1B | 111 | 5.93%
French 1B | 87 | 4.64%
French 2B | 84 | 4.48%
German 1B | 75 | 4.00%
German 2B | 67 | 3.58%
Japanese 1B | 62 | 3.31%
Japanese 2B | 32 | 1.71%
Spanish 1A | 24 | 1.28%
Japanese 1A | 15 | 0.80%
American Sign Language 2A | 14 | 0.75%
German 1A | 14 | 0.75%
French 1A | 13 | 0.69%
Spanish 2A | 13 | 0.69%
French 2A | 9 | 0.48%
German 2A | 6 | 0.32%
Japanese 2A | 5 | 0.27%
Total | 1,873 | 100%

Appendix C. Relationship Between Sequencing Variables And Final Course Scores
Variable | Final Course Scores | Average Magnitude | Percentage Complete | Percentage Out of Order
Final Course Scores | 1 | -0.20 | 0.60 | -0.18
Average Magnitude | -0.20 | 1 | -0.19 | 0.72
Percentage Complete | 0.60 | -0.19 | 1 | -0.20
Percentage Out of Order | -0.18 | 0.72 | -0.20 | 1
* Bolded values represent statistically significant relationships

References

Cuccolo, K., & Green, C. (2024). Starting Strong: Understanding Teacher-Student Communication in Online Courses. Michigan Virtual. https://michiganvirtual.org/research/publications/understanding-teacher-student-communication

Cuccolo, K., & DeBruler, K. (2023). Examining Mentors’ Navigation of Online Environments and Use of Student Support Practices. Michigan Virtual. https://michiganvirtual.org/research/publications/examining-mentors-navigation-of-online-environments

DeBruler, K. (2021). Research On K-12 Online Best Practices. Michigan Virtual. 

DeBruler, K., & Harrington, C. (2024). Key Strategies for Supporting Disengaged and Struggling Students in Virtual Learning Environments. Michigan Virtual. https://michiganvirtual.org/research/publications/key-strategies-for-supporting-disengaged-and-struggling-students-in-virtual-learning-environments/

Digital Learning Institute. (n.d.). What is self paced learning? Definition, benefits and tips [Blog post]. Retrieved from https://www.digitallearninginstitute.com/blog/what-is-self-paced-learning-definition-benefits-and-tips

Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14(1), 4-58. https://doi.org/10.1177/1529100612453266

Freidhoff, J. R. (2015). Michigan’s K-12 virtual learning effectiveness report 2013-14. Lansing, MI: Michigan Virtual University. Retrieved from http://media.mivu.org/institute/pdf/er_2014.pdf

Freidhoff, J. R., DeBruler, K., Cuccolo, K., & Green, C. (2024). Michigan’s K-12 virtual learning effectiveness report 2022-23. Michigan Virtual. https://michiganvirtual.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2022-23/

Johnson, C. C., Walton, J. B., Strickler, L., & Elliott, J. B. (2023). Online teaching in K-12 education in the United States: A systematic review. Review of Educational Research, 93(3), 353-411. https://doi.org/10.3102/00346543221105550

Kim, K. R., & Seo, E. H. (2015). The relationship between procrastination and academic performance: A meta-analysis. Personality and Individual Differences, 82, 26-33. https://doi.org/10.1016/j.paid.2015.02.038

Lim, J. M. (2016a). Predicting successful completion using student delay indicators in undergraduate self-paced online courses. Distance Education, 37(3), 317-332. https://doi.org/10.1080/01587919.2016.1233050

Lim, J. M. (2016b). The relationship between successful completion and sequential movement in self-paced distance courses. International Review of Research in Open and Distributed Learning, 17(1), 159-179. https://doi.org/10.19173/irrodl.v17i1.2167

Michigan Virtual Learning Research Institute. (2019). Pacing Guide For Success In Online Mathematics Courses. https://michiganvirtual.org/blog/pacing-guide-for-success-in-online-mathematics-courses/

Perna, L. W., Ruby, A., Boruch, R. F., Wang, N., Scull, J., Ahmad, S., & Evans, C. (2014). Moving through MOOCs: Understanding the progression of users in massive open online courses. Educational Researcher, 43(9), 421-432. https://doi.org/10.3102/0013189X14562423

Xu, Z., Zhao, Y., Zhang, B., Liew, J., & Kogut, A. (2023). A meta-analysis of the efficacy of self-regulated learning interventions on academic achievement in online and blended environments in K-12 and higher education. Behaviour & Information Technology, 42(16), 2911-2931. https://doi.org/10.1080/0144929X.2022.2151935

Michigan’s K-12 Virtual Learning Effectiveness Report, 2023-24
https://michiganvirtual.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2023-24/

Based on pupil completion and performance data reported by public schools to the Michigan Department of Education (MDE) or the Center for Educational Performance and Information (CEPI), this report highlights 2023-24 enrollment totals, completion rates, and the overall impact of virtual courses on K-12 pupils. Detailed findings are presented in sections on schools, courses, and students, as well as in over 90 data tables.


Introduction

This report presents an analysis of information on virtual learners reported by schools to the state and shares findings in a highly consumable way to aid the evaluation of virtual learning programs. This year’s report is the 12th edition of this annual publication and completes 14 years of data on K-12 virtual learning in Michigan.

The report is organized into several sections. Each section is meant to capture the essential findings without being overly intensive; however, data tables have been included in the appendices to provide those interested with more in-depth information. Information about the report’s methodology is captured in Appendix A. Please note that in some tables and figures, percentages may not sum to 100% due to rounding.

Schools

Fast Facts

  • 613 school districts reported at least one virtual enrollment. This represented approximately 68% of Michigan school districts.
  • 1,421 schools reported at least one virtual enrollment. This is a decrease of approximately 3.5% compared to last year’s value of 1,475.
  • 10% of this year’s schools did not report any virtual enrollments the prior year. These 144 schools added 27,719 enrollments with a 77% pass rate.
  • 90% of this year’s schools also reported virtual enrollments last year. They accounted for over 991,000 enrollments with a pass rate of 63%.
  • 198 schools that offered virtual learning the prior year did not report any for this year.
  • 56% of the 1,421 schools with virtual enrollments had 100 or more virtual enrollments. These higher-volume schools accounted for 98% of virtual enrollments.
  • 79% of schools with virtual enrollments had a general education school emphasis; 19% had an alternative education emphasis.
  • 88% of schools with virtual learning were LEA schools.
  • LEA schools accounted for 60% of the virtual enrollments; PSA schools generated 38% of the virtual enrollments.
  • Approximately 51% of virtual enrollments came from schools with part-time virtual learning options.
  • LEA schools represented 72% of the full-time virtual schools.
  • 71% of virtual enrollments came from students in grades 9-12.
  • 38% of virtual enrollments came from suburban schools, the most of any locale.
  • Schools with a general education emphasis had a 72% virtual pass rate, outperforming those with an alternative education emphasis, which had a pass rate of 51%.
  • 29% of schools had a school-wide virtual pass rate of 90% to 100%, an increase of one percentage point from last year.

Number of Districts and Schools

For the 2023-24 school year, 613 districts reported having at least one virtual enrollment. This represented approximately 68% of the 899 Michigan public school districts for the year. See the MI School Data Report for a breakdown of the district count. Within those districts, 1,421 schools reported virtual enrollments, 54 fewer than the prior year. When looking over the last two years, schools fell into three categories, which are also captured in Table B1:

  • Leaving – 198 schools had virtual enrollments the prior year (2022-23) but did not report any in 2023-24. Last year, those schools accounted for a total of 27,830 virtual enrollments and had a pass rate of 70%.
  • Returning – 1,277 schools (90% of this year’s dataset) reported virtual enrollments in both 2022-23 and 2023-24. This year, these schools generated 991,942 enrollments and had a pass rate of 63%, three percentage points lower than their rate in 2022-23.
  • New – 144 schools reported virtual enrollments this year after reporting none last year. Those schools accounted for 27,719 enrollments, with a pass rate of 77%.

Of this year’s enrollments, 394,331 came from 30 schools that each reported increases of 1,000 or more enrollments over 2022-23. On the other hand, 27 schools reported decreases of 1,000 or more virtual enrollments this year; despite these declines, they still yielded 65,155 virtual enrollments. See Table B2. These declines are less drastic than in previous years, suggesting that volatility from the pandemic may be starting to slow. About 17.4% of schools with enrollments in both years saw their pass rates increase by 10 or more percentage points from the prior year. See Table B3.

By Grade Level

There were 1,019,661 virtual enrollments across the 1,421 schools. Students in 12th grade generated the most virtual enrollments (249,246), representing 24% of all virtual enrollments. There continued to be a smaller percentage of high school virtual enrollments than before the pandemic. In the 2019-20 school year, 81% of the virtual enrollments came from students in high school; this year, high school enrollments accounted for approximately 71% of the virtual enrollments. It seems likely that this percentage will continue moving upward over the next several years.

The overall pass rate for virtual enrollments was 63%, a decrease of two percentage points from the prior year, which may be expected as high school enrollments continue to rebound. Pass rates by grade ranged from a high of 82% in 1st grade to a low of 45% in 9th grade. See Table G1 for a more specific breakdown of all the completion statuses.

This year, the elementary grades tended to see larger pass rate decreases than last year (two to eight percentage points), whereas the middle school grades (6th, 7th, and 8th) saw increases of one to four percentage points. Among the high school grades, 9th and 10th grade saw small decreases (three and two percentage points, respectively), while 11th and 12th grade saw small increases (one and two percentage points, respectively). See Table B4 for more information.

The fairly consistent pattern of a higher pass rate in non-virtual coursework continued. For 2023-24, virtual learners had a 63% pass rate in their virtual courses but a 74% pass rate for their non-virtual coursework. See Table B5. As a pre-pandemic comparison, the 2019-20 school year virtual pass rate was 12 percentage points lower than those students’ non-virtual pass rate.

By School-Level Virtual Pass Rate

Of the 1,421 schools with virtual enrollments, 411 (29%) had school-level virtual pass rates of 90% to 100%, one percentage point higher than the prior year. Approximately 61% of the schools (865) had virtual pass rates of 70% or higher, three percentage points higher than the prior year. See Table B6. Thus, even though the overall pass rate in the state dropped year over year, a higher percentage of schools experienced high levels of student performance.

By Entity Type

LEA schools and PSA schools accounted for almost all the virtual enrollments, with 60% and 38%, respectively. Virtual enrollments came from 1,251 (88%) LEA schools, while only 128 (9%) of the schools were PSAs. See Table B7. LEA schools had a higher pass rate (64%) than PSA schools (60%), continuing last year’s trend, although pass rates for both entities dropped. See Table B8 or, for a more in-depth look at the completion statuses, see Table G2.

To put this in perspective, as of January 2025, roughly 44% of all LEA schools had a virtual program, compared to approximately 23% of ISD schools. The percentages of State and PSA schools with virtual programs were similar at approximately 33 and 35%, respectively.

By Full-Time Virtual Schools

The number of full-time virtual schools (76) decreased by one from the prior year. Fifty-five of the 76 full-time virtual schools (72%) were LEA schools, while PSA schools (18) accounted for 24%. See Table B9. Despite the sizable difference in the number of schools, PSAs reported more of the statewide full-time virtual enrollments (62%) than LEAs (37%). PSA full-time virtual learners also saw higher virtual pass rates (62%) than their counterparts in LEA schools (55%). See Table B10 and Table G3. Overall, the number of virtual enrollments from full-time virtual schools increased from 449,188 in 2022-23 to 499,793 this year. Approximately 49% of the virtual enrollments came from full-time virtual learners.

A quick note about full-time virtual schools: Historically, full-time virtual schools have only provided students with 100% of their learning online. Thus, it was safe to designate all enrollments from such a school as being part of a full-time virtual program. Over the last several years, however, LEAs have started to add full-time virtual options to their offerings. In some cases, this is a separate school, which makes it analogous to cyber schools. However, it seems that schools are increasingly offering multiple forms of online learning (“Full Virtual,” “Face Virtual,” and “Supplemental Virtual”) from the same building code. See page 15 of the Educational Entity Master Glossary for more information on these field values. This means that some schools report various forms of virtual (and sometimes non-virtual) learning from a single building code. Case in point, 12% of the enrollments from virtual learners in LEA full-time programs were not flagged as being delivered virtually, indicating what may be more of a hybrid approach.

By Part-Time Virtual Schools

About 95% of the schools offering virtual learning do so to supplement their face-to-face course offerings. These 1,398 schools, referred to in this report as part-time virtual schools, were predominantly LEA schools (89%). See Table B11. Eighty-eight percent of the part-time virtual students were enrolled through LEA schools, and 11% through PSA schools. LEA schools accounted for 431,372 virtual enrollments or 83% of the part-time enrollments. In total, enrollments from part-time virtual schools accounted for approximately 51% of all the virtual enrollments for the year. LEA schools had a pass rate of 68%, whereas PSA schools had a pass rate of 54%. Overall, the pass rate for the part-time virtual schools (67%) was seven percentage points higher than the rate for the full-time virtual schools (60%). See Table B12 and Table G4.

By School Emphasis

Seventy-nine percent of schools with virtual learning were designated as General Education and produced 608,000 (60%) virtual enrollments. Schools with Alternative Education as their emphasis accounted for 409,964 (40%) of the virtual enrollments. See Table B13. There was a considerable difference in virtual pass rates between these two types of schools. General Education schools had a 72% virtual pass rate, whereas Alternative Education schools had a 51% virtual pass rate (see Table B14 and Table G5), though this varied by entity type. LEA schools, for instance, had a 76% virtual pass rate for General Education schools and a 53% virtual pass rate for Alternative Education schools. See Table B15.

By Number of Virtual Enrollments

Fifty-six percent of schools with virtual enrollments had 100 or more virtual enrollments. These schools were responsible for 98% of the virtual enrollments (1,000,507). See Table B16.

Interestingly, last year’s trend of schools with fewer virtual enrollments per student performing better was absent. Instead, the longer-running pattern continued in which schools averaging 1-2 or 5+ enrollments per student generally outperformed those averaging 3-4 enrollments per student. Indeed, 40% of schools with an average of one to two virtual enrollments per virtual learner had a pass rate of 90-100%, and 40% of schools with an average of five or more virtual courses per student also had a pass rate between 90-100%. See Table B17.

By Locale

Suburban schools represented 36% of schools with virtual enrollments; Rural settings provided the second most at 34%. Suburban schools also tallied the largest percentage of the virtual enrollments at 38%, with Rural schools next at 30%. See Table B18. In each of the four locales, schools with 100 or more virtual enrollments accounted for the largest percentage of schools. See Table B19. Virtual pass rates varied by locale. City schools, which had the highest virtual pass rate in 2022-23, had the lowest pass rate this year at 62%, an eleven-percentage-point drop, while Suburban schools had the highest at 68%. See Table B20. On the other hand, in both the Rural and Suburban locales, 47% of schools achieved building-wide virtual pass rates of 80% or higher. See Table B21. For more information about locales, including definitions, please see pages 23-24 of the Educational Entity Master Glossary.

By School Free or Reduced-Price Lunch Categories

Schools were categorized into one of four categories based on the percentage of all learners at the school (not just virtual learners) who qualified for free or reduced-price (FRL) meals; a brief illustrative sketch of this banding follows the list:

  • Low FRL (<=25%)
  • Mid-Low FRL (>25% to <=50%)
  • Mid-High FRL (>50% to <=75%)
  • High FRL (>75%)
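As a rough illustration of the banding above, the following minimal pandas sketch assigns hypothetical school-level FRL percentages to the four categories. The data and column names are invented for demonstration.

```python
import pandas as pd

# Hypothetical school-level FRL percentages (share of all learners qualifying).
schools = pd.DataFrame({"frl_pct": [12.0, 30.5, 50.0, 68.0, 90.0]})

# Bin edges mirror the report's categories: <=25, >25-50, >50-75, >75.
schools["frl_category"] = pd.cut(
    schools["frl_pct"],
    bins=[0, 25, 50, 75, 100],
    labels=["Low FRL", "Mid-Low FRL", "Mid-High FRL", "High FRL"],
    include_lowest=True,  # so a school at exactly 0% still lands in Low FRL
)
print(schools)
```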

Similar to last year, none of the categories had 50% or more of their schools report virtual learners; Mid-High FRL had the highest percentage at 47%. The elevated numbers of schools from each category with virtual learners may be attributable to the pandemic and will likely continue to stabilize in the coming years. See Table B22.

While High FRL schools represented only 37% of schools with virtual learners (369), they accounted for 44% of the virtual enrollments. Mid-High FRL schools accounted for 36% of the enrollments. Low FRL schools, on the other hand, reported less than 5% of the virtual enrollments. The virtual pass rate for Low FRL schools was 85% compared to 57% for Mid-High FRL and 60% for High FRL schools. See Table B23.

By High Enrollment Schools

There were over 200 schools that had 1,000 or more virtual enrollments. It is worth noting that these high enrollment schools were defined less by having a larger number of virtual students (88,136 versus 67,835) than by the number of virtual enrollments per student (8.9 virtual courses per student in the high enrollment schools compared to 3.5 in the non-high enrollment schools).

For high enrollment schools, both LEAs (153 schools) and PSAs (46 schools) had virtual pass rates of 61%. The corresponding virtual pass rates for schools with fewer than 1,000 enrollments moved in opposite directions: the LEA schools’ virtual pass rate was nine percentage points higher (70%), whereas the PSA schools’ was seven percentage points lower (54%). See Table B24 and Table B25.

There was close to a 50-50 split between Alternative Education (104) and General Education (100) high enrollment schools, with virtual pass rates of 50% and 70%, respectively. In both cases, students from non-high enrollment schools performed better, by three and five percentage points, respectively. See Table B26 and Table B27.

Courses

Fast Facts

  • Just over 1 million virtual enrollments were taken by Michigan K-12 students; the overall pass rate for virtual enrollments was 63%.
  • Virtual enrollments were spread across 1,091 different course titles.
  • 66% of virtual enrollments occurred in the core subject areas of English Language and Literature, Mathematics, Life and Physical Sciences, and Social Sciences and History.
  • The course titles with the highest enrollments for each core subject were:
    • English Language and Literature: English 9, English 10, English 11, and English 12
    • Mathematics: Geometry, Algebra I, Algebra II, and Consumer Mathematics
    • Life and Physical Sciences: Biology, Chemistry, Earth Science, and Physical Science
    • Social Sciences and History: U.S. History—Comprehensive, World History and Geography, Economics, and World History—Overview

Number of Courses

The 1,019,661 virtual enrollments came from 1,091 different course titles, as determined by unique SCED codes.

Courses by Subject Area

English Language and Literature was, again, the subject area with the highest number of virtual enrollments (186,987)—18% of all virtual enrollments. In fact, since 2013-14, English Language and Literature has consistently been the highest enrollment subject area. Mathematics (17%), Social Sciences and History (16%), and Life and Physical Sciences (15%) were the next highest enrollment subject areas. In high enrollment subject areas (greater than 75,000 virtual enrollments), virtual pass rates varied from a low of 59% in Mathematics to a high of 64% in both Physical, Health, and Safety Education and Social Sciences and History. See Table C1 and Table G6. Six of the 23 subject areas (Agriculture, Food, and Natural Resources; Engineering and Technology; Manufacturing; Miscellaneous; Nonsubject Specific; and Transportation, Distribution, and Logistics) had virtual pass rates that were equal to or greater than the non-virtual pass rates for these students. See Table C2. For comparison, in the years just prior to the pandemic, only one or two of the subject areas saw equal or better performance in the virtual courses.

Highest Virtual Enrollment Courses

For English Language and Literature, the virtual courses with the highest enrollments were 9th, 10th, 11th, and 12th grade English/Language Arts. Of those four, the pass rate was lowest for 9th grade English/Language Arts (46%) and rose fairly consistently with each subsequent grade level, finishing at 63% for 12th grade English/Language Arts. The next three course titles were at the 6-8 grade level, followed by a multi-grade course and two at the K-5 level (grade 1 and kindergarten). See Table C3.

Geometry, Algebra I, and Algebra II had the most Mathematics enrollments, each having over 27,000. Middle school Mathematics courses ranged from 6,900 to 8,800 enrollments. The pass rate across the top 10 most enrolled-in virtual Mathematics courses ranged from a low of 46% for Algebra I to a high of 77% in Consumer Mathematics. See Table C4.

Biology (34,536), Chemistry (20,687), and Earth Science (14,904) were the highest enrollment course titles, responsible for 10% or more of the virtual enrollments in Life and Physical Sciences courses. Three others—Physical Science, Earth and Space Science, and Environmental Science—each had more than 7,000 enrollments. Earth Science had the lowest pass rate (52%) of those in the top 10; the highest was 73% in both Science (grade 6) and Science (grade 7). See Table C5.

For Social Sciences and History, both U.S. History–Comprehensive (22,878) and World History and Geography (19,245) yielded 10% or more of the virtual enrollments. Three other titles had more than 10,000 enrollments (Economics, World History—Overview, and U.S. Government—Comprehensive). Pass rates for the top 10 most enrolled in courses ranged from a low of 49% in World History and Geography to a high of 77% in Psychology. See Table C6.

Thirty-three AP courses were taken virtually in 2023-24. There were just under 4,900 virtual AP enrollments, up from around 4,000 enrollments the prior year. AP Psychology was the most popular course, accounting for 17% of the enrollments. The pass rate for AP courses taken virtually was 87%. See Table C7. The pass rate for non-virtual AP courses taken by virtual learners was 95%.

There were just over 2,600 students who took at least one AP course virtually. For close to half (45%) of these students, AP courses were the only virtual courses taken. Students whose only virtual courses were AP accounted for over 2,200 enrollments or about 45% of the virtual AP enrollments for the year. These students had a virtual pass rate of 91%, four percentage points higher than the state virtual AP pass rate. In addition, of the 925 schools that had virtual enrollments in 9th, 10th, 11th, or 12th grade, 269 schools (29%) had virtual AP enrollments.

Subject Area Enrollments by Locale

Course enrollment patterns were quite consistent across locales. For instance, in Social Sciences and History, Life and Physical Sciences, and Physical, Health, and Safety Education, the difference in the percentage of virtual enrollments across the locales (Rural, Town, Suburb, and City) was within one to two percentage points. See Table C8. However, pass rates in virtual courses varied across subject areas and locale. For instance, in English Language and Literature, the pass rates ranged from City and Rural at 58% and 60%, respectively, up to 66% for Suburban schools. These trends—City and Rural schools lagging behind the performance of students in other locales and Suburban schools outperforming students in other locales—were also true for the other core subjects of Mathematics, Life and Physical Sciences, and Social Sciences and History. See Table C9. Last year, it was Town and Rural schools that followed the lowest-performing locale trend.

Subject Area Enrollments by Student Sex

Males and females enrolled in subject areas in similar proportions. In the four highest enrollment subject areas (English Language and Literature, Mathematics, Life and Physical Sciences, and Social Sciences and History), the proportion of enrollments from males and females was the same or within one percentage point. Pass rates, however, showed more variability by student sex: in 15 of the 19 subject areas with reported pass rates for both sexes, females outperformed males, a trend that has been consistent in past years. Overall, females had a 65% virtual pass rate, whereas males had a 62% pass rate. See Table C10.

Courses by Virtual Method

Schools classified virtual courses into one of three methods: Blended Learning, Digital Learning, or Online Learning. See pages 346 and 347 of the Michigan Student Data System Collection Details Manual Version 4.0.

  • Blended Learning – A hybrid instructional delivery model where pupils are provided content, instruction, and assessment at a supervised educational facility where the pupil and teacher are in the same physical location and in part through internet-connected learning environments with some degree of pupil control over time, location, and pace of instruction. For a course to be considered blended, at least 30% of the course content is delivered online.
  • Digital Learning – A course of study that is capable of generating a credit or a grade that is provided in an interactive internet-connected learning environment that does not contain an instructor within the online environment itself. There may be a teacher of record assigned to the course, but this teacher does not provide instruction to students through the online environment. For a course to be considered online as opposed to blended, all (or almost all) the course content is delivered online.
  • Online Course – A course of study that is capable of generating a credit or a grade that is provided in an interactive internet-connected learning environment, where pupils are separated from their teachers by time, location, or both. For a course to be considered online as opposed to blended, all (or almost all) the course content is delivered online.

Blended Learning enrollments accounted for 7% of the virtual enrollments and had a pass rate of 68%. Digital Learning totaled 9% of the enrollments with a 63% pass rate. Online courses represented most of the enrollments (84%) and yielded a pass rate of 63%. Perhaps worth noting is the contrast between the 1,273 schools (128,675 students) with online course enrollments and the 188 schools (11,590 students) with blended learning enrollments. See Table C11.

Students

Fast Facts

  • Over 154,000 K-12 students took at least one virtual course, which represented 11% of Michigan public school students.
  • Elementary and middle school students each tended to represent about 2% to 5% of students per grade; high school students represented 13% to 27% per grade.
  • 52% of virtual learners passed all their virtual courses. 17% of virtual learners did not pass any of their virtual courses.
  • Of the over 26,000 students who did not pass any of their virtual courses, 33% took only one or two courses. More than half of these students took and did not pass five or more virtual courses, and 18% took and did not pass 11 or more virtual courses.
  • Female students had a slightly higher pass rate (65%) than did males (62%).
  • Students in poverty made up the majority of virtual learners (64%) and virtual enrollments (71%). Students in poverty also had a lower pass rate (58% vs. 77%).
  • Part-time virtual learners had higher pass rates (67%) compared to full-time virtual learners (60%).
  • Students using special education services made up 13% of the virtual learners.
  • Pass rates were highest for students taking the fewest virtual courses. Students taking one to two virtual courses had a pass rate of 80% whereas those taking five or more had virtual pass rates of 61%.
  • White students represented 62% of virtual students; African American or Black students were 19%.
  • Over 840,000 virtual enrollments were from students whose districts were stable (all enrollments from the same district) throughout the year. These enrollments had a virtual pass rate of 69%.

By Grade Level

For the 2023-24 school year, 154,087 Michigan K-12 students, approximately 11% of students in the state, took at least one virtual course. This represents approximately a 3% decrease from the previous year and a 26% decrease from 2021-22. Seventy-five percent of virtual learners came from the high school grades. Each elementary and middle school grade level tended to account for around 2% to 5% of the virtual learners, and each high school grade level for between 13% and 27%. See Table D1.

By Student Sex

There were slightly more females (78,908) enrolled in virtual courses than males (75,209), though from a percentage perspective, each represented about half of the population. Females had a pass rate three percentage points higher than males (65% compared to 62%), continuing the trend seen in past years of females outperforming their male counterparts on this measure. See Table D2 and Table G7.

By Race/Ethnicity

White students represented 62% of virtual students, with African American or Black students making up the second-largest group at 19%. While African American or Black students represented 19% of virtual students, approximately 60% of schools enrolled at least one African American or Black student (93% of schools enrolled at least one White student). Asian students had the highest pass rate at 80%. See Table D3 and Table G8. These demographics are similar to the statewide K-12 demographics for 2023-24. See the Student Enrollment Count Report.

By Poverty Status

Sixty-four percent of virtual learners were classified as living in poverty. This is 1 percentage point higher than the prior year and approximately 10 percentage points higher than the percentage of K-12 students statewide who were economically disadvantaged. See the Student Enrollment Count Report. Students living in poverty took 71% of the virtual enrollments for the year. When looking from a school-level perspective, 93% of Michigan schools enrolled at least one student in poverty in a virtual course. Comparatively, 86% of schools enrolled at least one student who was not in poverty in a virtual course. The pass rate for students in poverty (58%) was 19 percentage points lower than that for students who were not in poverty (77%). See Table D4 and Table G9. In 2022-23, the performance gap was 17 percentage points.

Prior to the pandemic, the data consistently showed that students in poverty performed better in their non-virtual courses. The 2020-21 and 2021-22 school years deviated from that pattern. In 2021-22, we saw that students in poverty had a higher pass rate in their virtual courses (64%) than they did in their non-virtual courses (62%). For the 2022-23 year, this trend was reversed and students in poverty did better in their non-virtual courses (64% compared to 60%). This remained consistent for the 2023-24 year, with both students in poverty and students not in poverty performing better in their non-virtual courses. See Table D5.

Seventy-four percent of full-time virtual learners were in poverty compared to 60% of part-time virtual learners. The pass rate for full-time virtual learners in poverty was 56% compared to 60% for part-time virtual learners. See Table D6.

To get a sense of how the poverty level of schools might impact virtual learning patterns, we categorized schools into one of four categories based on the percentage of all learners at the school (not just virtual learners) who qualified for free or reduced-price lunch (FRL); a brief sketch of this binning follows the list:

  • Low FRL (<=25%)
  • Mid-Low FRL (>25% to <=50%)
  • Mid-High FRL (>50% to <=75%)
  • High FRL (>75%)

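For readers replicating this grouping, here is a minimal sketch of the binning logic in Python (the function name and the 0-100 percentage input are illustrative assumptions, not part of the state data specification):

```python
def frl_category(pct_frl: float) -> str:
    """Bin a school's schoolwide FRL percentage (0-100) into the four
    categories used in this report; boundaries follow the list above."""
    if pct_frl <= 25:
        return "Low FRL (<=25%)"
    if pct_frl <= 50:
        return "Mid-Low FRL (>25% to <=50%)"
    if pct_frl <= 75:
        return "Mid-High FRL (>50% to <=75%)"
    return "High FRL (>75%)"

# Example: a school where 62% of all learners qualify for FRL
print(frl_category(62))  # Mid-High FRL (>50% to <=75%)
```
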
About 6% of all Michigan K-12 students who attended Low FRL schools were virtual learners. Nine percent of the state’s students in Mid-Low FRL schools, and 12% of those in Mid-High FRL schools, were virtual learners. Seventeen percent of students in High FRL schools took virtual courses in the 2023-2024 school year. See Table D7. Although overall virtual enrollments have steadily decreased since their pandemic highs, this pattern has remained relatively stable: from 2020-21 through the current year, schools with higher percentages of students qualifying for FRL also saw higher percentages of virtual learners.

By Special Education Status

Students using special education services made up 13% of the virtual learners and 14% of the virtual enrollments. These percentages are similar to the statewide percentage of students using special education services (14%) for the 2023-24 school year. See the Student Enrollment Counts Report. Seventy-six percent of Michigan schools enrolled a student using special education services in a virtual course. Students using special education services had a virtual pass rate of 56% compared to 64% for those who did not. See Table D8 and Table G10.

Table D9 shows how virtual enrollments varied by students’ primary disability. Just over 7,900 students had “Specific Learning Disability” listed as their primary disability, translating to 39% of the virtual learners receiving special education services. The second and third largest groups were students with Other Health Impairments (4,451) and Emotional Impairment (2,667), representing 22% and 13%, respectively, of virtual learners receiving special education services. Students with Physical Impairment had the highest virtual pass rate at 79%.

Table D10 shows how the percentage of virtual learners using special education services by primary disability compares to the overall state rates. For instance, only about five percent of the state’s students with an IEP have “Emotional Impairment” listed as their primary disability; however, 26% of those students ended up taking at least one virtual course in 2023-24. Together, these two tables can assist in tracking how virtual learning is being used by students with specific disabilities and how their performance follows.

By Home-School / Nonpublic Student Status

Table D11 shows virtual learning data for home-schooled and nonpublic students enrolling in a public school to augment their education. There were just over 7,500 such students, and this group of students generated over 34,000 virtual enrollments, an increase of approximately 14,000 enrollments from the 2022-23 year. These students had a 93% virtual pass rate.

By Full-Time or Part-Time

Thirty-three percent of students (50,803) were enrolled in cyber or full-time virtual schools. Students in these schools accounted for 499,793 or 49% of the virtual enrollments for the year. The pass rate for full-time virtual students was 60%. Sixty-eight percent of virtual learners were part-time virtual learners, taking some courses virtually to supplement their face-to-face schedule. This subset made up 51% of the virtual enrollments and had a pass rate of 67%. See Table D12. The 67% virtual pass rate was six percentage points lower than the non-virtual pass rate for these students. See Table D13.

Another way to conceptualize full/part-time status is to look at the percentage of a student’s enrollments that were delivered virtually. Many students (67,800) had 75% or more of their enrollments reported as being delivered virtually. Examination of pass rates showed that students who had fewer than 25% of their enrollments delivered virtually, as well as those who had 75% or more, outperformed the students in the middle two quartile groups. See Table D14. Table D15 and Table D16 show how the percentage of students, enrollments, and pass rates changed for LEA schools and PSA schools, respectively.

By Mobility Status

For the fourth consecutive year, mobility data were included as part of the data set. The mobility variable included the following statuses: stable, incoming, or outgoing. According to MI School Data, a student is marked as stable if he or she is in the same school for all collections for the school year, incoming students are those who transferred in any time after the fall count day, and outgoing students are those who were present for the fall count day but not subsequent ones. Some of the enrollments did not include information on this variable and were listed in the data tables as “Missing.” More information about this variable is available on the MI School Data Student Mobility page. Click on the About this Report down arrow on that page and then click About the Data to view definitions.

When it came to district stability, over 840,000 (83%) of the virtual enrollments were classified as stable. The pass rate for stable enrollments was 69%. Incoming enrollments to a district represented 7% of the virtual enrollments and had a pass rate of 49%. See Table D17.

When looking at mobility from a poverty perspective, we get a more nuanced picture. Eighty percent of virtual enrollments from students in poverty were stable compared to 90% for students who were not in poverty. The pass rate for stable, in poverty enrollments was 64% but rose to 80% for stable, not in poverty enrollments. For incoming virtual enrollments, there was a nine percentage point advantage for students who were not in poverty (48% for students in poverty v. 57% for those not in poverty). See Table D18.

Looking at mobility from a locale perspective showed somewhat similar virtual enrollment percentages across geographies. Rural schools had the lowest percentage of stable enrollments at 82%. Town schools were next at 83% followed by City schools at 84%. Suburban schools reported 85% of their enrollments as stable. See Table D19. Virtual pass rates showed a similar pattern. Stable enrollments from Rural schools had a 69% pass rate whereas the pass rate was 73% for Suburban schools. The incoming pass rates tended to lag the stable pass rates regardless of the locale. See Table D20.

A final mobility dimension explored was how enrollment and performance varied across full-time and part-time virtual schools. Full-time virtual or cyber schools had a lower percentage of their virtual enrollments designated as stable compared to part-time (77% v. 88%). The full-time pass rate for stable enrollments also lagged that of students from part-time virtual programs (66% v. 71%). See Table D21.

By Non-Virtual Course Performance

Part-time virtual learners with at least three non-virtual courses were classified into one of three categories based on their success in those non-virtual courses. The three categories were:

  • Passed all Non-Virtual Courses
  • Did Not Pass 1 or 2 Non-Virtual Courses
  • Did Not Pass 3 or More Non-Virtual Courses

In total, 82% of part-time virtual learners had three or more non-virtual enrollments. Of that group, 45% of students passed all their non-virtual courses, 18% did not pass one or two, and 37% did not pass three or more. There were clear differences in virtual pass rates between the three categories. Students passing all their non-virtual courses had an 84% virtual pass rate. Students who did not pass one or two non-virtual courses had a virtual pass rate of 70%, and those with the lowest non-virtual success had a virtual pass rate of only 45%. See Table D22.
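
A minimal sketch of this classification, assuming a precomputed count of non-virtual courses each eligible student did not pass (the function name is illustrative):

```python
def non_virtual_category(courses_not_passed: int) -> str:
    """Bucket a part-time virtual learner who took three or more
    non-virtual courses by how many of those courses they did not pass."""
    if courses_not_passed == 0:
        return "Passed all Non-Virtual Courses"
    if courses_not_passed <= 2:
        return "Did Not Pass 1 or 2 Non-Virtual Courses"
    return "Did Not Pass 3 or More Non-Virtual Courses"
```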

By Virtual Course Performance

Fifty-two percent of virtual learners passed every virtual enrollment they took. This was the same as the prior year. Seventeen percent did not pass any of their virtual enrollments, and 31% passed some, but not all of their virtual enrollments. Students who passed all their virtual courses were responsible for 37% of the virtual enrollments. Students with mixed success generated 48% of the enrollments, and students who did not pass any of their virtual courses accounted for 15% of the virtual enrollments (compared to 14% in 2022-23). See Table D23.

For the students who did not pass any of their virtual courses, 33% only took one or two virtual courses. On the other hand, over 15,000 students did not pass five or more virtual courses, and close to 5,000 students did not pass 11 or more virtual courses. See Table D24 and Table G11. This is the first time since the 2018-19 school year that the percentage of virtual learners who did not pass any virtual courses and failed 11 or more exceeded 3% of the population. Further analysis of the students who failed 11 or more virtual courses showed that 89% had their data reported by a single school. Close to 70% of these students came from full-time virtual programs. Over 800 of these students used special education services (17%), and over 4,000 (85%) were in poverty.

What Table G11 makes clear is that for students who did not pass any of their virtual enrollments, withdrawals were rampant. For the virtual enrollments from students who did not pass any of their virtual enrollments, 47% had a “Withdrawn” status (exited, failing, or passing), and another 24% were classified as “Incomplete.” For those taking 11 or more virtual courses, 39% had a “Withdrawn” status, and 36% were marked “Incomplete.” Only 28% and 24% of those virtual enrollments, respectively, were actually classified as “Completed/Failed.” Please see the section on Pass Rate Calculations for more elaboration on the impact of such issues on pass rates.

By Virtual Usage

Continuing pre-pandemic trends, virtual learners had the highest pass rates when they took one or two virtual courses. Students taking one to two virtual courses had a pass rate of 80% compared to a pass rate of 73% for those taking three to four virtual courses and a pass rate of 61% for students taking five or more virtual courses. About 34% of students took one or two virtual courses; however, 54% were found to have taken five or more virtual courses during the year. See Table D25.

Introduced in last year’s report, Table D26 shows pass rate by virtual method and virtual usage. Blended Learning students had the highest overall pass rate (68%); those taking one or two virtual courses had one of the highest pass rates at 79%. Students enrolled in one or two Online Courses also had one of the highest pass rates at 80%. For the Digital Learning and Online Course methods, pass rates decreased as virtual course usage increased.

State Assessment

Fast Facts

  • 41% of 11th grade virtual learners who took the SAT scored proficient in the Reading/Writing component. 17% tested proficient in Math.
  • For 8th grade virtual learners, the corresponding percentages were 55% and 17%.
  • Higher proficiency rates on state assessments were seen with higher non-virtual performance and with students who were not in poverty.
  • Higher percentages of part-time virtual learners reached levels of proficiency on state assessment measures than their full-time counterparts.

By Subject Area

State assessment data can be used to provide an independent measure of student performance. Based on SAT and M-STEP data from students in 11th grade, virtual learners showed lower percentages reaching proficiency on the Evidence-Based Reading and Writing (SAT), Mathematics (SAT), Science (M-STEP) and Social Studies (M-STEP) examinations than the statewide proficiency rates. Forty-one percent of the 11th grade virtual learners tested proficient in Evidence-Based Reading and Writing, and 17% were proficient in Mathematics. For Science, 30% tested proficient whereas Social Studies had 32% of the virtual learners reach proficiency. See Table E1. The pattern was similar for those taking the 8th grade assessments. See Table E2.

By Non-Virtual Performance

As expected, the percentage of 8th and 11th grade virtual learners testing proficient on these state tests varied considerably when accounting for their non-virtual performance. For instance, students taking a minimum of three non-virtual courses and passing all of them had proficiency rates that exceeded the statewide average for each assessment. Students who did not pass one or two of their non-virtual courses and those not passing three or more of their non-virtual courses had much lower rates of proficiency. See Table E3 and Table E4.

By Poverty Status

Students in poverty consistently recorded proficiency rates that were considerably lower than their peers who were not in poverty. As examples, 27% of virtual learners in poverty scored proficient on the 11th grade Evidence-Based Reading and Writing exam compared to 59% for those who were not in poverty. For Mathematics, only 9% of 8th grade virtual learners in poverty scored proficient compared to 37% for those not in poverty. See Table E5 and Table E6.

By Full- or Part-Time Type

Both 8th and 11th grade students taking virtual courses in a part-time capacity had higher rates of proficiency on the assessments compared to full-time virtual learners. For some assessments, the gap was sizable. For instance, the difference was 13 percentage points for 11th grade Mathematics and 14 points for 8th grade Mathematics. See Table E7 and Table E8.

Maps

Berrien, Alpena-Montmorency-Alcona, and Muskegon Area ISDs/RESAs had over 20% of students in their service areas take a virtual course in 2023-24. In total, there were 10 ISDs/RESAs with 15% or more of their students taking virtual courses. An additional 17 ISDs/RESAs had at least 10% but less than 15% of their students take a virtual course. Only three ISDs/RESAs (Macomb, Livingston, and Manistee) had less than 5% of their students take at least one virtual course. See Figure 2.

Figure 2. 2023-24 Percentage of Students Who Took a Virtual Course (Non-Cyber) by ISD

Map shows Michigan ISDs colored by the percentage of students who took at least one virtual course. All but three ISDs have some shade of blue, meaning they had at least 5% of their students take a virtual course (non-cyber) in 2023-24. In contrast, 10 ISDs had 15% or more of their students with virtual enrollments; see the preceding paragraph for more detail.

Over one in four students (almost 5,700 students) attending a PSA cyber school resided within the Wayne RESA service area. The Genesee, Ingham, Kent, Macomb, and Oakland ISD service areas were the only other ISDs with 1,000 or more of their resident students attending PSA cyber schools. Forty-five of the 56 ISDs had 100 or more students attending a PSA cyber school. See Figure 3.

Figure 3. 2023-24 Count of PSA Cyber School Students by Resident ISD

Map shows Michigan ISDs colored by the count of PSA cyber school students by resident ISD. The majority of ISDs have fewer than 200 resident students who attend a PSA cyber school. ISDs with the highest counts include Genesee, Ingham, Kent, Macomb, Oakland, and Wayne.

Reflections on Higher Performing Schools

As the above sections of the report make clear, virtual learning performance, in general, has continued to produce mixed results. The analyses in this section will focus exclusively on those schools that achieved pass rates of 80% or higher to glean a clearer picture of what virtual learning looked like for these schools and programs and how it might have differed, if at all, from the state statistics.

There were 648 Michigan schools with virtual pass rates of 80% or higher, reflecting 46% of all schools in the state with virtual learners. These schools reported 56,860 virtual learners or about 37% of the state’s virtual learners. When zooming in on these higher performing schools, the data show:

  • Successful virtual programs can support various numbers of students, enrollments, and course offerings – These schools showed success with 10 or fewer students (35%) and 100 or more students (27%). See Table F1. Some offered few enrollments (116 schools had one to nine virtual enrollments) while others offered many (292 schools had 100 or more). See Table F2. They also varied in the number of course titles offered. Thirty-eight percent of these schools offered 10 or fewer virtual course titles. Twenty-two percent had enrollments in 26 to 50 different courses, and 17% of these schools had students in more than 50 different virtual courses. See Table F3.
  • LEA and PSA schools can offer successful virtual programs – Forty-six percent of LEA schools with virtual programs had schoolwide virtual pass rates of 80% or higher. For PSA schools, 38% achieved pass rates of 80% or higher. See Table F4. Both traditional school districts and charter districts can run successful virtual programs.
  • Schools in cities, suburbs, towns, and rural settings are demonstrating virtual learning success – All locales had schools with virtual pass rates of 80% or higher. Rural and Suburban schools had almost half (47%) of their schools reach this threshold, and City schools were close behind at 45%. See Table F5. These schools provide evidence of virtual learning success across the various geographies of the state.
  • These schools show strong results across students of different races/ethnicities – These higher performing schools also showed promise for equitable outcomes for students of different races and ethnicities. The pass rates for African American or Black students (88%) and Hispanic or Latino students (89%) were considerably closer to the White pass rate (91%) than they were across all schools. Asian students had the highest pass rate at 93%. See Table F6. For these schools, virtual programs appear to be approaching more equitable outcomes.
  • Students in poverty are succeeding in these virtual programs – Recall that across the entire state, students in poverty had a pass rate (58%) that was 19 percentage points lower than that of virtual students who were not in poverty. In these 648 schools, the virtual pass rate for students in poverty rose to 88%, considerably closer to the 93% virtual pass rate for the students in those schools who were not in poverty. In these higher performing schools, students in poverty continued to represent a large percentage of virtual learners (49%) and virtual enrollments (53%), but those percentages were considerably smaller than the 64% of virtual learners and 71% of virtual enrollments seen across all virtual programs statewide. See Table F7. Additionally, virtual program success varied by a school’s free or reduced-price lunch (FRL) category. Sixty-nine percent of Low FRL schools with virtual learners achieved virtual pass rates of 80% or higher, as did 60% of Mid-Low FRL schools, 40% of Mid-High FRL schools, and 30% of High FRL schools. See Table F8. While some High FRL schools showed this level of success was possible, it was considerably rarer than it was for Low FRL schools.
  • Both full-time and part-time schools can run effective virtual programs, but success is rarer for full-time programs – Forty-seven percent of part-time programs were able to yield schoolwide virtual pass rates of 80% or higher. It was considerably more difficult for full-time programs to achieve similar success. Only 15 of the 76 full-time programs (20%) reached the 80% pass rate mark. See Table F9.
  • Both general education and alternative education programs reached 80% schoolwide virtual pass rates – There were 587 general education schools in Michigan that achieved schoolwide virtual pass rates of 80% or higher. These schools represented 52% of general education schools with virtual programs. For alternative programs, 50 schools reached this mark, representing just 19% of such schools, indicating that while this threshold is attainable, it remains a sizable challenge. See Table F10.
  • Virtual students can perform at or above their face-to-face performance level – In these 648 schools, there were 11,664 virtual learners who took a minimum of three virtual courses and had data for a minimum of three non-virtual courses. Eighty percent of these students had virtual pass rates that met or exceeded their non-virtual pass rates. See Table F11.

Conclusion

This year’s report represents the 14th year of data on the effectiveness of virtual learning in Michigan’s K-12 system. Many trends witnessed in past years continue to exist.

Table 1. Summary of Virtual Learning Metrics by School Year Since 2010-11

School Year | # of Virtual Learners | # of Virtual Enrollments | # of Schools | Virtual Pass Rate
2010-11 | 36,348 | 89,921 | 654 | 66%
2011-12 | 52,219 | 153,583 | 850 | 62%
2012-13 | 55,271 | 185,053 | 906 | 60%
2013-14 | 76,122 | 319,630 | 1,007 | 57%
2014-15 | 91,261 | 445,932 | 1,072 | 60%
2015-16 | 90,878 | 453,570 | 1,026 | 58%
2016-17 | 101,359 | 517,470 | 1,102 | 55%
2017-18 | 112,688 | 581,911 | 1,158 | 55%
2018-19 | 120,669 | 639,130 | 1,225 | 55%
2019-20 | 121,900 | 672,682 | 1,225 | 56%
2020-21 | 418,513 | 3,647,493 | 2,207 | 74%
2021-22 | 208,460 | 1,408,763 | 1,914 | 69%
2022-23 | 159,056 | 1,027,705 | 1,475 | 65%
2023-24 | 154,087 | 1,019,661 | 1,421 | 63%

As Table 1 makes clear, the huge influx of virtual learners during the pandemic has mostly subsided and levels seem to be more in line with pre-pandemic trends. Unfortunately, the reduction in virtual learners and enrollments has been accompanied by an 11-percentage point drop in the virtual pass rate since 2020-21.

As we predicted in last year’s report, the virtual pass rate continued to shift toward pre-pandemic levels. We have been monitoring several factors likely to impact the rate. The first is changes in schools entering and exiting. One hundred forty-four new schools were represented in this year’s data, while 198 schools from last year dropped out because they did not have any virtual learners this year. The new schools added close to 28,000 virtual enrollments, roughly the same number as the departing schools contributed last year. Interestingly, the new schools outperformed the departing schools; new schools had a 77% pass rate this year while the schools that left had a 70% pass rate the prior year. Obviously, the weight of this change on the pass rate was minimal compared to that of the schools present in both years. There were 1,277 such schools, and their pass rate dropped from 65% to 63%.

A second shift involved Alternative Education programs. Prior to the pandemic, Alternative Education programs produced close to half the virtual enrollments. At the height of the pandemic, they dropped to just 10%. Since then, the percentage has been rebounding; this year, alternative education enrollments rose to 40% of the virtual enrollments, 4 percentage points higher than the prior year. This is particularly important because the pass rate gap between Alternative Education and General Education programs was sizable: this year, the Alternative Education pass rate was 21 percentage points lower.

A third dynamic to understand relates to the grade levels of virtual learners. Pre-pandemic, we saw about 80% of the virtual enrollments come from the high school level. That percentage dropped to 40% for the 2020-21 school year. This year, the high school percentage was up to 71% of virtual enrollments, three percentage points higher than the prior year. With K-5 pass rates around 80% and 6-8 pass rates around 70%, enrollments at these levels tend to prop up the overall virtual pass rate. For instance, if the elementary, middle school, and high school virtual pass rates remained the same but the proportions of virtual enrollments matched the 2019-20 school year, we would have seen the virtual pass rate drop by about 1.5 percentage points. Therefore, if virtual enrollments continue to shift toward a larger percentage of high school enrollments, a slight decline in the overall pass rate is predictable.
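
The mix-shift arithmetic behind that estimate can be sketched as a weighted average. In the snippet below, the high school pass rate is backed out from the 63% overall rate, and the 2019-20 K-5/6-8 split is an illustrative assumption (only the roughly 80% high school share comes from the text above):

```python
# Approximate 2023-24 virtual pass rates by grade band, per this report
rates = {"K-5": 0.80, "6-8": 0.70, "HS": 0.585}  # HS backed out of the 63% overall

def overall_rate(mix: dict) -> float:
    """Weighted-average pass rate for a given enrollment mix (shares sum to 1)."""
    return sum(share * rates[band] for band, share in mix.items())

mix_2023 = {"K-5": 0.13, "6-8": 0.16, "HS": 0.71}  # roughly this year's mix
mix_2019 = {"K-5": 0.08, "6-8": 0.12, "HS": 0.80}  # illustrative pre-pandemic mix

print(f"{overall_rate(mix_2023):.1%}")  # ~63.1%
print(f"{overall_rate(mix_2019):.1%}")  # ~61.6%, about 1.5 points lower
```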

Given the 2023-24 figures for these three key factors, we predict there is likely more correction coming. Thus, we believe the overall pass rate will continue to backslide. That said, there is optimism that the floor may be closer to 60% rather than the 55% that was consistently observed prior to the pandemic.

On the positive side, the report also captured examples of schools and students benefiting from virtual learning. Forty-six percent of schools with virtual learners had virtual pass rates of 80% or higher, and within those schools, equity of outcomes was much closer to the desired reality. Clearly, these schools add to the evidence that online learning can and does work for many schools and students. To date, however, these schools reflect more of the exception—the hope—rather than the rule. As school, community, and legislative leaders evaluate their virtual learning programs, the data provided in this report can serve as informative benchmarks, and the varied analyses can be used as models to understand local implementation success at a deeper level.

School leaders looking to take the next step forward with their virtual programs may find value in the many free resources that Michigan Virtual has authored. These resources include a series of practical guides to online learning designed for students, parents, teachers, mentors, school administrators, and school board members. Michigan Virtual also provides quality reviews of supplemental online learning programs to Michigan schools at no cost. There are also the National Standards for Quality Online Learning, which offer frameworks to evaluate online programs, online teaching, and online courses. Finally, educational leaders looking to communicate and collaborate with others around the future of learning may find value in the Future of Learning Council.

Appendix A – Methodology

COVID-19 Impact

Readers should note that the COVID-19 pandemic appears to have significantly impacted data from 2019-20 through the 2022-23 reports. It may still be impacting this year’s data; caution is advised when comparing this year’s findings with prior years.

About the Data

The data for this report came from the following sources:

  • Michigan Student Data System – School Year 2023-2024;
  • Educational Entity Master (EEM);
  • Michigan Student Data System Teacher Student Data Link (TSDL) – Collection Year 2023-2024; and
  • Michigan’s K-12 Virtual Learning Effectiveness Report, 2022-23 – Used for comparing this year’s data with the 2022-23 school year.

Because the data for this report incorporate a variety of sources, the findings within may differ from those available through the MI School Data portal, which may use different query parameters.

Enrollments were classified as virtual in this report when the TSDL virtual method field indicated virtual delivery, that is, when the field was set to “Blended Learning,” “Digital Learning,” or “Online Course.” According to the Michigan Student Data System Collection Details Manual Version 4.0, the virtual method field indicates “the type of virtual instruction the student is receiving.” (See page 346).
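
In code terms, this classification amounts to a set-membership test on that field (a sketch only; the record layout and field name are illustrative, not the MSDS schema):

```python
VIRTUAL_METHODS = {"Blended Learning", "Digital Learning", "Online Course"}

def is_virtual(enrollment: dict) -> bool:
    """Flag an enrollment record as virtual when its TSDL virtual method
    field carries one of the three virtual delivery values."""
    return enrollment.get("virtual_method") in VIRTUAL_METHODS
```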

In prior years of the report, additional strategies, such as keyword searches of the local course title field, were used to flag virtual enrollments. Because those strategies yielded only a small percentage of additional virtual enrollments, the effort was discontinued starting with the 2020-21 report.

Michigan Virtual Students

Because this report is published by Michigan Virtual, some readers have incorrectly concluded that the data in this report pertain to Michigan Virtual students only. Quite the contrary: the data represent K-12 virtual learning across all providers, and Michigan Virtual as a provider reflects only a small percentage of the virtual enrollments covered here. Readers interested in Michigan Virtual-specific results can find them in its Annual Report: 2023-24, which includes data on the number of students, districts, and enrollments served as well as its virtual pass rate.

Enrollment Calculations

Enrollment data for this report principally relies on data collected in the MSDS Student Course Component. See page 324 of the Michigan Student Data System Collection Details Manual Version 4.0 for more details about this collection. Through this collection, the State collects data for each course a student takes. It is important to note some key variations in the data collection that impact possible approaches to calculating enrollment counts.

An example of known variation is the local naming conventions for course titles. For instance, one school may call a course “English 9”, another “9th Grade English,” and yet another “ELA 9.” The Student Course Component resolves this issue by requiring schools to report each enrollment with a Subject Area Code and a Course Identifier Code (SCED Course Code). These codes are created by the National Center for Education Statistics through the School Courses for the Exchange of Data (SCED) initiative. By using these standardized codes, we can compare data more readily across schools.

Another important variation involves course sections. In addition to the course title and SCED Course Code, schools frequently parse a course title into multiple sections. For example, a school with trimester courses may break a course into three sections, one for each trimester. A semester-based school, on the other hand, may break a course into two sections. Other schools break their courses into even smaller units, such as quarters, while some report what appear to be course units or lessons. Sometimes, schools use course sections to differentiate the online and face-to-face components of courses. For our purposes, the key point is that there is not always one enrollment record per student per course title.

Multiple course sections for a single course title are not, in and of themselves, problematic. They could be resolved if a weighting variable—for instance, the fraction of a Carnegie unit each section represents—was collected. The State does collect a field, Credits Granted, in the Student Course Component that might be used. However, two main drawbacks significantly impair its use. First, the field is required only for Migrant-eligible and dual-enrolled students, so many enrollments have no reported value. Second, the data that do exist are reported inconsistently. Some schools report the Carnegie unit that could be earned (the same value regardless of the enrollment’s completion status), while others vary the value depending on how well the student did (e.g., reporting 0.5 for a student with a “Completed/Passed” completion status, but 0.0 for a student with a “Completed/Failed” completion status). These drawbacks make the Credits Granted field unusable as a weighting variable.

The challenge of variable course sections is multiplied when more than one school entity reports on the same pupil. The data appear to contain instances of two or more schools reporting the same enrollments. One variant appears to be a school partnering with an ISD to provide special education services, with both entities reporting the same enrollments. Another appears to occur when a student transfers out of one district and enrolls in the same courses at the new school. Table A1 and Table A2 highlight enrollment variation.

Table A1. 2023-24 Virtual Enrollment Counts and Pass Rates by Number of Virtual Enrollments Per Student/SCED Code Pair

# of Virtual Enrolls per Student/SCED Code Pair | # of Enrolls | % of Enrolls | Pass Rate
1 | 458,870 | 45% | 62%
2 | 437,280 | 43% | 65%
3 | 54,081 | 5% | 50%
4 | 25,544 | 3% | 52%
5 | 4,275 | 0% | 50%
6 or More | 39,611 | 4% | 84%
Total | 1,019,661 | 100% | 63%
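
For readers who want to reproduce the Table A1 grouping, a small sketch follows; it tallies enrollment records per (student ID, SCED code) pair and then summarizes the multiplicities (the record layout and field names are illustrative):

```python
from collections import Counter

def enrolls_per_pair(enrollments):
    """Tally virtual enrollment records per (student ID, SCED code) pair,
    then count how many pairs occur at each multiplicity (cf. Table A1)."""
    pair_counts = Counter((e["student_id"], e["sced_code"]) for e in enrollments)
    return pair_counts, Counter(pair_counts.values())

# Example: two entities reporting the same student/course yields a pair count of 2
records = [
    {"student_id": "S1", "sced_code": "01001"},
    {"student_id": "S1", "sced_code": "01001"},  # duplicate report
    {"student_id": "S2", "sced_code": "02052"},
]
_, multiplicities = enrolls_per_pair(records)
print(multiplicities)  # Counter({2: 1, 1: 1}): one pair reported twice, one once
```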

Table A2. 2023-24 Percentage of Students by Total Student Enrollment Counts (Virtual and Non-Virtual) and Full- or Part-Time Schools

Enrollment Count (Virtual and Non-Virtual) | Full-Time | Part-Time
1 to 5 | 8% | 7%
6 to 10 | 25% | 22%
11 to 15 | 48% | 42%
16 to 20 | 13% | 18%
21+ | 7% | 11%
Total | 100% | 100%

Given these data limitations, enrollment counts and related data figures in this report should be treated as estimates that, generally speaking, convey the trends observed for the school year.

Pass Rate Calculations

For this report, the pass rate was calculated based on data reported in the “Completion Status” field. For more information about the Completion Status field, including definitions for each status, see page 341 of the Michigan Student Data System Collection Details Manual Version 4.0. Column one of Table A3 displays the various statuses reported by schools for the virtual enrollments.

Table A3. 2023-24 Number and Percentage of Virtual Enrollments by Completion Status

Completion Status | # of Enrolls | % of Enrolls
Audited | 1,130 | 0%
Completed/Failed | 142,346 | 14%
Completed/Passed | 644,484 | 63%
Incomplete | 93,503 | 9%
Ongoing Enrolled | 74 | 0%
Tested Out | 203 | 0%
Withdrawn/Exited | 86,654 | 8%
Withdrawn/Failing | 15,725 | 2%
Withdrawn/Passing | 35,542 | 3%
Total | 1,019,661 | 100%

Throughout this report, the pass rate simply represents the percentage of virtual enrollments with a status of “Completed/Passed.” Notice that the percentage of enrollments with a “Completed/Passed” status in Table A3 matches the statewide pass rate. This pass rate formula remains consistent with past reports.
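
As a minimal sketch, the report’s formula reduces to a single comparison per enrollment; a second function illustrates one of the alternative treatments discussed below (status strings follow Table A3; neither function is part of the state’s specification):

```python
def pass_rate(statuses):
    """The report's formula: the share of all virtual enrollments whose
    completion status is exactly 'Completed/Passed'."""
    return sum(s == "Completed/Passed" for s in statuses) / len(statuses)

def pass_rate_alternative(statuses):
    """An illustrative alternative: also credit 'Withdrawn/Passing' and
    drop 'Audited' from the denominator. Not the report's method."""
    graded = [s for s in statuses if s != "Audited"]
    passed = sum(s in {"Completed/Passed", "Withdrawn/Passing"} for s in graded)
    return passed / len(graded)
```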

Please keep in mind that calculating the pass rate in this manner will result in the lowest possible percentage. To illustrate why this is, consider the completion status of “Audited.” These virtual enrollments are not “failures” per se, but act as such in the formula since they are added to the formula’s denominator without impacting the numerator. Another example is enrollments with a completion status of “Incomplete.” About 9% of the virtual enrollments in this report were classified as “Incomplete.” As such, they are treated in the report’s pass rate formula as zero passes, even though some may eventually be awarded a passing status. Finally, it is unclear how to best treat enrollments with a “Withdrawn” status. For instance, 3% of the virtual enrollments this year were marked as “Withdrawn/Passing,” meaning that the student was passing the course at the time the student was withdrawn. Should these enrollments be counted as failures? What about students whose enrollments were marked as “Withdrawn/Exited” (8% of the virtual enrollments)? Based on the data available, there is no way to determine whether that exiting occurred in the first few weeks of class or the final weeks of class. The data do not provide insight into whether the student was re-enrolled in a different course or whether it was too late for re-enrollment in a credit-bearing opportunity for the student.

The research team raises these issues because they represent questions for which there are no definitive answers. In the end, the team decided to report the pass rate as the percentage of all virtual enrollments that were reported as “Completed/Passed.” To provide readers with a better idea of the impact of this approach, additional data tables are provided in Appendix G that allow interested readers to draw their own conclusions and to calculate their own formulas for many of the pass rates reported.

Appendix B – School Tables

Note: Click on the hyperlinked table number to return to the section of the report that discusses the table.

Table B1. Two Year Comparison (2022-23 and 2023-24) of Virtual Enrollment Data with Pass Rates

School Years | # of Schools | % of 2023-24 Schools | # of 2023-24 Enrolls | % of 2023-24 Enrolls | 2023-24 Pass Rate
2022-23 Only | 198 | NA | NA | NA | NA
2023-24 Only | 144 | 10% | 27,719 | 3% | 77%
Both Years (2022-23 and 2023-24) | 1,277 | 90% | 991,942 | 97% | 63%
Note: The 198 schools in the “2022-23 Only” row had 27,830 enrollments and a 70% pass rate for that year. The 1,277 schools in both years had a pass rate of 65% for 2022-23.

Table B2. Virtual Enrollment Differences for Schools Reporting Virtual Learners in Both 2022-23 and 2023-24

Year-to-Year Enroll Difference (2023-24 minus 2022-23) | # of Schools (Both Years) | % of Schools (Both Years) | # of Enrolls (Current Year) | % of Enrolls (Current Year)
+1,000 or More | 30 | 2% | 394,331 | 40%
+500 to +999 | 29 | 2% | 45,163 | 5%
+100 to +499 | 163 | 13% | 176,164 | 18%
0 to +99 | 402 | 31% | 93,243 | 9%
-1 to -99 | 432 | 34% | 81,607 | 8%
-100 to -499 | 173 | 14% | 99,766 | 10%
-500 to -999 | 21 | 2% | 36,513 | 4%
-1,000 or More | 27 | 2% | 65,155 | 7%
Total | 1,277 | 100% | 991,942 | 100%

Table B3. Virtual Pass Rate Differences for Schools Reporting Virtual Learners in Both 2022-23 and 2023-24

Year-to-Year Pass Rate Difference (2023-24 minus 2022-23) | # of Schools (Both Years) | % of Schools (Both Years) | # of Enrolls (Current Year) | % of Enrolls (Current Year)
50 or More Percentage Points Increase | 15 | 1% | 2,585 | 0%
25 to 49 Percentage Points Increase | 56 | 4% | 50,033 | 5%
10 to 24 Percentage Points Increase | 151 | 12% | 90,146 | 9%
0 to 9 Percentage Points Increase | 381 | 30% | 400,140 | 40%
1 to 9 Percentage Points Decrease | 292 | 23% | 367,259 | 37%
10 to 24 Percentage Points Decrease | 136 | 11% | 48,901 | 5%
25 to 49 Percentage Points Decrease | 41 | 3% | 19,551 | 2%
50 or More Percentage Points Decrease | 19 | 1% | 5,875 | 1%
NA – < 10 Enrolls in One or Both Years | 186 | 15% | 7,452 | 1%
Total | 1,277 | 100% | 991,942 | 100%

Table B4. 2023-24 K-12 Virtual Enrollment Data by Grade Level

Grade Level | # of Schools | # of Enrolls | % of Enrolls | Enroll % Change from Prior Year | Pass Rate | Pass Rate Change from Prior Year
K | 106 | 18,134 | 2% | -33% | 77% | -8%
1 | 124 | 21,615 | 2% | -27% | 82% | -5%
2 | 129 | 21,108 | 2% | -24% | 81% | -6%
3 | 156 | 21,867 | 2% | -25% | 78% | -7%
4 | 159 | 21,724 | 2% | -26% | 78% | -6%
5 | 186 | 24,468 | 2% | -20% | 80% | -2%
6 | 302 | 43,907 | 4% | 2% | 73% | 2%
7 | 348 | 55,313 | 5% | 4% | 72% | 4%
8 | 458 | 66,177 | 6% | 5% | 67% | 1%
9 | 767 | 143,589 | 14% | 6% | 45% | -3%
10 | 848 | 161,486 | 16% | 3% | 55% | -2%
11 | 875 | 171,027 | 17% | 2% | 62% | 1%
12 | 878 | 249,246 | 24% | 6% | 67% | 2%
Total | 1,421 | 1,019,661 | 100% | -1% | 63% | -1%

Table B5. 2023-24 Pass Rate Comparison for Students in Their Virtual and Non-Virtual Courses by Grade Level

Grade Level | Virtual Pass Rate | Non-Virtual Pass Rate
K | 77% | 64%
1 | 82% | 71%
2 | 81% | 70%
3 | 78% | 67%
4 | 78% | 68%
5 | 80% | 68%
6 | 73% | 68%
7 | 72% | 59%
8 | 67% | 59%
9 | 45% | 61%
10 | 55% | 69%
11 | 62% | 78%
12 | 67% | 86%
Total | 63% | 74%

Table B6. 2023-24 Number and Percentage of Schools and Virtual Enrollments by School Pass Rate

School Pass Rate | # of Schools | % of Schools | # of Enrolls | % of Enrolls
0% to <10% | 90 | 6% | 26,286 | 3%
10% to <20% | 23 | 2% | 3,665 | 0%
20% to <30% | 38 | 3% | 77,005 | 8%
30% to <40% | 67 | 5% | 91,531 | 9%
40% to <50% | 67 | 5% | 41,759 | 4%
50% to <60% | 113 | 8% | 177,946 | 17%
60% to <70% | 158 | 11% | 164,527 | 16%
70% to <80% | 205 | 14% | 143,440 | 14%
80% to <90% | 249 | 18% | 165,485 | 16%
90% to 100% | 411 | 29% | 128,017 | 13%
Total | 1,421 | 100% | 1,019,661 | 100%

Table B7. 2023-24 Number and Percentage of Schools and Virtual Enrollments by Entity Type

Entity Type | # of Schools | % of Schools | # of Enrolls | % of Enrolls
ISD School | 27 | 2% | 7,371 | 1%
ISD Unique Education Provider | NR | 0% | NR | 0%
LEA School | 1,251 | 88% | 614,063 | 60%
LEA Unique Education Provider | 10 | 1% | 9,059 | 1%
PSA School | 128 | 9% | 388,846 | 38%
State School | NR | 0% | NR | 0%
Total | 1,421 | 100% | 1,019,661 | 100%
Note: Data are not reported (NR) out of caution for confidentiality.

Table B8. 2023-24 Number of Virtual Enrollments and Virtual Pass Rate by Entity Type

Entity Type | # of Enrolls | Pass Rate
ISD School | 7,371 | 70%
ISD Unique Education Provider | 301 | NR
LEA School | 614,063 | 64%
LEA Unique Education Provider | 9,059 | 93%
PSA School | 388,846 | 60%
State School | 21 | NR
Total | 1,019,661 | 63%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B9. 2023-24 Number and Percentage of Full-Time (FT) Virtual or Cyber School by Entity Type

Entity Type | # of FT Schools | % of FT Schools
ISD School | NR | NR
LEA School | 55 | 72%
LEA Unique Education Provider | NR | NR
PSA School | 18 | 24%
Total | 76 | 100%
Note: Data are not reported (NR) out of caution for confidentiality.

Table B10. 2023-24 Number and Percentage of Students and Enrollments from Full-Time (FT) Virtual or Cyber Schools with Virtual Pass Rates by Entity Type

Entity Type | # of FT Students | % of FT Students | # of FT Enrolls | % of FT Enrolls | Pass Rate
ISD School | 353 | 1% | 3,157 | 1% | NR
LEA School | 20,919 | 41% | 182,691 | 37% | 55%
LEA Unique Education Provider | 270 | 1% | 2,308 | 0% | NR
PSA School | 29,497 | 58% | 311,637 | 62% | 62%
Total | 50,803 | 100% | 499,793 | 100% | 60%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B11. 2023-24 Number and Percentage of Part-Time (PT) Virtual Schools by Entity Type

Entity Type | # of PT Schools | % of PT Schools
ISD School | 26 | 2%
ISD Unique Education Provider | 4 | 0%
LEA School | 1,196 | 89%
LEA Unique Education Provider | 8 | 1%
PSA School | 110 | 8%
State School | 1 | 0%
Total | 1,398 | 100%

Table B12. 2023-24 Number and Percentage of Students and Enrollments from Part-Time (PT) Virtual Schools with Virtual Pass Rates by Entity Type

Entity Type | # of PT Students | % of PT Students | # of PT Enrolls | % of PT Enrolls | Pass Rate
ISD School | 1,183 | 1% | 4,214 | 1% | 81%
ISD Unique Education Provider | NR | 0% | NR | 0% | NR
LEA School | 92,108 | 88% | 431,372 | 83% | 68%
LEA Unique Education Provider | 1,014 | 1% | 6,751 | 1% | NR
PSA School | 11,297 | 11% | 77,209 | 15% | 54%
State School | NR | 0% | NR | 0% | NR
Total | 105,101 | 100% | 519,868 | 100% | 67%
Note: Because some students took courses across multiple entity types, a student may be counted toward more than one Entity Type. The total row, however, reflects the number of unique students. Data are not reported (NR) out of caution for confidentiality. Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B13. 2023-24 Number and Percentage of Schools and Virtual Enrollments by School Emphasis

School Emphasis | # of Schools | % of Schools | # of Enrolls | % of Enrolls
Alternative Education | 267 | 19% | 409,964 | 40%
General Education | 1,129 | 79% | 608,000 | 60%
Special Education or Vocational/CTE | 25 | 2% | 1,697 | 0%
Total | 1,421 | 100% | 1,019,661 | 100%

Table B14. 2023-24 Number of Virtual Enrollments and Virtual Pass Rate by School Emphasis

School Emphasis | # of Enrolls | Pass Rate
Alternative Education | 409,964 | 51%
General Education | 608,000 | 72%
Special Education | 1,476 | 62%
Vocational/CTE | 221 | NR
Total | 1,019,661 | 63%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B15. 2023-24 Virtual Pass Rates for General Education and Alternative Education Schools by Entity Type

Entity Type | General Ed Pass Rate | Alternative Ed Pass Rate
ISD School | NR | NR
LEA School | 76% | 53%
LEA Unique Education Provider | NR | NR
PSA School | 67% | 40%
Total | 72% | 51%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B16. 2023-24 Number and Percentage of Schools and Virtual Enrollments by Number of Virtual Enrollments per School

# of Virtual Enrolls Per School | # of Schools | % of Schools | # of Enrolls | % of Enrolls
1 to 9 | 186 | 13% | 810 | 0%
10 to 19 | 106 | 7% | 1,515 | 0%
20 to 29 | 80 | 6% | 1,919 | 0%
30 to 39 | 48 | 3% | 1,607 | 0%
40 to 49 | 46 | 3% | 2,052 | 0%
50 to 59 | 35 | 2% | 1,912 | 0%
60 to 69 | 36 | 3% | 2,307 | 0%
70 to 79 | 29 | 2% | 2,119 | 0%
80 to 89 | 30 | 2% | 2,543 | 0%
90 to 99 | 25 | 2% | 2,370 | 0%
100+ | 800 | 56% | 1,000,507 | 98%
Total | 1,421 | 100% | 1,019,661 | 100%

Table B17. 2023-24 Percentage of Schools by Ratio of Virtual Courses to Student and School Pass Rate

School Pass Rate | 1 to 2 Virtual Courses / Student | 3 to 4 Virtual Courses / Student | 5+ Virtual Courses / Student
0% to <10% | 4% | 5% | 18%
10% to <20% | 0% | 0% | 7%
20% to <30% | 1% | 2% | 9%
30% to <40% | 2% | 3% | 16%
40% to <50% | 2% | 4% | 14%
50% to <60% | 4% | 10% | 20%
60% to <70% | 8% | 12% | 28%
70% to <80% | 14% | 18% | 25%
80% to <90% | 23% | 21% | 23%
90% to 100% | 40% | 25% | 40%
Total | 100% | 100% | 100%

Table B18. 2023-24 Number and Percentage of Schools and Virtual Enrollments by Locale

Locale | # of Schools | % of Schools | # of Enrolls | % of Enrolls
Rural | 480 | 34% | 302,372 | 30%
Town | 199 | 14% | 119,894 | 12%
Suburb | 514 | 36% | 383,139 | 38%
City | 220 | 15% | 165,829 | 16%
Not Specified | 8 | 1% | 48,427 | 5%
Total | 1,421 | 100% | 1,019,661 | 100%

Table B19. 2023-24 Percentage of Schools with Virtual Enrollments by Virtual Enrollment Totals and Locale

Locale | 1 to 24 Enrolls | 25 to 49 Enrolls | 50 to 74 Enrolls | 75 to 99 Enrolls | 100+ Enrolls | Total
Rural | 23% | 8% | 8% | 7% | 54% | 100%
Town | 26% | 9% | 5% | 3% | 57% | 100%
Suburb | 25% | 8% | 6% | 4% | 57% | 100%
City | 22% | 12% | 4% | 2% | 60% | 100%
Not Specified | 13% | 0% | 13% | 0% | 75% | 100%

Table B20. 2023-24 Virtual Pass Rate by Locale

Locale | Pass Rate | % Change from 22-23
Rural | 63% | 1%
Town | 67% | 1%
Suburb | 68% | 4%
City | 62% | -11%
Not Specified | NR | NR
Total | 63% | -2%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B21. 2023-24 Percentage of Schools with Virtual Enrollments by Building Pass Rate and Locale

Locale | 0% to 20% Pass Rate | 20% to 40% Pass Rate | 40% to 60% Pass Rate | 60% to 80% Pass Rate | 80% to 100% Pass Rate | Total
Rural | 5% | 7% | 13% | 28% | 47% | 100%
Town | 8% | 7% | 14% | 31% | 40% | 100%
Suburb | 9% | 7% | 13% | 24% | 47% | 100%
City | 10% | 10% | 11% | 24% | 45% | 100%
Not Specified | NR | NR | NR | NR | NR | 100%
Total | 8% | 8% | 13% | 26% | 46% | 100%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B22. 2023-24 Number and Percentage of Schools with Virtual Enrollments by School Free or Reduced-Price Lunch Categories

Free or Reduced-Price Lunch Category | # of Schools with Virtual Students | # of MI Schools (All) | % of Schools with Virtual Learners
Low FRL (<=25%) | 149 | 367 | 41%
Mid-Low FRL (>25% to <=50%) | 367 | 880 | 42%
Mid-High FRL (>50% to <=75%) | 532 | 1,142 | 47%
High FRL (>75%) | 369 | 989 | 37%
Missing | 4 | NA | NA
Total | 1,421 | 3,378 | 42%
Note: All Michigan K-12 schools with building codes were used to calculate the state figures. State data are available through MI School Data.

Table B23. 2023-24 Number and Percentage of Virtual Enrollments with Virtual Pass Rate by School Free or Reduced-Price Lunch Categories

Free or Reduced-Price Lunch Category | # of Enrolls | % of Enrolls | Pass Rate
Low FRL (<=25%) | 45,113 | 4% | 85%
Mid-Low FRL (>25% to <=50%) | 158,591 | 16% | 82%
Mid-High FRL (>50% to <=75%) | 362,492 | 36% | 57%
High FRL (>75%) | 453,235 | 44% | 60%
Missing | 230 | 0% | NR
Total | 1,019,661 | 100% | 63%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B24. 2023-24 Number of Schools, Students, and Enrolls for Schools with 1,000 or More Virtual Enrollments with Courses Per Student and Pass Rate by Entity Type

Entity Type | # of Schools | # of Students | # of Enrolls | Courses Per Student | Pass Rate
LEA School | 153 | 51,329 | 399,576 | 7.8 | 61%
PSA School | 46 | 35,998 | 368,459 | 10.2 | 61%
Other | 5 | 1,600 | 12,488 | 7.8 | NR
Total | 204 | 88,136 | 780,523 | 8.9 | 61%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table B25. 2023-24 Number of Schools, Students, and Enrolls for Schools with Less Than 1,000 Virtual Enrollments with Courses Per Student and Pass Rate by Entity Type

Entity Type | # of Schools | # of Students | # of Enrolls | Courses Per Student | Pass Rate
LEA School | 1,098 | 61,815 | 214,487 | 3.5 | 70%
PSA School | 82 | 4,743 | 20,387 | 4.3 | 54%
Other | 37 | 1,513 | 4,264 | 2.8 | 75%
Total | 1,217 | 67,835 | 239,138 | 3.5 | 69%

Table B26. 2023-24 Number of Schools, Students, and Enrolls for Schools with 1,000 or More Virtual Enrollments with Courses Per Student and Pass Rate by School Emphasis

School Emphasis | # of Schools | # of Students | # of Enrolls | Courses Per Student | Pass Rate
Alternative Education | 104 | 36,075 | 344,091 | 9.5 | 50%
General Education | 100 | 52,775 | 436,432 | 8.3 | 70%
Total | 204 | 88,136 | 780,523 | 8.9 | 61%

Table B27. 2023-24 Number of Schools, Students, and Enrolls for Schools with Less Than 1,000 Virtual Enrollments with Courses Per Student and Pass Rate by School Emphasis

School Emphasis | # of Schools | # of Students | # of Enrolls | Courses Per Student | Pass Rate
Alternative Education | 163 | 9,845 | 65,873 | 6.7 | 53%
General Education | 1,029 | 57,451 | 171,568 | 3.0 | 75%
Other | 25 | 884 | 1,697 | 1.9 | 65%
Total | 1,217 | 67,835 | 239,138 | 3.5 | 69%

Appendix C – Course Tables

Note: Click on the hyperlinked table number to return to the section of the report that discusses the table.

Table C1. 2023-24 Number of Schools, Virtual Students, and Virtual Enrollments with Percentage of Virtual Enrollments and Virtual Pass Rate by Subject Area

Subject Area | # of Schools | # of Students | # of Enrolls | % of Enrolls | Pass Rate
Agriculture, Food, and Natural Resources | 166 | 917 | 1,118 | 0% | 82%
Architecture and Construction | 71 | 294 | 450 | 0% | 81%
Business and Marketing | 557 | 9,842 | 13,474 | 1% | 76%
Communication and Audio/Visual Technology | 300 | 2,987 | 3,638 | 0% | 75%
Engineering and Technology | 174 | 1,795 | 2,311 | 0% | 84%
English Language and Literature | 1,220 | 89,666 | 186,987 | 18% | 61%
Health Care Sciences | 361 | 2,748 | 3,564 | 0% | 79%
Hospitality and Tourism | 146 | 1,015 | 1,295 | 0% | 71%
Human Services | 433 | 9,551 | 11,544 | 1% | 72%
Information Technology | 610 | 12,956 | 16,882 | 2% | 72%
Life and Physical Sciences | 1,219 | 84,189 | 154,632 | 15% | 62%
Manufacturing | 41 | 310 | 493 | 0% | 89%
Mathematics | 1,260 | 89,313 | 176,483 | 17% | 59%
Military Science | 15 | 34 | NR | 0% | NR
Miscellaneous | 833 | 33,170 | 65,017 | 6% | 72%
Nonsubject Specific | 120 | 4,213 | 8,135 | 1% | 93%
Physical, Health, and Safety Education | 993 | 50,269 | 76,184 | 7% | 64%
Public, Protective, and Government Services | 265 | 1,513 | 1,836 | 0% | 81%
Religious Education and Theology | 46 | 92 | NR | 0% | NR
Social Sciences and History | 1,224 | 88,751 | 165,739 | 16% | 64%
Transportation, Distribution, and Logistics | 26 | 96 | 126 | 0% | 87%
Visual and Performing Arts | 912 | 48,208 | 75,628 | 7% | 63%
World Languages | 910 | 32,417 | 53,984 | 5% | 58%
Total | 1,421 | 154,087 | 1,019,661 | 100% | 63%
Note: Since schools and students may have enrollments in multiple subject areas, row counts may not equal the total. The total row, however, reflects the number of unique schools and students. Data are not reported (NR) out of caution for confidentiality. Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table C2. 2023-24 Pass Rate Comparison for Students in Their Virtual and Non-Virtual Courses by Subject Area

Subject Area | Virtual Pass Rate | Non-Virtual Pass Rate
Agriculture, Food, and Natural Resources | 82% | 81%
Architecture and Construction | 81% | 82%
Business and Marketing | 76% | 85%
Communication and Audio/Visual Technology | 75% | 84%
Engineering and Technology | 84% | 78%
English Language and Literature | 61% | 73%
Health Care Sciences | 79% | 87%
Hospitality and Tourism | 71% | 82%
Human Services | 72% | 81%
Information Technology | 72% | 76%
Life and Physical Sciences | 62% | 71%
Manufacturing | 89% | 80%
Mathematics | 59% | 69%
Military Science | NR | 74%
Miscellaneous | 72% | 72%
Nonsubject Specific | 93% | 57%
Physical, Health, and Safety Education | 64% | 75%
Public, Protective, and Government Services | 81% | 83%
Religious Education and Theology | NR | 88%
Social Sciences and History | 64% | 72%
Transportation, Distribution, and Logistics | 87% | 83%
Visual and Performing Arts | 63% | 80%
World Languages | 58% | 74%
Total | 63% | 73%
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table C3. 2023-24 Number of Schools, Virtual Students, and Virtual Enrollments with Percentage of Virtual Enrollments and Virtual Pass Rate by Course Title for the Top 10 Most Enrolled in English Language and Literature Courses

English Language and Literature Course Titles | # of Schools | # of Students | # of Enrolls | % of Enrolls | Pass Rate
English/Language Arts I (9th grade) | 707 | 19,990 | 30,558 | 16% | 46%
English/Language Arts II (10th grade) | 683 | 19,977 | 30,404 | 16% | 53%
English/Language Arts III (11th grade) | 648 | 17,680 | 28,227 | 15% | 59%
English/Language Arts IV (12th grade) | 638 | 14,631 | 25,516 | 14% | 63%
Language Arts (grade 8) | 322 | 5,491 | 9,880 | 5% | 63%
Language Arts (grade 7) | 262 | 4,264 | 8,055 | 4% | 70%
Language Arts (grade 6) | 201 | 3,188 | 6,294 | 3% | 71%
Language Arts—General | 60 | 2,544 | 3,943 | 2% | 55%
Language Arts (grade 1) | 72 | 1,821 | 3,186 | 2% | 78%
Language Arts (kindergarten) | 64 | 1,523 | 3,131 | 2% | 73%
Total | 1,082 | 72,772 | 149,194 | 80% | 58%
Note: Since schools and students may have enrollments in multiple titles, row counts may not equal the total. The total row, however, reflects the number of unique schools and students. % of Enrolls based on overall total of 186,987 for this subject area.

Table C4. 2023-24 Number of Schools, Virtual Students, and Virtual Enrollments with Percentage of Virtual Enrollments and Virtual Pass Rate by Course Title for the Top 10 Most Enrolled in Mathematics Courses

Mathematics Course Titles | # of Schools | # of Students | # of Enrolls | % of Enrolls | Pass Rate
Geometry | 735 | 21,611 | 33,731 | 19% | 54%
Algebra I | 659 | 19,514 | 30,133 | 17% | 46%
Algebra II | 683 | 17,666 | 27,827 | 16% | 61%
Consumer Mathematics | 401 | 7,385 | 10,916 | 6% | 77%
Mathematics (grade 7) | 284 | 4,770 | 8,790 | 5% | 67%
Mathematics (grade 8) | 281 | 4,869 | 8,647 | 5% | 59%
Mathematics (grade 6) | 246 | 3,554 | 6,929 | 4% | 69%
Business Mathematics | 188 | 2,922 | 4,516 | 3% | 48%
Mathematics—Other | 179 | 2,660 | 4,209 | 2% | 54%
Pre-Algebra | 214 | 2,792 | 4,092 | 2% | 51%
Total | 1,082 | 69,195 | 139,790 | 79% | 57%
Note: Since schools and students may have enrollments in multiple titles, row counts may not equal the total. The total row, however, reflects the number of unique schools and students. % of Enrolls based on overall total of 176,483 for this subject area.

Table C5. 2023-24 Number of Schools, Virtual Students, and Virtual Enrollments with Percentage of Virtual Enrollments and Virtual Pass Rate by Course Title for the Top 10 Most Enrolled in Life and Physical Sciences Courses

Life and Physical Sciences Course Titles | # of Schools | # of Students | # of Enrolls | % of Enrolls | Pass Rate
Biology | 721 | 22,635 | 34,536 | 22% | 53%
Chemistry | 598 | 13,552 | 20,687 | 13% | 60%
Earth Science | 392 | 9,794 | 14,904 | 10% | 52%
Physical Science | 340 | 8,712 | 13,815 | 9% | 55%
Earth and Space Science | 242 | 5,378 | 8,027 | 5% | 65%
Environmental Science | 352 | 4,708 | 7,010 | 5% | 54%
Science (grade 8) | 274 | 3,760 | 6,994 | 5% | 68%
Science (grade 7) | 251 | 3,651 | 6,926 | 4% | 73%
Science (grade 6) | 214 | 2,809 | 5,531 | 4% | 73%
Physics | 360 | 3,835 | 5,451 | 4% | 65%
Total | 1,038 | 65,372 | 123,881 | 80% | 58%
Note: Since schools and students may have enrollments in multiple titles, row counts may not equal the total. The total row, however, reflects the number of unique schools and students. % of Enrolls based on overall total of 154,632 for this subject area.

Table C6. 2023-24 Number of Schools, Virtual Students, and Virtual Enrollments with Percentage of Virtual Enrollments and Virtual Pass Rate by Course Title for the Top 10 Most Enrolled in Social Sciences and History Courses

Social Sciences and History Course Titles | # of Schools | # of Students | # of Enrolls | % of Enrolls | Pass Rate
U.S. History—Comprehensive | 563 | 15,347 | 22,878 | 14% | 53%
World History and Geography | 437 | 12,491 | 19,245 | 12% | 49%
Economics | 672 | 14,034 | 15,374 | 9% | 64%
World History—Overview | 355 | 8,915 | 13,591 | 8% | 60%
U.S. Government—Comprehensive | 383 | 9,271 | 10,116 | 6% | 63%
Social Studies (grade 8) | 235 | 4,630 | 8,682 | 5% | 66%
Social Studies (grade 7) | 189 | 3,490 | 6,750 | 4% | 73%
Civics | 342 | 5,652 | 6,428 | 4% | 58%
Psychology | 483 | 5,118 | 6,325 | 4% | 77%
Social Studies (grade 6) | 197 | 3,270 | 5,911 | 4% | 75%
Total | 1,045 | 63,324 | 115,300 | 70% | 61%
Note: Since schools and students may have enrollments in multiple titles, row counts may not equal the total. The total row, however, reflects the number of unique schools and students. % of Enrolls based on overall total of 165,739 for this subject area.

Table C7. 2023-24 Number of Schools, Virtual Students, and Virtual Enrollments with Percentage of Virtual Enrollments and Virtual Pass Rate for AP Courses

AP Course Title | # of Schools | # of Students | # of Enrolls | % of Enrolls | Pass Rate
AP Art History | 39 | 210 | 307 | 6% | 74%
AP Biology | 53 | 110 | 179 | 4% | 85%
AP Calculus AB | 54 | 115 | 177 | 4% | 80%
AP Calculus BC | 37 | 82 | 149 | 3% | 93%
AP Chemistry | 34 | 87 | 152 | 3% | 83%
AP Chinese Language and Culture | NR | NR | NR | NR | NR
AP Comparative Government and Politics | NR | NR | NR | NR | NR
AP Computer Science A | 73 | 198 | 333 | 7% | 87%
AP Computer Science Principles | 16 | 49 | NR | NR | NR
AP Drawing | NR | NR | NR | NR | NR
AP Economics | NR | NR | NR | NR | NR
AP English Language and Composition | 81 | 173 | 263 | 5% | 83%
AP English Literature and Composition | 58 | 180 | 289 | 6% | 89%
AP Environmental Science | 59 | 164 | 285 | 6% | 86%
AP European History | 17 | 29 | NR | NR | NR
AP French Language and Culture | NR | NR | NR | NR | NR
AP Government | 16 | 69 | NR | NR | NR
AP Human Geography | 31 | 68 | 114 | 2% | 93%
AP Macroeconomics | 63 | 184 | 198 | 4% | 89%
AP Microeconomics | 61 | 152 | 163 | 3% | 91%
AP Music Theory | NR | 33 | NR | NR | NR
AP Physics 1 | 29 | 49 | NR | NR | NR
AP Physics 2 | 11 | NR | NR | NR | NR
AP Physics C | 28 | 69 | 117 | 2% | 90%
AP Physics C: Electricity and Magnetism | NR | NR | NR | NR | NR
AP Physics C: Mechanics | NR | NR | NR | NR | NR
AP Psychology | 115 | 471 | 808 | 17% | 89%
AP Spanish Language and Culture | 35 | 72 | 118 | 2% | 82%
AP Spanish Literature and Culture | NR | NR | NR | NR | NR
AP Statistics | 64 | 193 | 344 | 7% | 92%
AP U.S. Government and Politics | 55 | 151 | 159 | 3% | 93%
AP U.S. History | 53 | 121 | 185 | 4% | 87%
AP World History: Modern | 28 | 79 | 121 | 2% | 90%
Total | 269 | 2,618 | 4,877 | 100% | 87%
Note: An additional 530 enrollments had a course type listed as Advanced Placement but did not match an AP SCED Code. Similarly, there existed 84 local course titles with AP in the title that did not have an AP SCED Code. Thus, it is very likely the data above underreports the number of students taking AP courses virtually. Since schools and students may have enrollments in multiple titles, row counts may not equal the total. The total row, however, reflects the number of unique schools and students. Data are not reported (NR) out of caution for confidentiality. Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table C8. 2023-24 Virtual Enrollments Percentage by Locale and Subject Area

Subject Area | % Rural | % Town | % Suburb | % City | % Not Specified
Agriculture, Food, and Natural Resources | 0% | 0% | 0% | 0% | 0%
Architecture and Construction | 0% | 0% | 0% | 0% | 0%
Business and Marketing | 1% | 2% | 2% | 1% | 0%
Communication and Audio/Visual Technology | 0% | 0% | 1% | 0% | 0%
Engineering and Technology | 0% | 0% | 0% | 0% | 0%
English Language and Literature | 18% | 20% | 18% | 17% | 21%
Health Care Sciences | 0% | 1% | 0% | 0% | 0%
Hospitality and Tourism | 0% | 0% | 0% | 0% | 0%
Human Services | 1% | 2% | 1% | 1% | 1%
Information Technology | 1% | 1% | 2% | 3% | 0%
Life and Physical Sciences | 15% | 14% | 15% | 16% | 18%
Manufacturing | 0% | 0% | 0% | 0% | 0%
Mathematics | 17% | 16% | 17% | 19% | 19%
Military Science | 0% | 0% | 0% | 0% | 0%
Miscellaneous | 5% | 5% | 8% | 7% | 2%
Nonsubject Specific | 2% | 0% | 1% | 0% | 0%
Physical, Health, and Safety Education | 8% | 9% | 7% | 7% | 6%
Public, Protective, and Government Services | 0% | 0% | 0% | 0% | 0%
Religious Education and Theology | 0% | 0% | 0% | 0% | 0%
Social Sciences and History | 17% | 16% | 16% | 16% | 17%
Transportation, Distribution, and Logistics | 0% | 0% | 0% | 0% | 0%
Visual and Performing Arts | 8% | 8% | 6% | 8% | 11%
World Languages | 5% | 5% | 6% | 5% | 6%
Total | 100% | 100% | 100% | 100% | 100%

Table C9. 2023-24 Virtual Enrollment Pass Rates by Locale and Subject Area

Subject Area | Rural Pass Rate | Town Pass Rate | Suburban Pass Rate | City Pass Rate | Not Specified Pass Rate
Agriculture, Food, and Natural Resources | 78% | 87% | 85% | NR | NR
Architecture and Construction | 84% | 71% | NR | NR | NR
Business and Marketing | 74% | 79% | 78% | 66% | NR
Communication and Audio/Visual Technology | 73% | 84% | 76% | 68% | NR
Engineering and Technology | 83% | 81% | 88% | 83% | NR
English Language and Literature | 60% | 64% | 66% | 58% | NR
Health Care Sciences | 83% | 76% | 78% | 78% | NR
Hospitality and Tourism | 75% | 66% | 80% | 42% | NR
Human Services | 75% | 73% | 74% | 78% | NR
Information Technology | 77% | 62% | 68% | 73% | NR
Life and Physical Sciences | 62% | 64% | 68% | 62% | NR
Manufacturing | 95% | NR | NR | NR | NR
Mathematics | 58% | 62% | 65% | 59% | NR
Military Science | NR | NR | NR | NR | NR
Miscellaneous | 68% | 65% | 81% | 56% | NR
Nonsubject Specific | 95% | 83% | 88% | 89% | NR
Physical, Health, and Safety Education | 61% | 76% | 64% | 68% | NR
Public, Protective, and Government Services | 78% | 77% | 84% | NR | NR
Religious Education and Theology | NR | NR | NR | NR | NR
Social Sciences and History | 64% | 68% | 68% | 61% | NR
Transportation, Distribution, and Logistics | NR | NR | NR | NR | NR
Visual and Performing Arts | 63% | 70% | 65% | 71% | NR
World Languages | 57% | 65% | 64% | 56% | NR
Total | 63% | 67% | 68% | 62% | NR
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.
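Stated as a simple predicate, the suppression rule above can be sketched as follows (an illustrative reconstruction for clarity; the report does not publish code, and the function name is hypothetical):

def suppress_pass_rate(n_schools, n_students, n_enrolls):
    # Report NR when any count falls below the thresholds in the note above.
    return n_schools < 10 or n_students < 25 or n_enrolls < 100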

Table C10. 2023-24 Number and Percentage of Virtual Enrollments with Pass Rates by Student Sex and Subject Area

Subject Area | # of Female Enrolls | # of Male Enrolls | % of Female Enrolls | % of Male Enrolls | Female Pass Rate | Male Pass Rate
Agriculture, Food, and Natural Resources | 708 | 410 | 0% | 0% | 83% | 80%
Architecture and Construction | NR | 380 | 0% | 0% | NR | 81%
Business and Marketing | 6,816 | 6,658 | 1% | 1% | 77% | 75%
Communication and Audio/Visual Technology | 2,021 | 1,617 | 0% | 0% | 76% | 73%
Engineering and Technology | 938 | 1,373 | 0% | 0% | 83% | 84%
English Language and Literature | 94,085 | 92,902 | 18% | 19% | 62% | 59%
Health Care Sciences | 2,712 | 852 | 1% | 0% | 82% | 70%
Hospitality and Tourism | 777 | 518 | 0% | 0% | 73% | 67%
Human Services | 6,571 | 4,973 | 1% | 1% | 73% | 72%
Information Technology | 7,404 | 9,478 | 1% | 2% | 72% | 72%
Life and Physical Sciences | 78,189 | 76,443 | 15% | 15% | 63% | 61%
Manufacturing | 132 | 361 | 0% | 0% | 89% | 89%
Mathematics | 88,755 | 87,728 | 17% | 18% | 60% | 58%
Military Science | NR | NR | 0% | 0% | NR | NR
Miscellaneous | 32,858 | 32,159 | 6% | 6% | 74% | 70%
Nonsubject Specific | 3,873 | 4,262 | 1% | 1% | 92% | 93%
Physical, Health, and Safety Education | 38,622 | 37,562 | 7% | 8% | 65% | 63%
Public, Protective, and Government Services | 1,199 | 637 | 0% | 0% | 84% | 74%
Religious Education and Theology | NR | NR | 0% | 0% | NR | NR
Social Sciences and History | 85,311 | 80,428 | 16% | 16% | 65% | 62%
Transportation, Distribution, and Logistics | NR | 115 | 0% | 0% | NR | 87%
Visual and Performing Arts | 39,746 | 35,882 | 8% | 7% | 65% | 61%
World Languages | 28,775 | 25,209 | 6% | 5% | 61% | 55%
Total | 519,654 | 500,007 | 100% | 100% | 65% | 62%
Note: Data are not reported (NR) out of caution for confidentiality. Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table C11. 2023-24 Number of Schools, Virtual Students, and Virtual Enrollments with Percentage of Virtual Enrollments and Pass Rate by Virtual Method

Virtual Method | # of Schools | # of Students | # of Enrolls | % of Enrolls | Pass Rate
Blended Learning | 188 | 11,590 | 71,982 | 7% | 68%
Digital Learning | 320 | 20,526 | 92,344 | 9% | 63%
Online Course | 1,273 | 128,675 | 855,335 | 84% | 63%
Total | 1,421 | 154,087 | 1,019,661 | 100% | 63%
Note: Since schools and students may have enrollments across multiple Virtual Methods, row counts may not equal the total. The total row, however, reflects the number of unique schools and students.

Appendix D – Student Tables

Note: Click on the hyperlinked table number to return to the section of the report that discusses the table.

Table D1. 2023-24 Number of Schools and Virtual Students with Percentage of Virtual Students and Percent Year over Year Change

Grade Level | # of Schools | # of Students | % of Students | % Change from Prior Year
K | 106 | 2,700 | 2% | -17%
1 | 124 | 3,495 | 2% | -12%
2 | 129 | 3,267 | 2% | -4%
3 | 156 | 3,300 | 2% | -12%
4 | 159 | 3,271 | 2% | -10%
5 | 186 | 3,698 | 2% | -9%
6 | 302 | 5,133 | 3% | -6%
7 | 348 | 6,599 | 4% | -7%
8 | 458 | 7,925 | 5% | -9%
9 | 767 | 20,312 | 13% | -1%
10 | 848 | 24,966 | 16% | -4%
11 | 875 | 29,132 | 19% | -2%
12 | 878 | 41,404 | 27% | 2%
Total | 1,421 | 154,087 | 100% | -3%
Note: Because some students took courses across multiple schools and grade levels, a student may be counted toward more than one school and grade level. The total row, however, reflects the number of unique students.

Table D2. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by Student Sex

Student Sex | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Female | 1,307 | 78,908 | 51% | 519,654 | 51% | 65%
Male | 1,334 | 75,209 | 49% | 500,007 | 49% | 62%
Total | 1,421 | 154,087 | 100% | 1,019,661 | 100% | 63%
Note: The sum of the number of schools and students exceeds the total number because some schools enrolled students of both sexes and some students had enrollments across multiple schools where one school listed the student as one sex, but the other school reported a different value. The unique total was used to emphasize the true number of virtual students.

Table D3. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by Race/Ethnicity

Race/Ethnicity | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
African-American or Black | 851 | 30,032 | 19% | 220,164 | 22% | 56%
American Indian or Alaska Native | 350 | 1,140 | 1% | 7,300 | 1% | 58%
Asian | 461 | 3,036 | 2% | 15,079 | 1% | 80%
Hispanic or Latino | 939 | 14,242 | 9% | 96,407 | 9% | 58%
Native Hawaiian or Pacific Islander | 97 | 146 | 0% | 842 | 0% | 59%
White | 1,327 | 95,563 | 62% | 606,440 | 59% | 67%
Two or More Races | 928 | 9,355 | 6% | 69,266 | 7% | 62%
Unknown | 163 | 1,221 | 1% | 4,163 | 0% | 49%
Total | 1,421 | 154,087 | 100% | 1,019,661 | 100% | 63%
Note: The sum of the number of schools and number of students exceeds the total number because a few students had enrollments across multiple schools where one school listed the student as one race/ethnicity, but the other school reported a different value. The unique total was used to emphasize the true number of virtual students.

Table D4. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by Poverty Status

Poverty Status | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Yes | 1,325 | 98,758 | 64% | 725,593 | 71% | 58%
No | 1,224 | 54,242 | 35% | 288,994 | 28% | 77%
Unknown | 165 | 1,360 | 1% | 5,074 | 0% | 50%
Total | 1,421 | 154,087 | 100% | 1,019,661 | 100% | 63%
Note: The sum of the number of schools and students exceeds the total number because some schools enrolled students across both poverty statuses and some students had enrollments across multiple schools where one school listed the student as one poverty status, but the other school reported a different value. The unique total was used to emphasize the true number of virtual students.

Table D5. 2023-24 Pass Rate Comparison for Students in Their Virtual and Non-Virtual Courses by Poverty Status

Poverty Status | Virtual Pass Rate | Non-Virtual Pass Rate | Virtual Pass Rate – Non-Virtual Pass Rate
Yes | 58% | 65% | -7%
No | 77% | 87% | -10%
Unknown | 50% | 28% | 23%
Total | 63% | 74% | -10%
Note: The Virtual Pass Rate – Non-Virtual Pass Rate column was calculated before rounding, so it may not exactly match the difference between the rounded pass rates shown.
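For illustration with hypothetical unrounded values: if the Unknown row’s underlying rates were 50.4% and 27.8%, the difference column would show 50.4 - 27.8 = 22.6, which rounds to 23%, even though the displayed rounded rates (50% and 28%) differ by 22 points.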

Table D6. 2023-24 Percentage of Virtual Students and Virtual Enrollments in Poverty with Pass Rate by Virtual Type

Virtual Type | % of Students in Poverty | % of Enrolls from Students in Poverty | Pass Rate for Students in Poverty
Full-Time Virtual | 74% | 76% | 56%
Part-Time Virtual | 60% | 67% | 60%
Total | 64% | 71% | 58%

Table D7. 2023-24 Number and Percentage of Virtual Students by Free or Reduced-Price Lunch Category

Free or Reduced-Price Lunch Category | # of Virtual Students | # of All MI Students | % of Virtual Students
Low FRL (<=25%) | 14,444 | 225,731 | 6%
Mid-Low FRL (>25% to <=50%) | 35,732 | 410,300 | 9%
Mid-High FRL (>50% to <=75%) | 48,088 | 408,365 | 12%
High FRL (>75%) | 58,060 | 336,506 | 17%
Missing | 192 | NA | NA
Total | 154,087 | 1,373,686 | 11%
Note: The sum of the number of students exceeds the total number because some students had enrollments across categories. The unique total was used to emphasize the true number of virtual students. Also, all Michigan K-12 schools with building codes were used to calculate the state figures. The 1.4M total also reflects the number of unique MI K-12 students.

Table D8. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by Special Education Status

Special Education Status | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Yes | 1,081 | 20,501 | 13% | 147,081 | 14% | 56%
No | 1,365 | 132,498 | 86% | 867,506 | 85% | 64%
Unknown | 165 | 1,360 | 1% | 5,074 | 0% | 50%
Total | 1,421 | 154,087 | 100% | 1,019,661 | 100% | 63%
Note: The sum of the school and student rows exceeds the total number because some students had enrollments across multiple schools where one school listed the student under a specific special education status, but the other school reported a different status. The unique total was used to emphasize the true number of virtual students.

Table D9. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by Primary Disability

Primary Disability | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Autism Spectrum Disorder | 395 | 1,660 | 8% | 11,534 | 8% | 73%
Cognitive Impairment | 283 | 1,228 | 6% | 7,832 | 5% | 56%
Deaf-Blindness | NR | NR | 0% | NR | 0% | NR
Deaf or Hard of Hearing | 98 | 127 | 1% | 800 | 1% | 66%
Early Childhood Developmental Delay | 25 | NR | 0% | 268 | 0% | NR
Emotional Impairment | 661 | 2,667 | 13% | 20,376 | 14% | 46%
Physical Impairment | 48 | 71 | 0% | 532 | 0% | 79%
Specific Learning Disability | 857 | 7,916 | 39% | 55,976 | 38% | 56%
Speech and Language Impairment | 331 | 1,657 | 8% | 12,160 | 8% | 70%
Severe Multiple Impairment | NR | 100 | 0% | NR | 0% | NR
Traumatic Brain Injury | 47 | 56 | 0% | 389 | 0% | 59%
Visual Impairment | 59 | 65 | 0% | 349 | 0% | 63%
Other Health Impairment | 794 | 4,451 | 22% | 33,082 | 22% | 55%
Missing/None Listed | 156 | 557 | 3% | 3,522 | 2% | 32%
Total | 1,081 | 20,501 | 100% | 147,081 | 100% | 56%
Note: The sum of the student rows exceeds the total number because some students had enrollments across multiple schools where one school listed the student with a specific primary disability, but the other school reported a different primary disability. The unique total was used to emphasize the true number of virtual students. Data are not reported (NR) out of caution for confidentiality. Additionally, Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table D10. 2023-24 Number and Percentage of Virtual Students Compared to All MI Students with IEPs by Primary Disability

Primary Disability | # of Virtual Students | # of All MI Students with IEPs | % of All MI Students with IEPs Who Took a Virtual Course | % of All MI Students with IEPs
Autism Spectrum Disorder | 1,660 | 27,582 | 6% | 13%
Cognitive Impairment | 1,228 | 16,749 | 7% | 8%
Deaf-Blindness | NR | NR | NR | NR
Deaf or Hard of Hearing | 127 | 2,162 | 6% | 1%
Early Childhood Developmental Delay | NR | NR | NR | NR
Emotional Impairment | 2,667 | 10,364 | 26% | 5%
Physical Impairment | 71 | 1,252 | 6% | 1%
Specific Learning Disability | 7,916 | 55,324 | 14% | 25%
Speech and Language Impairment | 1,657 | 60,048 | 3% | 28%
Severe Multiple Impairment | 100 | 2,689 | 4% | 1%
Traumatic Brain Injury | 56 | 393 | 14% | 0%
Visual Impairment | 65 | 696 | 9% | 0%
Other Health Impairment | 4,451 | 31,121 | 14% | 14%
Missing/None Listed | 557 | NA | NA | NA
Total | 20,501 | 217,569 | 9% | 100%
Note: The sum of the student rows exceeds the total number because some students had enrollments across multiple schools where one school listed the student with a specific primary disability, but the other school reported a different primary disability. The unique total was used to emphasize the true number of virtual students. Data are not reported (NR) out of caution for confidentiality.

Table D11. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by Home-School/Nonpublic Student Status

Home-School or Nonpublic Student Status | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
No | 1,411 | 146,601 | 95% | 985,133 | 97% | 62%
Yes | 83 | 7,503 | 5% | 34,528 | 3% | 93%
Total | 1,421 | 154,087 | 100% | 1,019,661 | 100% | 63%
Note: The sum of the student rows exceeds the total number because a few students had enrollments that were recorded for both statuses. The unique total was used to emphasize the true number of virtual students.

Table D12. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by Full- or Part-Time Status

Virtual Subset | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Full-Time Virtual | 76 | 50,803 | 33% | 499,793 | 49% | 60%
Part-Time Virtual | 1,345 | 105,101 | 68% | 519,868 | 51% | 67%
Total | 1,421 | 154,087 | 100% | 1,019,661 | 100% | 63%
Note: The sum of the student rows exceeds the total number because some students had enrollments in both full-time and part-time virtual schools. The unique total was used to emphasize the true number of virtual students.

Table D13. 2023-24 Pass Rate Comparison for Full- and Part-Time Virtual Students by Virtual Subset

Virtual Subset | Virtual Pass Rate | Non-Virtual Pass Rate
Full-Time Virtual | 60% | 89%
Part-Time Virtual | 67% | 73%
Total | 63% | 74%
Note: There were 24,887 non-virtual enrollments reported for Full-Time Virtual students.

Table D14. 2023-24 Number and Percentage of Virtual Students and Virtual Enrollments with Pass Rates by Students’ Percentage of Enrollments Taken Virtually

% of Enrollments Taken Virtually | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
<25% of Enrolls Virtual | 51,941 | 34% | 88,496 | 9% | 75%
25% to 49% of Enrolls Virtual | 20,926 | 14% | 109,899 | 11% | 58%
50% to 74% of Enrolls Virtual | 13,420 | 9% | 124,469 | 12% | 55%
75% or More of Enrolls Virtual | 67,800 | 44% | 696,797 | 68% | 64%
Total | 154,087 | 100% | 1,019,661 | 100% | 63%

Table D15. 2023-24 Number and Percentage of Virtual Students and Virtual Enrollments from LEA Schools Only with Pass Rates by Students’ Percentage of Enrollments Taken Virtually

% of Enrollments Taken Virtually | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
<25% of Enrolls Virtual | 48,446 | 43% | 79,426 | 13% | 77%
25% to 49% of Enrolls Virtual | 15,779 | 14% | 74,059 | 12% | 65%
50% to 74% of Enrolls Virtual | 8,684 | 8% | 74,628 | 12% | 59%
75% or More of Enrolls Virtual | 39,480 | 35% | 385,950 | 63% | 63%
Total | 112,389 | 100% | 614,063 | 100% | 64%

Table D16. 2023-24 Number and Percentage of Virtual Students and Virtual Enrollments from PSA Schools Only with Pass Rates by Students’ Percentage of Enrollments Taken Virtually

% of Enrollments Taken Virtually | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
<25% of Enrolls Virtual | 2,951 | 7% | 5,563 | 1% | 61%
25% to 49% of Enrolls Virtual | 1,846 | 5% | 10,690 | 3% | 43%
50% to 74% of Enrolls Virtual | 1,538 | 4% | 14,547 | 4% | 49%
75% or More of Enrolls Virtual | 34,257 | 84% | 358,046 | 92% | 61%
Total | 40,592 | 100% | 388,846 | 100% | 60%

Table D17. 2023-24 Number of Schools, Virtual Students, Virtual Enrollments with Percentage of Virtual Students and Virtual Enrollments and Virtual Pass Rate by District Mobility

District Mobility | # of Schools | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Stable | 1,393 | 132,883 | 86% | 844,557 | 83% | 69%
Incoming | 657 | 8,905 | 6% | 69,599 | 7% | 49%
Outgoing | 929 | 13,929 | 9% | 101,342 | 10% | 25%
Missing | 163 | 1,221 | 1% | 4,163 | 0% | 49%
Total | 1,421 | 154,087 | 100% | 1,019,661 | 100% | 63%

Table D18. 2023-24 Number and Percentage of Virtual Enrollments with Pass Rates by Known Poverty Status and District Mobility

District Mobility | # of In Poverty Enrolls | # of Not In Poverty Enrolls | % of In Poverty Enrolls | % of Not In Poverty Enrolls | In Poverty Pass Rate | Not In Poverty Pass Rate
Stable | 582,480 | 261,285 | 80% | 90% | 64% | 80%
Incoming | 59,804 | 9,778 | 8% | 3% | 48% | 57%
Outgoing | 83,309 | 17,931 | 11% | 6% | 21% | 42%
Total | 725,593 | 288,994 | 100% | 100% | 58% | 77%
Note: Table excludes 5,074 enrollments that had an Unknown for the PovertyFlag variable.

Table D19. 2023-24 Percentage of Virtual Enrollments by Locale and District Mobility

District Mobility | % of Rural Enrolls | % of Town Enrolls | % of Suburban Enrolls | % of City Enrolls | % of Not Specified Enrolls
Stable | 82% | 83% | 85% | 84% | 69%
Incoming | 8% | 6% | 6% | 5% | 18%
Outgoing | 10% | 11% | 9% | 10% | 13%
Missing | 0% | 0% | 1% | 1% | 0%
Total | 100% | 100% | 100% | 100% | 100%

Table D20. 2023-24 Virtual Pass Rates by Locale and District Mobility

District Mobility | Rural Pass Rate | Town Pass Rate | Suburban Pass Rate | City Pass Rate | Not Specified Pass Rate
Stable | 69% | 72% | 73% | 68% | NR
Incoming | 46% | 64% | 60% | 47% | NR
Outgoing | 22% | 30% | 32% | 20% | NR
Missing | 24% | 71% | 49% | 46% | NR
Total | 63% | 67% | 68% | 62% | NR
Note: Pass Rate data are not reported (NR) if there were fewer than 10 schools, fewer than 25 students, or fewer than 100 enrollments.

Table D21. 2023-24 Percentage of Virtual Enrollments with Pass Rates by Full-Time (FT) or Part-Time (PT) Virtual Status and District Mobility

District Mobility | % of FT Enrolls | % of PT Enrolls | FT Pass Rate | PT Pass Rate
Stable | 77% | 88% | 66% | 71%
Incoming | 11% | 3% | 49% | 48%
Outgoing | 12% | 8% | 27% | 22%
Missing | 0% | 1% | 43% | 51%
Total | 100% | 100% | 60% | 67%

Table D22. 2023-24 Number and Percentage of Part-Time Virtual Students and Virtual Enrollments with Pass Rate by Non-Virtual Performance (Minimum of 3 Non-Virtual Enrollments)

Non-Virtual Performance | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Passed All NV Courses | 38,879 | 45% | 119,628 | 33% | 84%
Did Not Pass 1 or 2 NV Courses | 15,624 | 18% | 58,050 | 16% | 70%
Did Not Pass 3 or More NV Courses | 31,681 | 37% | 186,901 | 51% | 45%
Total | 86,184 | 100% | 364,579 | 100% | 62%

Table D23. 2023-24 Number and Percentage of Virtual Students and Virtual Enrollments with Virtual Pass Rate by Virtual Course Performance

Virtual Course Performance | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Passed All | 79,645 | 52% | 378,886 | 37% | 100%
Passed Some, But Not All | 47,726 | 31% | 484,938 | 48% | 55%
Didn’t Pass Any | 26,716 | 17% | 155,837 | 15% | 0%
Total | 154,087 | 100% | 1,019,661 | 100% | 63%

Table D24. 2023-24 Number and Percentage of Virtual Students Who Did Not Pass Any Virtual Courses by the Number of Virtual Courses They Took

# of Virtual Courses Not Passed | # of Students | % of Students
1 to 2 | 8,830 | 33%
3 to 4 | 2,705 | 10%
5 to 6 | 5,519 | 21%
7 to 8 | 3,426 | 13%
9 to 10 | 1,393 | 5%
11+ | 4,843 | 18%
Total | 26,716 | 100%

Table D25. 2023-24 Number and Percentage of Virtual Students and Virtual Enrollments with Pass Rates by Virtual Usage

Virtual Usage | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
1 to 2 Virtual Courses | 52,430 | 34% | 74,372 | 7% | 80%
3 to 4 Virtual Courses | 17,691 | 11% | 61,762 | 6% | 73%
5 or More Virtual Courses | 83,966 | 54% | 883,527 | 87% | 61%
Total | 154,087 | 100% | 1,019,661 | 100% | 63%

Table D26. 2023-24 Virtual Pass Rate by Virtual Method and Virtual Usage

Virtual Usage | Blended Learning Pass Rate | Digital Learning Pass Rate | Online Course Pass Rate | Total Pass Rate
1 to 2 Virtual Courses | 79% | 71% | 80% | 80%
3 to 4 Virtual Courses | 64% | 64% | 74% | 73%
5 or More Virtual Courses | 67% | 61% | 61% | 61%
Total | 68% | 63% | 63% | 63%

Appendix E – State Assessment Tables

Note: Click on the hyperlinked table number to return to the section of the report that discusses the table.

Table E1. 2023-24 Comparison of Virtual and State Proficiency Rates on 11th Grade State Assessment Measures

Assessment | Virtual Learners | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 41% | 51%
Mathematics (SAT) | 17% | 26%
Science (M-STEP) | 30% | 38%
Social Studies (M-STEP) | 32% | 40%
Note: Statewide assessment data were available from the MI School Data Portal. SAT measures are on the College Readiness report. The M-STEP measures can be found on the High School State Testing Performance report.

Table E2. 2023-24 Comparison of Virtual and State Proficiency Rates on 8th Grade State Assessment Measures

Assessment | Virtual Learners | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 55% | 65%
Mathematics (SAT) | 17% | 33%
Science (M-STEP) | 28% | 39%
Social Studies (M-STEP) | 19% | 30%
Note: Statewide assessment data were available from the MI School Data Portal. SAT measures are on the College Readiness report. The M-STEP measures can be found on the High School State Testing Performance report.

Table E3. 2023-24 11th Grade State Assessment Proficiency Rates for Virtual Learners with Three or More Non-Virtual Enrollments by Non-Virtual Performance

Assessment | Passed All NV | Did Not Pass 1 or 2 NV | Did Not Pass 3 or More NV | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 64% | 35% | 21% | 51%
Mathematics (SAT) | 36% | 12% | 4% | 26%
Science (M-STEP) | 47% | 25% | 14% | 38%
Social Studies (M-STEP) | 49% | 26% | 15% | 40%

Table E4. 2023-24 8th Grade State Assessment Proficiency Rates for Virtual Learners with Three or More Non-Virtual Enrollments by Non-Virtual Performance

Assessment | Passed All NV | Did Not Pass 1 or 2 NV | Did Not Pass 3 or More NV | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 68% | 58% | 39% | 65%
Mathematics (SAT) | 35% | 23% | 7% | 33%
Science (M-STEP) | 45% | 30% | 15% | 39%
Social Studies (M-STEP) | 35% | 23% | 8% | 30%

Table E5. 2023-24 11th Grade State Assessment Proficiency Rates for Virtual Learners by Poverty Status

Assessment | In Poverty | Not In Poverty | All Virtual Learners | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 27% | 59% | 41% | 51%
Mathematics (SAT) | 7% | 31% | 17% | 26%
Science (M-STEP) | 20% | 44% | 30% | 38%
Social Studies (M-STEP) | 21% | 45% | 32% | 40%

Table E6. 2023-24 8th Grade State Assessment Proficiency Rates for Virtual Learners by Poverty Status

Assessment | In Poverty | Not In Poverty | All Virtual Learners | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 46% | 74% | 55% | 65%
Mathematics (SAT) | 9% | 37% | 17% | 33%
Science (M-STEP) | 20% | 49% | 28% | 39%
Social Studies (M-STEP) | 12% | 37% | 19% | 30%

Table E7. 2023-24 11th Grade State Assessment Proficiency Rates for Virtual Learners by Virtual Type

Assessment | Part-Time | Full-Time | All Virtual Learners | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 43% | 32% | 41% | 51%
Mathematics (SAT) | 20% | 7% | 17% | 26%
Science (M-STEP) | 31% | 24% | 30% | 38%
Social Studies (M-STEP) | 33% | 24% | 32% | 40%

Table E8. 2023-24 8th Grade State Assessment Proficiency Rates for Virtual Learners by Virtual Type

Assessment | Part-Time | Full-Time | All Virtual Learners | All Learners Statewide
Evidence-Based Reading & Writing (SAT) | 57% | 52% | 55% | 65%
Mathematics (SAT) | 24% | 10% | 17% | 33%
Science (M-STEP) | 33% | 24% | 28% | 39%
Social Studies (M-STEP) | 24% | 13% | 19% | 30%

Appendix F – Higher Performing Schools Tables

Note: Click on the hyperlinked table number to return to the section of the report that discusses the table.

Table F1. 2023-24 Number and Percentage of Schools with 80% or Higher Pass Rate by Virtual Learner Count Category

Virtual Learner Count | School Count | % of Schools
10 or Fewer | 228 | 35%
11 to 25 | 75 | 12%
26 to 50 | 85 | 13%
51 to 99 | 86 | 13%
100 or More | 174 | 27%
Total | 648 | 100%

Table F2. 2023-24 Number and Percentage of Schools and Virtual Enrollments from Schools with 80% or Higher Pass Rate by Virtual Count Category

Virtual Enroll Count | # of Schools | % of Schools | # of Virtual Enrolls | % of Virtual Enrolls
1 to 9 | 116 | 18% | 468 | 0%
10 to 29 | 109 | 17% | 2,020 | 1%
30 to 49 | 45 | 7% | 1,787 | 1%
50 to 99 | 86 | 13% | 6,135 | 2%
100 or More | 292 | 45% | 254,930 | 96%
Total | 648 | 100% | 265,340 | 100%

Table F3. 2023-24 Number and Percentage of Schools with 80% or Higher Pass Rate by Number of Virtual Courses Offered

Virtual Courses Offered | # of Schools | % of Schools
10 or Fewer | 246 | 38%
11 to 25 | 153 | 24%
26 to 50 | 142 | 22%
More than 50 | 107 | 17%
Total | 648 | 100%

Table F4. 2023-24 Number and Percentage of Schools with 80% or Higher Pass Rate as a Percentage of All Virtual Schools

Entity Type | # of Higher Performing Schools | # of Virtual Schools | % of Virtual Schools
ISD School | 12 | 27 | 44%
ISD Unique Education Provider | NR | NR | NR
LEA School | 576 | 1,251 | 46%
LEA Unique Education Provider | NR | NR | NR
PSA School | 49 | 128 | 38%
State School | NR | NR | NR
Total | 648 | 1,421 | 46%
Note: Data are not reported (NR) out of caution for confidentiality.

Table F5. 2023-24 Number and Percentage of Schools with 80% or Higher Pass Rate by Locale

Locale | # of Higher Performing Schools | # of Virtual Schools | % of Virtual Schools
Rural | 224 | 480 | 47%
Town | 80 | 199 | 40%
Suburb | 242 | 514 | 47%
City | 98 | 220 | 45%
Not Specified | 4 | 8 | 50%
Total | 648 | 1,421 | 46%

Table F6. 2023-24 Number of Students and Virtual Enrollments with Pass Rate Data from Schools with 80% or Higher Pass Rate by Race/Ethnicity

Race/Ethnicity | # of Students | # of Enrolls | Pass Rate
African-American or Black | 7,043 | 36,101 | 88%
American Indian or Alaska Native | 338 | 1,504 | 87%
Asian | 1,695 | 6,381 | 93%
Hispanic or Latino | 4,691 | 20,840 | 89%
Native Hawaiian or Pacific Islander | 61 | 244 | 88%
Two or More Races | 3,112 | 18,021 | 87%
Unknown | 371 | 911 | 82%
White | 39,607 | 181,338 | 91%
Total | 56,860 | 265,340 | 90%

Table F7. 2023-24 Number of Students and Virtual Enrollments with Pass Rate Data from Schools with 80% or Higher Pass Rate by Poverty Status

Poverty Status | # of Students | % of Students | # of Enrolls | % of Enrolls | Pass Rate
Yes | 27,632 | 49% | 140,054 | 53% | 88%
No | 28,889 | 51% | 124,345 | 47% | 93%
Unknown | 393 | 1% | 941 | 0% | 82%
Total | 56,860 | 100% | 265,340 | 100% | 90%

Table F8. 2023-24 Number and Percentage of Schools with 80% or Higher Pass Rate by School Free or Reduced-Price Lunch Category

Free or Reduced-Price Lunch Category | # of Higher Performing Schools | % of Higher Performing Schools | # of All Virtual Schools | % of All Virtual Schools
Low FRL (<=25%) | 103 | 16% | 149 | 69%
Mid-Low FRL (>25% to <=50%) | 220 | 34% | 367 | 60%
Mid-High FRL (>50% to <=75%) | 212 | 33% | 532 | 40%
High FRL (>75%) | 111 | 17% | 369 | 30%
Missing | 2 | 0% | 4 | 50%
Total | 648 | 100% | 1,421 | 46%

Table F9. 2023-24 Number and Percentage of Schools with 80% or Higher Pass Rate by Full- or Part-Time Status

Full- or Part-Time Status | # of Higher Performing Schools | % of Higher Performing Schools | # of All Virtual Schools | % of All Virtual Schools
Full-Time | 15 | 2% | 76 | 20%
Part-Time | 633 | 98% | 1,345 | 47%
Total | 648 | 100% | 1,421 | 46%

Table F10. 2023-24 Number and Percentage of Schools with 80% or Higher Pass Rate by School Emphasis

School Emphasis | # of Higher Performing Schools | # of All Virtual Schools | % of All Virtual Schools
Alternative Education | 50 | 267 | 19%
General Education | 587 | 1,129 | 52%
Special Education | NR | NR | NR
Vocational/CTE | NR | NR | NR
Total | 648 | 1,421 | 46%
Note: Data are not reported (NR) out of caution for confidentiality.

Table F11. 2023-24 Number and Percentage of Students* from Schools with 80% or Higher Pass Rate by Pass Rate Difference Category

Pass Rate Difference Category | # of Students | % of Students
Virtual Less than Non-Virtual | 2,299 | 20%
Virtual Met/Exceeded Non-Virtual | 9,365 | 80%
Total | 11,664 | 100%
* Note: Only virtual learners who took a minimum of three virtual courses and three non-virtual courses are included in the table.

Appendix G – Completion Status Tables

Note: Click on the hyperlinked table number to return to the section of the report that discusses the table.

Table G1. 2023-24 Number and Percentage of Virtual Enrollments by Completion Status

Completion Status | # of Enrolls | % of Enrolls
Audited | 1,130 | 0%
Completed/Failed | 142,346 | 14%
Completed/Passed | 644,484 | 63%
Incomplete | 93,503 | 9%
Ongoing Enrolled | NR | 0%
Tested Out | NR | 0%
Withdrawn/Exited | 86,654 | 8%
Withdrawn/Failing | 15,725 | 2%
Withdrawn/Passing | 35,542 | 3%
Total | 1,019,661 | 100%
Note: Data are not reported (NR) out of caution for confidentiality.

Table G2. 2023-24 Percentage of Virtual Enrollments by Completion Status and Entity Type

Completion Status | ISD School % of Enrolls | ISD UEP % of Enrolls | LEA School % of Enrolls | LEA UEP % of Enrolls | PSA School % of Enrolls
Audited | 0% | 0% | 0% | 0% | 0%
Completed/Failed | 7% | 0% | 14% | 2% | 15%
Completed/Passed | 70% | 88% | 64% | 93% | 60%
Incomplete | 6% | 8% | 11% | 0% | 6%
Ongoing Enrolled | 0% | 0% | 0% | 0% | 0%
Tested Out | 0% | 0% | 0% | 0% | 0%
Withdrawn/Exited | 9% | 3% | 7% | 3% | 11%
Withdrawn/Failing | 2% | 0% | 0% | 0% | 3%
Withdrawn/Passing | 6% | 0% | 3% | 1% | 4%
Total | 100% | 100% | 100% | 100% | 100%
Note: UEP = Unique Education Provider. State School omitted due to limited enrollments.

Table G3. 2023-24 Number and Percentage of Full-Time Virtual Enrollments by Completion Status

Completion Status | # of Enrolls | % of Enrolls
Audited | NR | 0%
Completed/Failed | 66,806 | 13%
Completed/Passed | 298,141 | 60%
Incomplete | 56,908 | 11%
Ongoing Enrolled | NR | 0%
Tested Out | 100 | 0%
Withdrawn/Exited | 40,921 | 8%
Withdrawn/Failing | 13,681 | 3%
Withdrawn/Passing | 23,199 | 5%
Total | 499,793 | 100%
Note: Data are not reported (NR) out of caution for confidentiality.

Table G4. 2023-24 Number and Percentage of Part-Time Virtual Enrollments by Completion Status

Completion Status | # of Enrolls | % of Enrolls
Audited | 1,095 | 0%
Completed/Failed | 75,540 | 15%
Completed/Passed | 346,343 | 67%
Incomplete | 36,595 | 7%
Ongoing Enrolled | NR | 0%
Tested Out | NR | 0%
Withdrawn/Exited | 45,733 | 9%
Withdrawn/Failing | 2,044 | 0%
Withdrawn/Passing | 12,343 | 2%
Total | 519,868 | 100%
Note: Data are not reported (NR) out of caution for confidentiality.

Table G5. 2023-24 Percentage of Virtual Enrollments by Completion Status and School Emphasis

Completion Status | Alt Ed % of Enrolls | Gen Ed % of Enrolls | Special Ed % of Enrolls
Audited | 0% | 0% | 0%
Completed/Failed | 13% | 15% | 11%
Completed/Passed | 51% | 72% | 62%
Incomplete | 20% | 2% | 6%
Ongoing Enrolled | 0% | 0% | 0%
Tested Out | 0% | 0% | 0%
Withdrawn/Exited | 12% | 6% | 21%
Withdrawn/Failing | 1% | 2% | 0%
Withdrawn/Passing | 4% | 3% | 0%
Total | 100% | 100% | 100%
Note: Reportable Programs and Vocational/CTE are not reported here because each had fewer than 10 schools.

Table G6. 2023-24 Percentage of Virtual Enrollments by Completion Status and Core Subject Area

Completion Status | English % of Enrolls | Math % of Enrolls | Science % of Enrolls | Social Sci % of Enrolls
Audited | 0% | 0% | 0% | 0%
Completed/Failed | 15% | 16% | 14% | 14%
Completed/Passed | 61% | 59% | 62% | 64%
Incomplete | 10% | 11% | 10% | 9%
Ongoing Enrolled | 0% | 0% | 0% | 0%
Tested Out | 0% | 0% | 0% | 0%
Withdrawn/Exited | 8% | 9% | 9% | 8%
Withdrawn/Failing | 2% | 2% | 2% | 2%
Withdrawn/Passing | 4% | 3% | 3% | 3%
Total | 100% | 100% | 100% | 100%

Table G7. 2023-24 Percentage of Virtual Enrollments by Completion Status and Student Sex

Completion Status | Females % of Enrolls | Males % of Enrolls
Audited | 0% | 0%
Completed/Failed | 13% | 15%
Completed/Passed | 65% | 62%
Incomplete | 9% | 9%
Ongoing Enrolled | 0% | 0%
Tested Out | 0% | 0%
Withdrawn/Exited | 8% | 9%
Withdrawn/Failing | 2% | 2%
Withdrawn/Passing | 3% | 3%
Total | 100% | 100%

Table G8. 2023-24 Percentage of Virtual Enrollments by Completion Status and Race / Ethnicity

Completion Status | African American or Black % of Enrolls | American Indian or Alaska Native % of Enrolls | Asian % of Enrolls | Hispanic or Latino % of Enrolls | White % of Enrolls | Two or More Races % of Enrolls | Unknown % of Enrolls
Audited | 0% | 0% | 0% | 0% | 0% | 0% | 0%
Completed/Failed | 18% | 16% | 7% | 13% | 13% | 16% | 11%
Completed/Passed | 56% | 58% | 80% | 58% | 67% | 62% | 49%
Incomplete | 9% | 13% | 3% | 13% | 9% | 7% | 9%
Ongoing Enrolled | 0% | 0% | 0% | 0% | 0% | 0% | 0%
Tested Out | 0% | 0% | 0% | 0% | 0% | 0% | 0%
Withdrawn/Exited | 11% | 8% | 5% | 11% | 7% | 9% | 25%
Withdrawn/Failing | 2% | 2% | 1% | 2% | 1% | 2% | 1%
Withdrawn/Passing | 5% | 3% | 4% | 4% | 3% | 4% | 5%
Total | 100% | 100% | 100% | 100% | 100% | 100% | 100%
Note: Only Race / Ethnicities with 1,000 or more students are reported in the table.

Table G9. 2023-24 Number and Percentage of Virtual Enrollments by Completion Status and Poverty Status

Completion Status | In Poverty % of Enrolls | Not In Poverty % of Enrolls | Unknown % of Enrolls
Audited | 0% | 0% | 0%
Completed/Failed | 16% | 8% | 11%
Completed/Passed | 58% | 77% | 50%
Incomplete | 10% | 6% | 8%
Ongoing Enrolled | 0% | 0% | 0%
Tested Out | 0% | 0% | 0%
Withdrawn/Exited | 10% | 5% | 24%
Withdrawn/Failing | 2% | 1% | 1%
Withdrawn/Passing | 4% | 2% | 5%
Total | 100% | 100% | 100%

Table G10. 2023-24 Number and Percentage of Virtual Enrollments by Completion Status and Special Education Status

Completion Status | In Special Ed % of Enrolls | Not In Special Ed % of Enrolls | Unknown % of Enrolls
Audited | 0% | 0% | 0%
Completed/Failed | 18% | 13% | 11%
Completed/Passed | 56% | 64% | 50%
Incomplete | 10% | 9% | 8%
Ongoing Enrolled | 0% | 0% | 0%
Tested Out | 0% | 0% | 0%
Withdrawn/Exited | 10% | 8% | 24%
Withdrawn/Failing | 2% | 1% | 1%
Withdrawn/Passing | 4% | 3% | 5%
Total | 100% | 100% | 100%

Table G11. 2023-24 Percentage of Virtual Enrollments by Completion Status for Students Who Did Not Pass Any of Their Virtual Courses

Completion Status | At Least One % of Enrolls | 11 or More % of Enrolls
Audited | 0% | 0%
Completed/Failed | 28% | 24%
Completed/Passed | 0% | 0%
Incomplete | 24% | 36%
Ongoing Enrolled | 0% | 0%
Tested Out | 0% | 0%
Withdrawn/Exited | 28% | 24%
Withdrawn/Failing | 6% | 3%
Withdrawn/Passing | 13% | 12%
Total | 100% | 100%
Breaking Barriers: A Meta-Analysis of Educator Acceptance of AI Technology in Education https://michiganvirtual.org/research/publications/breaking-barriers-a-meta-analysis-of-educator-acceptance-of-ai-technology-in-education/ Thu, 21 Nov 2024 19:48:47 +0000 https://michiganvirtual.org/?post_type=publication&p=90448

The integration of technology into education has faced resistance for over a century, with each new innovation—from calculators to artificial intelligence (AI)—meeting skepticism from educators. This meta-analysis examines the predictors of technology adoption among teachers, extending foundational frameworks like the Technology Acceptance Model (TAM) to include modern AI tools. By analyzing over 60 studies, the research identifies key factors such as self-efficacy, perceived usefulness, technological complexity, and ethical concerns that influence adoption. With a particular focus on AI, the study explores how barriers like cost, time, and required pedagogical shifts amplify resistance, while highlighting the importance of training, institutional support, and transparency in fostering acceptance. These findings aim to equip educators, policymakers, and developers with actionable insights to bridge the gap between innovation and classroom practice, ensuring technology enhances learning while addressing teachers' concerns.


Introduction

The integration of technology in education has been a subject of debate and research for over a century, with educators’ resistance to adoption persisting despite rapid technological advancements. This historical context is crucial for understanding the current landscape of technology adoption in education, particularly as we face the emergence of artificial intelligence (AI) in educational settings. Much like the resistance faced by calculators and computers in the past, AI is encountering similar skepticism and barriers to adoption. This literature review extends prior meta-analyses with recent studies to identify key predictors of technology adoption among educators, with the goal of informing targeted training programs that address the strongest predictors of acceptance.

The foundations of modern educational psychology, established by pioneers like Jean Piaget in the early 20th century, emphasized the importance of active, constructive learning experiences (Piaget, 1936). This perspective naturally aligns with the potential of technological tools, including AI, to provide rich, interactive learning environments. Later, works such as “How People Learn” (Bransford et al., 2000) further highlighted technology’s potential to support effective learning principles, principles that are now being extended to AI-enhanced educational tools.

Despite these theoretical underpinnings, the history of educational technology adoption reveals a pattern of resistance. Cuban (1986) documented cycles of enthusiasm and disappointment surrounding various technologies introduced in classrooms throughout the 20th century. This resistance has persisted into the digital age, with Ertmer (1999) identifying both external and internal barriers to technology integration in classrooms. Concerns about efficacy, job displacement, changing teacher roles, and the constant pressure to keep up with technological advancements have all contributed to ongoing resistance (Selwyn, 2011; Howard, 2013; Zhao & Frank, 2003). In addition, resistance is further complicated by a lack of pedagogical training and administrator support, which pushes teachers to maintain the status quo rather than ride the wave of each “new and next” educational gimmick (Michigan Virtual, 2024). AI, as the latest iteration of educational technology, is facing similar challenges.

The rapid evolution of technology in recent decades, culminating in the advent of AI, has intensified the challenges of adoption. The rise of artificial intelligence in education has sparked a new wave of apprehension among teachers, reminiscent of past technological innovations, and this apprehension is not unwarranted; recent studies, including work from Michigan Virtual (2024), have shown that educators express significant concerns about AI’s role in the classroom, specifically around inappropriate use of AI, ethical concerns, and student overreliance on AI. Additionally, another Michigan Virtual study by McGehee (2024) indicates that simply using AI makes little difference for students; how they use it is what makes an impact, suggesting that thoughtful and appropriate integration is key to success.

A Walton Family Foundation (2024) study also found that many teachers express less than supportive views of AI and that distrust and unfamiliarity increase with age. RAND’s (2024) study of K-12 educators found that many teachers still aren’t using AI despite the large adoption rates in other industries, though many districts planned to train teachers to do so by the end of the 2023-24 school year. Trust et al. (2023) found that teachers worry about AI’s potential to replace human instruction, erode critical thinking skills, and exacerbate academic dishonesty.

These concerns echo the historical pattern of resistance to new educational technologies documented by Cuban (1986).

Moreover, the rapid development and deployment of AI tools have left many educators feeling unprepared and overwhelmed. Zhai et al. (2021) reported that teachers often feel they lack the necessary skills and knowledge to effectively integrate AI into their teaching practices. This technological anxiety is compounded by ethical concerns surrounding AI, such as data privacy and the potential for bias in AI systems (Holmes et al., 2022). The perceived complexity of technology, identified as a strong predictor of resistance in previous studies (Mac Callum et al., 2014), is particularly pronounced in the case of AI adoption (McGehee, 2023).

Despite these concerns, there is also a growing recognition of AI’s potential to enhance education. Zawacki-Richter et al. (2019) highlighted AI’s capacity to personalize learning experiences and provide valuable insights into student performance, and recent work by Michigan Virtual (McGehee, 2024) has shown that specific usage types of AI are associated with significantly higher student achievement.

However, the realization of these benefits hinges on addressing teachers’ apprehensions and providing adequate support and training (McGehee, 2023; Michigan Virtual, 2024). As Selwyn (2019) argues, the successful integration of AI in education will require a careful balance between technological innovation and pedagogical wisdom, with teachers playing a central role in shaping AI’s implementation in the classroom.

Understanding this historical context is crucial for setting the stage for the literature analysis of recent research on predictors of technology adoption among educators. By examining studies from the past two decades, the aim is to identify the key factors that influence educators’ willingness and ability to integrate technology, including AI, into their teaching practices. These predictors may include, but are not limited to, teacher attitudes, institutional support, professional development opportunities, and perceived usefulness of technology.

This literature meta-analysis will synthesize findings from multiple studies to provide a comprehensive view of the current state of technology adoption in education. By understanding these predictors, educational leaders and policymakers can develop more effective strategies to support and encourage technology integration, including AI, addressing the persistent challenges that have characterized this field for over a century.

This literature meta-analysis proceeds as follows: First, we outline the methodology for selecting and analyzing relevant studies. Next, we present our findings on the most significant predictors of technology adoption among educators. Finally, we discuss the implications of these findings for educational practice and policy, considering how they relate to the historical context of resistance to technology adoption in education, with a particular focus on preparing educators for the integration of AI in their teaching practices.

Method

This literature meta-analysis organizes predictors of technology adoption into the following broad categorical factors:

  • Perceived Ease of Use and Perceived Usefulness 
  • Behavioral Intention
  • Moderating Factors 

These predictor categories are taken from Scherer, Siddiq, & Tondeur’s (2019) meta-analysis and Granic’s (2022) meta-analysis of technology acceptance studies, which largely used the Technology Acceptance Model (TAM) instrument(s) (Davis, 1989) to study educational technology adoption. While not all studies included in this analysis used the TAM or its iterations, they observed or studied factors that are included in it or similar to it and were thus considered appropriate for inclusion. Factors that are neither included in the TAM nor similar enough to be treated as TAM factors are discussed separately.

This compilation is intended as an extension of Scherer, Siddiq, & Tondeur’s and Granic’s analyses, adding further studies and resources focused on AI in an effort to provide insight into how to properly support educators transitioning into the age of artificial intelligence.

Criteria for Inclusion and Analytical Methods

Each selected study was published within the last 20 years, used teachers as the primary population, and focused on technology adoption. All of the studies in Granic’s (2022) meta-analysis and Scherer, Siddiq, & Tondeur’s (2019) meta-analysis were included, with the addition of 16 AI technology adoption studies and 30 other general EdTech adoption studies.

Excluding the studies pulled from Granic (2022) and Scherer, Siddiq, & Tondeur (2019), AI tools, in addition to Google Scholar and the Google search engine, were used to find and organize resources relevant to this meta-analysis.

Search terms included: "artificial intelligence and Technology Acceptance Model," "AI and TAM," "AI, educators, and TAM," "AI and educator acceptance," "AI tools in education," "adopting AI tools in education," "AI tools and teacher adoption," "teacher technology adoption," "teachers and TAM," and "educators and TAM."

The AI tools that were used for this meta-analysis extension were:

  • ChatGPT (4.0) – used for determining the strength and impact of studies, as well as identifying potential studies for inclusion
    • This was the primary tool used for analysis assistance; all other tools were used as checks against ChatGPT for accuracy, clarity, and reasoning, to determine whether there were any severe discrepancies.
  • Claude 3.5 (Anthropic) – used for determining the strength and impact of studies, as well as identifying potential studies for inclusion; additionally, it was used to assist in synthesizing findings for use in a summary table.
    • This was a supporting tool
  • Google Gemini – used for determining the strength and impact of studies, as well as identifying potential studies for inclusion
    • This was a supporting tool
  • Elicit – primarily used for identifying potential studies for inclusion; information from relevant studies was taken from here and imported into ChatGPT, Claude, and Gemini.
    • This was the primary AI tool used to identify potential studies, as well as extract information from studies. It is important to note that while this tool identifies specific information for studies and extracts it, the researcher still looked at each study and confirmed the extracted information.
  • Scite.ai – primarily used for identifying potential studies for inclusion; information from relevant studies was taken from here and imported into ChatGPT, Claude, and Gemini.
    • This was the secondary AI tool used to identify potential studies and to extract information from studies. It is important to note that while this tool identifies specific information for studies and extracts it, the researcher still looked at each study and confirmed the extracted information.

While fewer in number, studies that specifically dealt with AI as the technology being adopted were given special attention and double weight, considering that AI tools are the newest form of educational technology to be rapidly dispersing and permeating schools for teachers and students alike. This weighting was used only when comparing general EdTech adoption with AI adoption, because far more studies exist on general technology adoption than on AI adoption; the weighting made it easier to make comparisons between the two.

When analyzing the studies selected for this meta-analysis, three ratings were calculated for the key variables each study examined: strength, impact, and amount of evidence. (A minimal code sketch of these rules appears after the criteria list below.)

The criteria and definitions for the strength, impact, and amount of evidence categories are as follows:

  • Strength – this is synonymous with the significance (p-values) of statistical hypothesis tests, the correlation coefficients, and the regression weights and coefficients of the variable across the studies concerned with it. It is categorized as strong, moderate, or weak based on the findings across the number of supporting studies. These values were taken verbatim from the studies and then averaged by AI.
    • Strong – p values of less than .001, large F or t values, correlation coefficients of approximately [.6] or higher, large regression coefficients, and/or importance scores from models.
    • Moderate – p values of less than .05, moderate-sized F or t values, correlation coefficients of [.3] – [.5], moderate-sized regression coefficients, and/or importance scores from models.
    • Weak – p values very close to .05 or approaching statistical significance, small F or t values, correlation coefficients of less than [.3], and small regression coefficients and/or importance scores from models.
    • This was a categorical organization of studies by the researcher, followed by a comparison of frequencies of studies in each category by AI, resulting in an overall descriptor.
      • For example, if 10 studies reported strong relationships between two variables and 4 reported moderate relationships, ChatGPT was asked to give an overall rating of the variable; it would categorize the variable as strong because more than double the number of studies showed strong correlations or relationships than moderate ones, and none showed weak relationships.
  • Impact – this is concerned with effect sizes, population size, and the strength of each predictor across each study in its group. These are categorized into Very Low, Low, Moderate, High, Very High, and Essential categories, sometimes with a direction of positive or negative if the findings across all of the studies provide enough consensus.
    • This was similar to Strength but had to do with population size and effect sizes of results in the studies included in the analysis. Studies that dealt with differences between groups had their impact descriptors pulled directly from the text and categorized into the 6 bins mentioned above. In addition, the larger the sample size, the more weight a study was given by AI. An overall categorization was then calculated by ChatGPT by comparing the number of studies in each category and their weights.
    • As with the previous category of Strength, this was a categorical organization of studies by the researcher, followed by a comparison of frequencies of studies in each category by AI, resulting in an overall descriptor.
      • For example, if the analysis included 5 studies with large effect sizes (each with large N) and a strength score of high, 5 studies with moderate effect sizes (small N) and a strength score of high, and 5 studies with small effect sizes (small N) and a strength score of moderate, ChatGPT would take this information and likely categorize the variable as High, based on the frequencies of those characteristics.
  • Amount of Evidence – this is concerned with how many studies support the finding. These are categorized into:
    • Low – fewer than 6 large N studies or fewer than 8 studies/resources with low N
    • Moderate – between 8 to 12 studies/resources, with a minimum of one high N resource or experimental/quasi-experimental methods
    • High – 12+ studies, or more than 8 studies/resources, with two or more high N or experimental/quasi-experimental methods
  • AI Evidence – this is concerned only with the number of studies that support the predictor with regard to educator adoption of artificial intelligence.
    • This category is specific to evidence that deals with AI adoption as the technology in question. It is a numerical value: the number of studies that support the factor.
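To make the rules above concrete, below is a minimal sketch in Python of how the strength binning, the frequency vote for the overall descriptor, and the double weighting of AI studies could be expressed. The function names, the thresholds as coded, and the voting rule are illustrative reconstructions of the procedure described above, not the actual scripts used in this analysis.

from collections import Counter

def strength_bin(p=None, r=None):
    # Bin one study's reported statistics using the thresholds above:
    # strong (p < .001 or |r| >= .6), moderate (p < .05 or |r| >= .3), else weak.
    if (p is not None and p < .001) or (r is not None and abs(r) >= .6):
        return "strong"
    if (p is not None and p < .05) or (r is not None and abs(r) >= .3):
        return "moderate"
    return "weak"

def overall_descriptor(bins, is_ai_study=None, ai_weight=2):
    # Frequency vote across studies; AI-adoption studies count double,
    # but only in general-EdTech vs. AI comparisons.
    votes = Counter()
    for i, label in enumerate(bins):
        votes[label] += ai_weight if is_ai_study and is_ai_study[i] else 1
    return votes.most_common(1)[0][0]

# Example from the text: ten strong studies and four moderate ones yield "strong".
print(overall_descriptor(["strong"] * 10 + ["moderate"] * 4))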

It is important to note that these results were also compared to the findings from the previous two meta-analyses. Because this study is intended as an extension of those analyses, any discrepancies between it and the other two will be discussed.

Findings

Using the methods described above and replicating the categories from the TAM and Granic’s (2022) work, a summary table presents the findings from this meta-analysis extension.

The table of findings (Table 1) and the subsequent text present a comprehensive overview of factors influencing technology acceptance among educators based on a meta-analysis of the subject. The factors are categorized into three main groups: antecedents of Perceived Ease of Use (PEU) and Perceived Usefulness (PU), antecedents of Behavioral Intention (BI), and moderating factors, each of which is defined as follows:

  • Perceived Ease of Use
    • This refers to the degree to which a person believes that using a particular technology would be free of effort. It’s about how easy the user thinks it will be to learn and use the technology.
  • Perceived Usefulness
    • This is defined as the degree to which a person believes that using a particular technology would enhance their job performance or life in general. It’s about how beneficial or valuable the user thinks the technology will be.
  • Behavioral Intention
    • This refers to a person’s readiness to perform a given behavior. In the context of TAM, it’s the likelihood that a person will adopt and use the technology.
  • Moderating Factors
    • This refers to variables that can influence the main effect of other factors. This means the strength of certain factors can vary based on these.
  • The categories of Perceived Ease of Use (PEU) and Perceived Usefulness (PU) influence the user’s attitude toward using the technology, which in turn affects their Behavioral Intention (BI) to use it. Behavioral Intention is what ultimately determines a person’s actual usage of a technology. (A schematic of this path structure appears after this list.)
  • Therefore, when interpreting the antecedents or factors that influence PEU and PU, it is important to understand that they are essentially once removed from influencing the actual adoption and use of a technology: they are factors that have relationships with the factors that influence adoption. Antecedents of BI are not considered once removed, because BI essentially is technology adoption, and therefore any factor that influences it is closer in relationship than any PEU or PU factor.
  • Moderating factors are just that: variables that affect the strength of another factor’s relationship depending on the level of the moderator. For example, age moderates self-efficacy, which means that the effect of self-efficacy differs significantly depending on a participant’s age.
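For readers who prefer a compact notation, this chain can be written schematically as a set of path equations; the β coefficients below are illustrative placeholders, not estimates from this analysis:

Attitude = β1·PU + β2·PEU
BI = β3·Attitude + β4·PU
Actual Use = β5·BI

Reading the equations from top to bottom mirrors the “once removed” logic above: antecedents of PEU and PU sit two steps away from actual use, while antecedents of BI sit one step away.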

Table 1 – Summary Table

[Image: a summary table of factors influencing technology adoption, divided into three categories: PEU and PU Antecedents, Behavioral Intention Antecedents, and Moderating Factors. Columns list each Factor, its Description, Strength, Impact, Amount of Evidence, and AI Evidence (quantified numerically). Highlighted relationships include the strong effect of self-efficacy and system quality on PEU and PU and the role of age and gender as moderating factors.]

General TAM Model Factors – Gen Edtech Adoption

Among the PEU (Perceived Ease of Use) and PU (Perceived Usefulness) antecedents, several factors stand out as particularly influential, consistent with the previous findings of Granic (2022) and Scherer, Siddiq, & Tondeur (2019).

Self-efficacy, an individual’s judgment of their capability to use technology, shows a strong strength with a very large impact and high amount of evidence. This underscores the critical role of teachers’ confidence in their technological abilities (Scherer & Teo, 2019; Holden & Rada, 2011; Joo et al., 2018; Chuang et al., 2020; Leem & Sung, 2019; Celik & Yesilyurt, 2013; Wang et al., 2021).

Perceived Enjoyment and Technological Complexity both demonstrate strong strength with large impacts and high evidence, highlighting the importance of user experience in technology adoption. Perceived Enjoyment is shown to significantly enhance the likelihood of adoption when technology is engaging and enjoyable to use (Cheung & Vogel, 2013; Teo & Noyes, 2011; Padilla-Meléndez et al., 2013; Mun & Hwang, 2003; Chang et al., 2017; Leem & Sung, 2019). Technological Complexity, meanwhile, indicates that as complexity increases, adoption likelihood decreases unless sufficient support is provided (Celik & Yesilyurt, 2013; Aldunate & Nussbaum, 2013; Calisir et al., 2014; Hsu & Chang, 2013; Hanif et al., 2018; Tarhini et al., 2014).

Facilitating Conditions, which refer to resources and technology factors affecting usage, are deemed essential with strong strength and high evidence. This emphasizes the crucial role of institutional support and infrastructure in promoting technology adoption (Venkatesh et al., 2003; Fathema et al., 2015; Moran et al., 2010; Teo et al., 2016; Lawrence & Tar, 2018; Chiu, 2021; O’Bannon & Thomas, 2014).

Anxiety, on the other hand, shows a strong negative impact, indicating that teachers’ apprehensions about technology can significantly hinder adoption (Celik & Yesilyurt, 2013; Calisir et al., 2014; Howard, 2013; Chang et al., 2017; Chiu, 2021; Vongkulluksn et al., 2018).

In terms of Behavioral Intention antecedents, Self-efficacy again emerges as a critical factor with essential positive impact and high evidence. This reinforces the importance of building teachers’ confidence in their technological abilities (Scherer & Teo, 2019; Holden & Rada, 2011; Joo et al., 2018). Subjective Norm and Perceived Playfulness both show moderate strength but large impacts, suggesting that social influences and intrinsic motivation play significant roles in shaping intentions to use technology (Cheung & Vogel, 2013; Tarhini et al., 2014; Teo et al., 2018; Padilla-Meléndez et al., 2013; Scherer & Teo, 2019; Park et al., 2012; Vongkulluksn et al., 2018).

The moderating factors provide additional nuance to understanding technology acceptance. Age moderately affects the relationships between several key factors and behavioral intention (Tarhini et al., 2014; Wang et al., 2009; O’Bannon & Thomas, 2014; Scherer & Teo, 2019; Wong et al., 2012), while Gender shows a weaker influence (Tarhini et al., 2014; Wong et al., 2012; Teo & van Schaik, 2012; Park et al., 2012). Individual-level Cultural Values demonstrate moderate strength and impact, suggesting that cultural context plays a role in technology acceptance (Teo et al., 2008; Nistor et al., 2013; Sánchez-Prieto et al., 2020; Chuang et al., 2020). Notably, Technological Innovation strongly moderates the relationships between subjective norm, perceived usefulness, and behavioral intention, highlighting the importance of keeping pace with evolving technologies in education (Venkatesh et al., 2003; Moran et al., 2010; Leem & Sung, 2019; Teo et al., 2021).

Below are two figures that visualize the strengths (Figure 1) and the impacts (Figure 2) of each variable in the TAM according to the AI-assisted analysis. It is important to note that the values used to make these comparisons were calculated by AI, as described in the methodology section, and are summarized into categorical bins in Table 1.
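To make the binning step concrete, here is a minimal sketch, in Python, of how numeric strength scores might be mapped onto categorical labels; the scores, cut-points, and labels below are illustrative assumptions, not the study’s actual values or thresholds.

```python
# Illustrative sketch only: hypothetical scores and cut-points,
# not the study's actual AI-calculated values.
factor_scores = {
    "Self-efficacy": 4.6,
    "Perceived enjoyment": 3.9,
    "Technological complexity": 3.8,
    "Gender": 1.7,
}

def to_bin(score: float) -> str:
    """Map a 1-5 strength score onto a categorical label."""
    if score >= 4.5:
        return "Essential"
    if score >= 3.5:
        return "Strong"
    if score >= 2.5:
        return "Moderate"
    return "Weak"

for factor, score in factor_scores.items():
    print(f"{factor}: {score:.1f} -> {to_bin(score)}")
```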

Figure 1 – General EdTech Adoption factor strengths

A bar chart titled "Stacked Strength Levels Across Factors by Category (Excluding AI Evidence)", showcasing the strength levels of various factors influencing technology adoption. The chart is categorized into three groups: PEU and PU Antecedents (green), BI Antecedents (blue), and Moderating Factors (orange). Each factor is listed on the Y-axis, with their corresponding strength levels represented on the X-axis. Notable factors include Self-efficacy and System accessibility with high strength levels across categories, while factors like Gender and Individual-level cultural values show lower strength. The chart visually emphasizes the relative impact of each factor by category.

Figure 2 – General EdTech Adoption factor impacts

A bar chart titled "Impact Levels Across Factors by Category (Excluding AI Evidence)", displaying the impact levels of various factors influencing technology adoption. The chart is divided into three categories: PEU and PU Antecedents (purple), BI Antecedents (pink), and Moderating Factors (orange). Each factor is listed on the Y-axis, with their corresponding impact levels represented on the X-axis. Key observations include high impact levels for factors like Perceived playfulness, Self-efficacy, and Facilitating conditions, while Gender and Individual-level cultural values have lower impact levels. The visualization highlights the comparative influence of each factor within its category.

Non-TAM Factors – Gen EdTech Adoption

Perceived Risk, Expectancy, and Perceived Trust are important predictors of teachers’ technology adoption that are closely related to, but distinct from, the traditional predictors and their antecedents in the Technology Acceptance Model (TAM). The TAM primarily focuses on two broad categorical predictors, Perceived Usefulness and Perceived Ease of Use, which explain users’ acceptance of technology based on its utility and the effort required to use it; these larger constructs, again, can in turn be predicted by smaller groups of antecedents.

Perceived Risk involves the potential negative consequences or uncertainties that teachers associate with adopting new technology. Unlike the TAM, which doesn’t explicitly consider risk, Perceived Risk addresses teachers’ concerns about the reliability of technology, privacy issues, and the possibility of failure or negative outcomes (Howard, 2013; Teo et al., 2016). High perceived risks can deter educators from integrating technology into their teaching practices, as they might worry about disruptions, loss of classroom control, or being judged negatively by colleagues or administrators. By acknowledging these fears, schools can reduce perceived risks through reliable support and training, encouraging greater technology adoption.

Expectancy, similar to Perceived Usefulness in the TAM, refers to teachers’ beliefs about the likelihood that technology will lead to positive outcomes, such as enhanced instructional effectiveness or improved student learning. However, Expectancy expands beyond usefulness alone by incorporating elements of motivation and anticipated success (Hanif et al., 2018; Chang et al., 2017; Moran et al., 2010; Chen et al., 2008; Davis, 1989; Teo & van Schaik, 2012). When teachers believe that technology can make their jobs easier or more engaging for students, they are more likely to use it. This aligns with the principles of expectancy theory, which suggests that individuals are motivated to adopt behaviors they expect to lead to desired outcomes.

Perceived Trust is another crucial factor that goes beyond the traditional TAM components. It encompasses teachers’ confidence that the technology is reliable, secure, and capable of performing as needed, as well as trust in the organizations or institutions providing it (Teo et al., 2018; Scherer et al., 2021; Celik & Yesilyurt, 2013). In the TAM framework, trust is not explicitly addressed, yet it plays a significant role in technology adoption, particularly in environments like schools where ethical considerations and professional standards are paramount. When teachers trust the technology and its providers, they are more likely to adopt it, knowing it aligns with their educational goals and responsibilities. Building trust through consistent positive experiences and transparent communication about technology’s capabilities and limitations can significantly enhance adoption rates.

AI Adoption Factors

The adoption of AI tools by teachers in educational settings is influenced by a multifaceted array of factors that both align with and extend beyond the traditional components of the Technology Acceptance Model (TAM). While TAM emphasizes Perceived Ease of Use and Perceived Usefulness as central predictors of technology acceptance, the integration of AI-based educational technology requires a deeper examination of additional variables that reflect the unique characteristics and challenges associated with AI technologies (Chocarro et al., 2021; Wang et al., 2021).

Self-efficacy is a significant factor influencing AI tool adoption, showing strong evidence as an antecedent of both perceived ease of use and behavioral intention (Chatterjee & Bhattacharjee, 2020; Zhang et al., 2023; Ayanwale et al., 2022; Nja et al., 2023; Alhumaid et al., 2023). An individual’s confidence in their ability to use technology effectively is crucial; teachers who believe they possess the necessary skills to navigate AI tools are more likely to perceive these tools as user-friendly and are consequently more inclined to integrate them into their teaching practices (Choi, Jang, & Kim, 2022; Woodruff, Hutson, & Arnone, 2023; Nazaretsky, Cukurova, & Alexandron, 2021). This is consistent with the TAM, where self-efficacy directly contributes to perceived ease of use (Sánchez-Prieto et al., 2019; Zhang et al., 2021).

System Accessibility and Technological Complexity also emerge as critical determinants in the adoption of AI-based EdTech, closely mirroring the TAM’s focus on ease of use (Nazaretsky et al., 2021; Rico-Bautista et al., 2021; Chocarro, Cortiñas, & Marcos-Matás, 2021; Woodruff et al., 2023). The ease of access to AI tools and the simplicity with which they can be operated significantly influence their perceived usability (Wang et al., 2021; Nja et al., 2023; Al Darayseh, 2023). Technologies that are accessible and uncomplicated encourage adoption by minimizing the perceived effort required to use them, aligning with the TAM’s premise that simplicity enhances user acceptance (Sánchez-Prieto et al., 2019; Zhang et al., 2023; Cukurova et al., 2023).

Subjective Norms and Perceived Playfulness indicate moderate influence on both behavioral intentions and perceived usefulness (Wang et al., 2021; Chocarro et al., 2021; Al Darayseh, 2023; Ayanwale et al., 2022). Subjective norms, which refer to the influence of colleagues, administrators, and the broader educational community, can significantly shape teachers’ attitudes toward adopting AI tools (Choi et al., 2022; Zhang et al., 2021). If there is a positive perception of AI within these social circles, teachers are more likely to view these tools as beneficial, thus enhancing their perceived usefulness (Nazaretsky et al., 2021; Cukurova et al., 2023; Alhumaid et al., 2023). Perceived playfulness, or the intrinsic enjoyment derived from using AI tools, adds another layer to technology adoption, highlighting the importance of motivational factors that go beyond mere utility (Chocarro et al., 2021; Zhang et al., 2021; Nja et al., 2023). This suggests that in educational contexts, enjoyment and engagement can be critical for sustaining technology use, expanding the TAM’s traditional focus on functionality and ease (Woodruff et al., 2023; Chatterjee & Bhattacharjee, 2020).

Ethical Issues and Transparency are particularly relevant in the context of AI adoption and are not explicitly covered by TAM (Choi et al., 2022; Nazaretsky et al., 2021; Rico-Bautista et al., 2021). The integration of AI tools in education raises concerns about biases in AI algorithms, student data privacy, and the transparency of AI decision-making processes (Cukurova et al., 2023; Al Darayseh, 2023; Zhang et al., 2023). Teachers may hesitate to adopt AI tools if they perceive them as ethically questionable or if there is a lack of clarity about how decisions are made by these technologies (Nazaretsky et al., 2021; Choi et al., 2022; Alhumaid et al., 2023). This perceived lack of transparency can undermine trust, which is essential for the adoption of AI-based EdTech (Nja et al., 2023; Zhang et al., 2021). Addressing these concerns by providing clear, transparent explanations of AI functionality and ensuring ethical standards are upheld is crucial for fostering trust and acceptance (Rico-Bautista et al., 2021; Chatterjee & Bhattacharjee, 2020).

Anxiety about using AI technologies represents another factor that differs from traditional TAM components (Ayanwale et al., 2022; Chatterjee & Bhattacharjee, 2020; Woodruff et al., 2023). Anxiety reflects personal traits that cause apprehension or fear when engaging with new technology, and it can either hinder adoption or, when addressed, clear the way for it (Al Darayseh, 2023; Cukurova et al., 2023; Nazaretsky et al., 2021). Teachers who experience high levels of anxiety may perceive AI tools as difficult to use or may doubt their usefulness, hindering adoption (Woodruff et al., 2023; Zhang et al., 2021; Choi et al., 2022). Conversely, with adequate support and training, anxiety can be mitigated, enhancing perceived ease of use and fostering a more positive attitude toward AI tools (Wang et al., 2021; Nja et al., 2023; Alhumaid et al., 2023).

Cost and Time considerations also play a significant role in the adoption of AI-based EdTech, adding another dimension not explicitly covered by TAM (Cukurova et al., 2023; Alhumaid et al., 2023; Rico-Bautista et al., 2021; Nazaretsky et al., 2021). Teachers are more likely to adopt AI tools if they perceive them as cost-effective and time-saving (Wang et al., 2021; Zhang et al., 2023; Nja et al., 2023; Al Darayseh, 2023). However, if the adoption of these tools is seen as requiring substantial financial investment or adding to teachers’ workloads without clear benefits, resistance is likely (Woodruff et al., 2023; Alhumaid et al., 2023; Choi et al., 2022). This factor includes both the initial time and cost to learn and implement AI tools, as well as ongoing maintenance and the need for continuous professional development (Cukurova et al., 2023; Chatterjee & Bhattacharjee, 2020).

A further consideration influencing AI adoption, extending beyond the traditional TAM framework, is the Required Shift in Pedagogy (Nja et al., 2023; Al Darayseh, 2023; Choi et al., 2022; Woodruff et al., 2023). AI tools often necessitate a change in teaching methods, requiring educators to rethink how they deliver content and assess student learning (Cukurova et al., 2023; Chatterjee & Bhattacharjee, 2020; Zhang et al., 2023). The perceived need to modify pedagogical practices can create resistance, particularly among teachers accustomed to traditional methods, even if the perceived usefulness and ease of use of AI tools are high (Nazaretsky et al., 2021; Alhumaid et al., 2023; Sánchez-Prieto et al., 2019).

Finally, factors such as Individual-Level Cultural Values, Age, Gender, and Technological Innovation also moderate the adoption of AI tools, reflecting broader societal attitudes and individual differences that shape teachers’ willingness to embrace new technologies (Sánchez-Prieto et al., 2019; Zhang et al., 2023; Chocarro et al., 2021; Al Darayseh, 2023). Cultural values can influence perceptions of technology’s role in education, while demographic variables such as age and gender may affect comfort levels and attitudes toward AI (Alhumaid et al., 2023; Wang et al., 2021; Cukurova et al., 2023). The perception of technological innovation can encourage adoption if AI tools are seen as cutting-edge solutions that enhance teaching or discourage it if viewed as disruptive to established practices (Choi et al., 2022; Nazaretsky et al., 2021; Rico-Bautista et al., 2021).

Below is a visual (Figure 3) that organizes the amount of evidence (number of studies) supporting each TAM variable’s importance when AI adoption, as opposed to general technology adoption, is the dependent variable.

Figure 3 – TAM Factors AI Adoption Evidence

A bar chart titled "Combined AI Evidence Across Factors", representing the amount of AI-generated evidence supporting various factors influencing technology adoption. The factors are listed on the Y-axis, with the total AI evidence quantified on the X-axis. Self-efficacy shows the highest level of AI evidence, followed by System accessibility and Perceived playfulness. Lower levels of AI evidence are observed for factors such as Individual-level cultural values and Perceived enjoyment. The chart highlights the disparity in AI evidence across different factors, with some significantly more supported than others.

Synthesis

To make sense of the large amount of information gathered, the researcher attempted to estimate the strength of influence of each factor for each outcome: general EdTech adoption or AI-based EdTech adoption. While there is much more evidence for these factors in general EdTech adoption, the researcher took into consideration both the amount of literature available on AI adoption and the shorter length of time the tools have been available for research and evaluation. Below is a figure (Figure 4) comparing the two types of adoption predictors and their estimated strengths, which were calculated by AI as described earlier.
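As a rough illustration of how such a side-by-side comparison can be plotted, the following is a minimal matplotlib sketch of a grouped bar chart in the style of Figure 4; the predictor names come from the text, but the strength values are placeholders, not the study’s calculated figures.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder strengths on a 1-5 scale; NOT the study's actual values.
predictors = ["Self-efficacy", "Anxiety", "Technological\ncomplexity", "Subjective\nnorms"]
general_edtech = [5, 3, 4, 4]
ai_edtech = [5, 4, 4, 3]

x = np.arange(len(predictors))  # one slot per predictor
width = 0.35                    # width of each bar within a slot

fig, ax = plt.subplots(figsize=(8, 4))
ax.bar(x - width / 2, general_edtech, width, label="General EdTech")
ax.bar(x + width / 2, ai_edtech, width, label="AI-Based EdTech")
ax.set_xticks(x)
ax.set_xticklabels(predictors)
ax.set_ylabel("Strength of Influence (1-5)")
ax.set_title("Comparison of Adoption Predictors (illustrative values)")
ax.legend()
plt.tight_layout()
plt.show()
```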

Figure 4 – GenEdTech vs AI-Based EdTech Adoption Factors

A bar chart titled "Comparison of Adoption Predictors for General EdTech and AI-Based EdTech", illustrating the Strength of Influence (1-5) for various predictors across two categories: General EdTech (yellow) and AI-Based EdTech (orange). Predictors are listed on the X-axis, while their influence strength is represented on the Y-axis. Self-efficacy shows the highest influence across both categories, with Anxiety also standing out, particularly for AI-Based EdTech. Technological Innovation exhibits a higher strength for AI-Based EdTech, while Perceived Enjoyment/Playfulness has a lower strength overall. The chart highlights differences and similarities in predictor strengths between the two categories.

Key Comparisons and Contrasts

  • Self-efficacy:
    • A critical factor for both general EdTech and AI-based EdTech. Confidence in using technology is crucial across both types.
  • Subjective Norms:
    • More influential for general EdTech. Social influence is less significant for AI-based tools, where personal judgment plays a bigger role.
  • Perceived Enjoyment and Perceived Playfulness:
    • More impactful for general EdTech, indicating that enjoyment and intrinsic motivation are key for adopting traditional technologies, whereas AI tools are viewed more for their practical utility.
  • System Quality and System Accessibility:
    • Important for both general and AI-based EdTech. The quality and ease of access to technology are consistently significant factors.
  • Technological Complexity:
    • A major barrier for both, but particularly pronounced for AI-based EdTech (tied with Technological Innovation for second-strongest influencer in AI adoption, compared with a four-way tie for second-strongest influencer in general EdTech adoption) due to the perceived advanced nature of many of these tools according to the literature.
      • Multiple studies on AI adoption discuss the strengths of Technological Complexity and Technological Innovation as closely related, because in many cases both concern perceptions of the technology.
  • Facilitating Conditions:
    • Essential for both types of technologies. The availability of resources and support strongly influences adoption.
  • Anxiety:
    • Significant for both, with a higher impact on AI-based EdTech due to the perceived complexity and novelty associated with AI tools.
  • Moderating Factors (Age, Gender, Cultural Values, Technological Innovation):
    • These factors have a moderate influence across both general and AI-based EdTech, with innovation perceived as slightly more influential for AI tools due to the large impact generative AI, such as LLMs, has had on education and the world in general.

Model

As a companion to the synthesis, a visual model was constructed to help explain the relationships between predictive factors and AI adoption. It depicts the strength of each relationship with lines of varying thickness: thicker, more pronounced lines indicate stronger relationships, and thinner lines indicate weaker ones.

A conceptual diagram illustrating the relationships between factors influencing AI adoption within the framework of the Technology Acceptance Model (TAM). At the center are two constructs: Perceived Ease of Use and Perceived Usefulness. Perceived Ease of Use is influenced by factors such as self-efficacy, system accessibility, system quality, technology complexity, anxiety, facilitating conditions, and subjective norms. Perceived Usefulness is influenced by many of the same factors and additional contextual considerations such as ethical concerns, pedagogical shifts, and cost and time. Moderating factors, including age, gender, technological innovation, and cultural values, refine the impact of these constructs on Behavioral Intention, which leads directly to AI adoption. The diagram uses arrows to represent the direction and strength of relationships, with thicker arrows indicating major relationships and thinner arrows representing minor ones. A gradient color bar in the corner indicates the importance of factors, with darker blue representing higher importance. Shapes distinguish TAM factors, represented as rectangles, from non-TAM factors, shown as ellipses. The overall flow highlights the interconnected pathways leading to AI adoption.
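For readers who want to reproduce a diagram of this kind, below is a minimal sketch using the Python graphviz package. The nodes shown are a small, assumed subset of the full model, and the penwidth values encoding line thickness are illustrative, not the study’s estimated strengths.

```python
from graphviz import Digraph  # pip install graphviz; also requires the Graphviz binaries

g = Digraph("tam_ai_sketch")
g.attr(rankdir="LR")  # lay the diagram out left to right

# TAM constructs as rectangles, non-TAM antecedents as ellipses,
# and the final outcome as a hexagon, mirroring the model's shape scheme.
for tam in ["Perceived Ease of Use", "Perceived Usefulness", "Behavioral Intention"]:
    g.node(tam, shape="rectangle")
for antecedent in ["Self-Efficacy", "Anxiety", "Cost & Time"]:
    g.node(antecedent, shape="ellipse")
g.node("AI Adoption", shape="hexagon")

# Edge thickness (penwidth) stands in for relationship strength.
g.edge("Self-Efficacy", "Perceived Ease of Use", penwidth="3")
g.edge("Anxiety", "Perceived Ease of Use", penwidth="2")
g.edge("Cost & Time", "Perceived Usefulness", penwidth="2")
g.edge("Perceived Ease of Use", "Behavioral Intention", penwidth="3")
g.edge("Perceived Usefulness", "Behavioral Intention", penwidth="3")
g.edge("Behavioral Intention", "AI Adoption", penwidth="3")

g.render("tam_ai_sketch", format="png", cleanup=True)  # writes tam_ai_sketch.png
```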

Key Components in the Diagram

  1. Legend and Symbols:
    • Blue Gradient Bar (Top Left): Indicates the importance of each factor, with darker colors representing more important factors.
    • Arrows:
      • Thick Arrows: Major relationships that strongly influence other components.
      • Thin Arrows: Minor relationships that have a weaker influence.
    • Shapes:
      • Rounded Rectangles: Different types of factors affecting AI adoption.
      • Blue Ellipses and Circles: Non-TAM factors (not part of the original Technology Acceptance Model) that impact the process.
      • Yellow and Light Blue Rectangles: TAM factors that are central to the Technology Acceptance Model.
      • Green Rectangle: Behavioral intention, representing the intent to adopt AI.
      • Hexagon: Final AI adoption outcome.
  2. TAM Factors (Core Factors in Technology Acceptance Model):
    • Perceived Ease of Use: Reflects the user’s belief that using AI will be free of effort.
    • Perceived Usefulness: Refers to the belief that AI will enhance the user’s effectiveness or job performance.
  3. Non-TAM Factors (Additional Factors in the Extended Model):
    • System Quality: The overall performance and reliability of the AI system, which affects both ease of use and usefulness.
    • Technology Complexity: How difficult or complex the AI technology is to understand and use, impacting perceived ease of use.
    • Anxiety: Users’ level of apprehension or discomfort with AI, influencing perceived ease of use.
    • Self-Efficacy: The user’s confidence in their ability to use AI effectively.
    • System Accessibility: How easily users can access and engage with the AI technology, impacting perceived ease of use.
    • Perceived Enjoyment & Playfulness: Users’ perception of AI as enjoyable or fun, affecting their willingness to use it.
  4. Additional Contextual Factors:
    • Facilitating Conditions: External support, resources, or infrastructure that help users adopt AI.
    • Subjective Norms: Social pressures or expectations that influence users’ perceived ease of use and usefulness.
    • Ethical Considerations: Ethical concerns regarding AI use, which influence the perceived usefulness and acceptance of AI.
    • Pedagogical Shift: Changes in teaching or training approaches due to AI, impacting perceived usefulness.
    • Cost & Time: Resources required for adopting AI, which influence perceived usefulness.
  5. Moderators:
    • Age, Gender, Technological Innovation, and Cultural Values: Factors that modify the effect of ease of use and usefulness on behavioral intention. These variables help explain differences in AI adoption across demographic or cultural groups.

Flow and Relationships

  • Direct Influences on Perceived Ease of Use:
    • System Quality, Technology Complexity, Anxiety, Self-Efficacy, and System Accessibility directly impact users’ perception of how easy AI is to use.
    • Facilitating Conditions and Subjective Norms also play a role, though they are more indirect and have a minor influence.
  • Direct Influences on Perceived Usefulness:
    • Factors such as Ethical Considerations, Pedagogical Shift, Cost & Time, Subjective Norms, and Perceived Enjoyment & Playfulness contribute to how useful the AI is perceived.
    • System Quality and Technology Complexity impact usefulness as well, emphasizing the importance of a well-designed system that isn’t overly complex.
  • Influences on Behavioral Intention:
    • Perceived Ease of Use and Perceived Usefulness are the two primary drivers of behavioral intention. They influence whether users intend to adopt AI.
    • Moderators such as age, gender, technological innovation, and cultural values adjust the impact of ease of use and usefulness on behavioral intention, recognizing that these factors are not universally felt in the same way by all users.
  • Final Outcome – AI Adoption:
    • Behavioral Intention ultimately leads to AI adoption, where a higher intention to use AI translates to a greater likelihood of adoption.

Implications and Recommendations for AI Adoption in Education

The adoption of AI-based educational technology (EdTech) shares several predictors with general EdTech adoption but also presents unique challenges that require additional considerations. Understanding these similarities and differences has significant implications for educators, policymakers, and developers working to integrate AI into educational settings.

Similarities between AI Adoption and General EdTech Adoption

  1. Perceived Ease of Use (PEU) and Perceived Usefulness (PU)
    Both AI and general EdTech adoption are heavily influenced by teachers’ perceptions of how easy the technology is to use (PEU) and how useful it is for enhancing teaching and learning (PU) (Davis, 1989). Factors like self-efficacy, system quality, and facilitating conditions play critical roles in shaping these perceptions (Scherer, Siddiq, & Tondeur, 2019; Venkatesh et al., 2003).
  2. Behavioral Intention (BI)
    Self-efficacy and system accessibility are crucial in predicting BI, which represents the teacher’s intention to adopt the technology—an aspect essential for any technology adoption in educational contexts (Scherer, Siddiq, & Tondeur, 2019). Similarly, Fathema, Shannon, & Ross (2015) highlight that behavioral intention in educational settings is driven by the teachers’ confidence and access to the technology.
  3. Moderating Factors
    Age, gender, and cultural values also moderately influence both AI and general EdTech adoption, suggesting that demographic and cultural factors shape how different groups perceive and adopt technology (Tarhini, Hone, & Liu, 2014; Sánchez-Prieto et al., 2020). These moderating factors help explain diverse adoption rates and attitudes among teachers.

Differences between AI Adoption and General EdTech Adoption

  1. Technological Complexity and Anxiety
    AI tools are often perceived as more complex than general EdTech, presenting a higher barrier to adoption, which requires a more significant focus on training and support to enhance teachers’ confidence and ability to use these tools effectively (Howard, 2013; Nazaretsky, Cukurova, & Alexandron, 2021). Anxiety frequently accompanies perceived complexity, particularly with AI tools, as teachers may fear engaging with technologies they do not fully understand (Celik & Yesilyurt, 2013).
  2. Ethical Concerns and Transparency
    AI adoption introduces ethical concerns, such as data privacy, algorithmic bias, and transparency in decision-making processes, which are less prevalent in general EdTech (Selwyn, 2019; Holmes et al., 2022). Addressing these issues is crucial for building trust and ensuring responsible AI use in education (Chocarro, Cortiñas, & Marcos-Matás, 2021).
  3. Cost and Time
    The perceived cost and time associated with adopting AI tools are often higher than with general EdTech. Implementing AI may require significant investment in hardware, software, and professional development, adding a burden that schools must address to facilitate adoption (Woodruff, Hutson, & Arnone, 2023). Such concerns echo Cuban’s (1986) findings on the historical costs associated with adopting educational technology.
  4. Required Shift in Pedagogy
    Unlike general EdTech, which can often be integrated with minimal changes to teaching methods, AI tools may necessitate significant shifts in pedagogy. Educators may need to rethink instructional strategies, assessments, and classroom management techniques to effectively incorporate AI into their practices (Trust et al., 2023; Zhang et al., 2023).

Implications for Educators and Policymakers

Given these similarities and differences, several key implications arise for those involved in integrating AI into education. It is essential to note that these actions primarily depend on support from education leaders and administrators, as many are beyond the scope of what a classroom teacher can accomplish.

  1. Need for Comprehensive Training and Support
    Training programs must go beyond basic technology skills to include in-depth knowledge of AI tools, emphasizing the development of self-efficacy and reducing anxiety related to technological complexity (Ertmer, 1999). Alhumaid et al. (2023) also stress the importance of targeted training in building teachers’ confidence with AI tools.
  2. Focus on Ethical Use and Transparency
    Educational institutions must develop guidelines on data privacy, algorithmic transparency, and equitable AI use (Selwyn, 2011; Al Darayseh, 2023). Transparent communication about how AI systems work and their ethical implications is fundamental to fostering trust and encouraging adoption.
  3. Consideration of Cost and Time Investments
    Policymakers and school administrators should consider the higher costs and time investments associated with AI adoption. Providing financial support for purchases, as well as allocating time for professional development, can alleviate these barriers (Zhai et al., 2021; Buabeng-Andoh, 2012).
  4. Support for Pedagogical Shifts
    As AI tools may require changes in teaching methods, there should be support structures in place to help teachers adapt their pedagogy. Providing resources, exemplars, and collaborative opportunities will enable educators to explore new instructional strategies enabled by AI technologies (Zawacki-Richter et al., 2019; Chuang, Shih, & Cheng, 2020).

Recommendations for Promoting AI Adoption

Based on the aforementioned implications, the following recommendations are proposed to promote AI adoption in educational settings. Strong support from district leaders and administrators will be critical, as subjective norms and facilitating conditions depend heavily on institutional support. Table 2 below pairs each recommendation with the predictor(s) it addresses and relevant Michigan Virtual resources; a detailed breakdown of each recommendation follows.

Table 2 – Recommendations

  • Develop Targeted and Comprehensive Training Programs
    • Predictor(s) addressed: Anxiety, Pedagogical Shift, Self-Efficacy
    • Michigan Virtual resources: AI Planning Framework for Districts; Integration Framework; Michigan Virtual AI Workshops; Michigan Virtual AI Courses
  • Simplify AI Tools and Ensure Usability
    • Predictor(s) addressed: Technological Complexity, Self-Efficacy
    • Michigan Virtual resources: Educator AI Support; AI Video Library
  • Strengthen Institutional Support and Facilitate Access
    • Predictor(s) addressed: System Accessibility, Facilitating Conditions, Cost and Time
    • Michigan Virtual resources: AI Integration Guide; Integration Framework
  • Promote Ethical Awareness and Transparency
    • Predictor(s) addressed: Ethical Considerations, Cost and Time
    • Michigan Virtual resources: AI Usage Guidelines
  • Highlight Practical Benefits and Encourage Innovation
    • Predictor(s) addressed: Pedagogical Shift, Subjective Norms, Perceived Enjoyment and Playfulness
    • Michigan Virtual resources: AI Resource Bank; Student Usage of AI

  1. Develop Targeted and Comprehensive Training Programs
    Professional development should not only enhance teachers’ technical skills but also address unique aspects of AI, such as ethical considerations and pedagogical integration. Tailoring these programs to different demographic groups can address varying levels of comfort with technology (Ertmer, 1999; Alhumaid et al., 2023).
  2. Simplify AI Tools and Ensure Usability
    Collaboration with developers to create intuitive, user-friendly AI tools, along with usability guides, can reduce technological complexity (Nazaretsky, Cukurova, & Alexandron, 2021).
  3. Strengthen Institutional Support and Facilitate Access
    Robust institutional support through resources, technical assistance, and community-building channels allows teachers to share experiences and learn collaboratively (Woodruff, Hutson, & Arnone, 2023).
  4. Promote Ethical Awareness and Transparency
    Clear guidelines on AI ethics are crucial. Workshops and discussions on ethical use will help build trust and ensure teachers understand the broader implications of AI in their classrooms (Selwyn, 2011; Al Darayseh, 2023).
  5. Highlight Practical Benefits and Encourage Innovation
    Large-scale studies, case studies, and personal stories that showcase AI’s benefits in enhancing learning and teaching efficiency can inspire adoption and demonstrate AI’s value in solving educational challenges (Zhang et al., 2023; Choi, Jang, & Kim, 2022).

Limitations and Transparency

This meta-analysis was conducted with extensive use of AI tools, as discussed in the methodology section, under professional researcher supervision and with checks for accuracy. This approach marks a departure from traditional meta-analytic studies, which often adhere to specific, manual procedures for analyzing and selecting studies (Gough, Oliver, & Thomas, 2017). In contrast, this study leveraged artificial intelligence, specifically large language models (LLMs), to streamline data categorization, sorting, and initial analytics.

While these methods diverge from traditional practices, they are not without merit or justification. Hullman (2024) emphasizes the practical potential and accuracy of LLMs as tools for categorization and data analysis across diverse datasets. ChatGPT operated in this study as a rule-driven assistant, categorizing, sorting, and performing preliminary analytics to deliver digestible results, which the researcher then used for comparative analysis across categories.
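To illustrate what a rule-driven categorization pass of this kind can look like in practice, here is a minimal sketch using the OpenAI Python client; the model name, prompt wording, and category labels are assumptions for illustration and do not reproduce the study’s actual prompts or pipeline.

```python
from openai import OpenAI  # pip install openai; expects OPENAI_API_KEY in the environment

client = OpenAI()

# Hypothetical rule set; the study's actual instructions are not reproduced here.
RULES = (
    "You are a categorization assistant. Given a study abstract, return exactly one "
    "label from this list: PEU_antecedent, PU_antecedent, BI_antecedent, moderator. "
    "Return only the label, nothing else."
)

def categorize(abstract: str) -> str:
    """Ask the model to assign one predefined category to a study abstract."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": RULES},
            {"role": "user", "content": abstract},
        ],
    )
    return response.choices[0].message.content.strip()

# Hypothetical usage; every label would still be cross-verified by the researcher.
print(categorize(
    "This study examines how computer self-efficacy predicts teachers' "
    "perceived ease of use of a learning platform."
))
```

Used this way, the LLM never decides which studies count as evidence; it only applies researcher-defined rules whose outputs are then checked by hand.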

Research supports the idea that AI can perform categorization tasks with near-human accuracy. For instance, Khraisha, Put, Kappenberg, Warraitch, and Hadfield (2024) found that while humans generally outperform AI in meta-analytic procedures, LLMs are highly capable of following structured instructions for categorization and basic analytics, achieving “almost perfect performance on par with humans.” Similarly, studies by Shank and Wilson (2023) and Chelli et al. (2024) discuss the strengths of AI in handling large datasets when guided by specific parameters.

Conversely, some studies (Chelli et al., 2024; Cheloff, 2023) indicate limitations in using LLMs exclusively for systematic literature reviews due to issues with accuracy and recall. This study, however, mitigated these concerns by using LLMs not as exclusive tools but as aids in finding and sorting studies under researcher-defined rules, with all outputs cross-verified for accuracy (Lewis et al., 2023).

Ultimately, while AI in research and analytics is still in its early stages and has limitations, it also presents advantages, offering efficiency and accuracy comparable to human performance when used with appropriate guidance (Hancock & Beebe, 2023; Song, 2024).

Conclusions

This meta-analysis confirms the persistence of key factors influencing educators’ adoption of technology, with specific emphasis on AI in educational settings. As noted in the introduction, resistance to technology in education—from calculators to computers—has followed a familiar trajectory, and AI is no exception. This study, building on foundational models like the Technology Acceptance Model (TAM) and extending the work of Scherer, Siddiq, and Tondeur (2019) and Granić (2022), highlights that Perceived Ease of Use (PEU) and Perceived Usefulness (PU) continue to be central determinants in teachers’ acceptance of AI technologies. However, AI introduces complexities beyond those seen with previous technologies.

In particular, critical factors such as Self-Efficacy, Cost & Time, and the Required Pedagogical Shift emerged as highly influential in this meta-analysis. Self-efficacy, or teachers’ confidence in their ability to use AI tools, is a significant predictor of adoption, underscoring the need for targeted professional development and support. This echoes Ertmer’s (1999) findings about the importance of training in overcoming technological resistance. Cost & Time, often overlooked in broader technology adoption discussions, play a more pronounced role in AI adoption, as teachers perceive AI tools to require significant financial investment and time to learn. This aligns with Zhai et al.’s (2021) findings that teachers feel unprepared and overwhelmed by the demands of new technology, a barrier that could hinder adoption unless schools provide resources and time for teachers to adapt.

The Required Pedagogical Shift further complicates AI adoption. Unlike previous technologies that have been integrated with extensive support to help teachers understand how to utilize the tools effectively in their current pedagogical contexts, AI tools often necessitate a fundamental rethinking of teaching strategies and classroom management. Teachers may resist adopting AI because it challenges traditional teaching practices, an issue identified in both this meta-analysis and by Trust et al. (2023), who noted similar concerns about the impact of AI on instructional methods and critical thinking skills. Overcoming this resistance will require not just technical training but support for pedagogical innovation.

Anxiety, a recurring theme in technology adoption, was another strong predictor of resistance. Educators expressed significant apprehension about the complexity and perceived risks of AI, particularly in relation to job displacement and the ethical challenges of AI in education, such as data privacy and bias. This anxiety, as reported by Holmes et al. (2022) and highlighted in our findings, must be addressed by increasing transparency, providing ethical guidelines, and offering continuous support to educators as they integrate AI into their practices.

Beyond these core factors, System Accessibility and Technological Complexity remain central barriers to AI adoption as teachers struggle with the perceived difficulty of navigating AI tools. This echoes historical patterns of resistance, as seen in Cuban’s (1986) documentation of educators’ reluctance to adopt past technologies. Overcoming these obstacles, much like with earlier innovations, will depend on creating user-friendly systems and reducing the perceived burden on teachers.

While addressing these factors is crucial, it is also essential to acknowledge the value of resistance to technology adoption, as highlighted by The Friction Project (Sutton & Rao, 2024) and Noise: A Flaw in Human Judgment (Kahneman, Sibony, & Sunstein, 2021), which argue that friction can serve as a healthy barrier, encouraging thoughtful consideration and critical questioning of new tools before widespread adoption. In this context, resistance to AI may signal legitimate concerns that should be addressed rather than dismissed. For instance, educators’ reluctance to adopt AI may stem from valid ethical considerations, such as privacy concerns and the impact on students’ critical thinking skills. Recognizing these concerns and addressing them thoughtfully can ensure that adoption is more purposeful and aligned with educational goals, rather than simply following technological trends.

In conclusion, this meta-analysis identifies a set of predictors—PEU, PU, Self-Efficacy, Anxiety, Cost & Time, and Required Pedagogical Shift—that both align with historical trends and reveal new challenges unique to AI. While core principles of technology acceptance, such as Perceived Usefulness and Self-Efficacy, continue to shape adoption, the high demands of AI in terms of time, cost, and pedagogical adjustment suggest that this latest technological wave requires more comprehensive and tailored support than its predecessors. Addressing these factors, along with mitigating ethical concerns, reducing anxiety, and recognizing the constructive role of resistance, will be essential for overcoming barriers and ensuring that AI can effectively enhance educational practices. This approach aligns with the ideas of Piaget (1936) and Bransford et al. (2000), who advocated for thoughtful integration of technology to foster meaningful learning experiences, while also taking seriously the reasons for friction in AI adoption and addressing them.

References

Agarwal, R., & Prasad, J. (1998). A conceptual and operational definition of personal innovativeness in the domain of information technology. Information Systems Research, 9(2), 204-215.

Al Darayseh, A. (2023). Acceptance of artificial intelligence in teaching science: Science teachers’ perspective. Computers and Education: Artificial Intelligence, 4, 100132.

Alhumaid, K., Ali, S., Waheed, A., Zahid, E., & Habes, M. (2021). COVID-19 & elearning: Perceptions & attitudes of teachers towards e-learning acceptance in the developing countries. Multicultural Education, 7(2), 100-115.

Alhumaid, K., Naqbi, S., Elsori, D., & Mansoori, M. (2023). The adoption of artificial intelligence applications in education. International Journal of Data and Network Science, 7(1), 457-466.

Aldunate, R., & Nussbaum, M. (2013). Teacher adoption of technology. Computers in Human Behavior, 29(3), 519-524.

Ayanwale, M. A., Sanusi, I. T., Adelana, O. P., Aruleba, K. D., & Oyelere, S. S. (2022). Teachers’ readiness and intention to teach artificial intelligence in schools. Computers and Education: Artificial Intelligence, 3, 100099.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). National Academy Press.

Buabeng-Andoh, C. (2012). Factors influencing teachers’ adoption and integration of information and communication technology into teaching: A review of the literature. International Journal of Education and Development using Information and Communication Technology, 8(1), 136-155.

Calisir, F., Altin Gumussoy, C., Bayraktaroglu, A. E., & Karaali, D. (2014). Predicting the intention to use a web‐based learning system: Perceived content quality, anxiety, perceived system quality, image, and the technology acceptance model. Human Factors and Ergonomics in Manufacturing & Service Industries, 24(5), 515-531.

Celik, V., & Yesilyurt, E. (2013). Attitudes to technology, perceived computer self-efficacy and computer anxiety as predictors of computer supported education. Computers & Education, 60(1), 148-158.

Chatterjee, S., & Bhattacharjee, K. K. (2020). Adoption of artificial intelligence in higher education: A quantitative analysis using structural equation modelling. Education and Information Technologies, 25, 3443-3463.

Chelli, A., Collins, T., Darwish, M., Fan, Y., & Kraus, M. (2024). The limitations of AI in literature reviews: A comparative study. Journal of Digital Scholarship, 11(2), 45-62.

Chelli, M., Descamps, J., Lavoué, V., Trojani, C., Azar, M., Deckert, M., … & Ruetsch-Chelli, C. (2024). Hallucination Rates and Reference Accuracy of ChatGPT and Bard for Systematic Reviews: Comparative Analysis. Journal of Medical Internet Research, 26, e53164.

Cheloff, S. (2023). AI and systematic reviews: Challenges and considerations. New York, NY: Research Press.

Cheloff, A. Z., Pochapin, M., & Popov, V. (2023). S1726 Publicly Available Generative Artificial Intelligence Programs Are Currently Unsuitable for Performing Meta-Analyses. Official Journal of the American College of Gastroenterology | ACG, 118(10S), S1287.

Chen, I. J., Yang, K. F., Tang, F. I., Huang, C. H., & Yu, S. (2008). Applying the technology acceptance model to explore public health nurses’ intentions towards web-based learning: A cross-sectional questionnaire survey. International journal of nursing studies, 45(6), 869-878.

Cheung, R., & Vogel, D. (2013). Predicting user acceptance of collaborative technologies: An extension of the technology acceptance model for e-learning. Computers & education, 63, 160-175.

Chang, C. T., Hajiyev, J., & Su, C. R. (2017). Examining the students’ behavioral intention to use e-learning in Azerbaijan? The general extended technology acceptance model for e-learning approach. Computers & Education, 111, 128-143.

Cheng, Y. M. (2019). How does task-technology fit influence cloud-based e-learning continuance and impact? Education + Training, 61(4), 480-499.

Chiu, T. K. (2021). Applying the self-determination theory (SDT) to explain student engagement in online learning during the COVID-19 pandemic. Journal of Research on Technology in Education, 54(1), 14-30.

Chocarro, R., Cortiñas, M., & Marcos-Matás, G. (2021). Teachers’ attitudes towards chatbots in education: a technology acceptance model approach considering the effect of social language, bot proactiveness, and users’ characteristics. Educational Studies, 49, 295–313.

Choi, S. Y., Jang, Y., & Kim, H. (2022). Influence of Pedagogical Beliefs and Perceived Trust on Teachers’ Acceptance of Educational Artificial Intelligence Tools. International Journal of Human–Computer Interaction, 39, 910–922.

Chuang, H. H., Shih, C. L., & Cheng, M. M. (2020). Teachers’ perceptions of culturally responsive teaching in technology-supported learning environments. British Journal of Educational Technology, 51(6), 2442-2460.

Cuban, L. (1986). Teachers and machines: The classroom use of technology since 1920. Teachers College Press.

Cukurova, M., Miao, X., & Brooker, R. (2023, June). Adoption of artificial intelligence in schools: unveiling factors influencing teachers’ engagement. In International conference on artificial intelligence in education (pp. 151-163). Cham: Springer Nature Switzerland.

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340.

Ertmer, P. A. (1999). Addressing first- and second-order barriers to change: Strategies for technology integration. Educational Technology Research and Development, 47(4), 47-61.

Fathema, N., Shannon, D., & Ross, M. (2015). Expanding the Technology Acceptance Model (TAM) to examine faculty use of Learning Management Systems (LMSs) in higher education institutions. Journal of Online Learning & Teaching, 11(2), 210-232.

Gough, D., Oliver, S., & Thomas, J. (2017). An introduction to systematic reviews. London, UK: SAGE Publications.

Granić, A. (2022). Educational technology adoption: A systematic review. Education and Information Technologies, 27(7), 9725–9744. https://doi.org/10.1007/s10639-022-10951-7

Guillén-Gámez, F. D., & Mayorga-Fernández, M. J. (2020). Identification of variables that predict teachers’ attitudes toward ICT in higher education for teaching and research: A study with regression. Sustainability, 12(4), 1312.

Hancock, R., & Beebe, S. (2023). The impact of large language models on data analytics: A study of efficiency and accuracy. Journal of Data Science and AI, 6(1), 29-47.

Hanif, A., Jamal, F. Q., & Imran, M. (2018). Extending the technology acceptance model for use of e-learning systems by digital learners. Ieee Access, 6, 73395-73404.

Holden, H., & Rada, R. (2011). Understanding the influence of perceived usability and technology self-efficacy on teachers’ technology acceptance. Journal of Research on Technology in Education, 43(4), 343-367.

Howard, S. K. (2013). Risk-aversion: Understanding teachers’ resistance to technology integration. Technology, Pedagogy and Education, 22(3), 357-372.

Hsu, H. H., & Chang, Y. Y. (2013). Extended TAM model: Impacts of convenience on acceptance and use of Moodle. US-China Education Review, 3(4), 211-218.

Hsu, L. (2020). Factors affecting adoption of digital teaching in elementary school English: A mixed methods study. Computer Assisted Language Learning, 1-23.

Hullman, J. (2024). Practical applications of LLMs in data categorization and analysis. Journal of Computational Methods, 8(1), 12-25.

Hullman, J. (2024, June 24). Forking paths in LLMs for data analysis. Statistical Modeling, Causal Inference, and Social Science. https://statmodeling.stat.columbia.edu/2024/06/24/forking-paths-in-llms-for-data-analysis/

Joo, Y. J., Park, S., & Lim, E. (2018). Factors influencing preservice teachers’ intention to use technology: TPACK, teacher self-efficacy, and technology acceptance model. Educational Technology & Society, 21(3), 48-59.

Kahneman, D., Sibony, O., & Sunstein, C. R. (2021). Noise: A flaw in human judgment. New York: Little, Brown Spark.

Khraisha, Q., Put, S., Kappenberg, J., Warraitch, A., & Hadfield, K. (2024). Can large language models replace humans in systematic reviews? Evaluating GPT‐4’s efficacy in screening and extracting data from peer‐reviewed and grey literature in multiple languages. Research Synthesis Methods.

König, J., Jäger-Biela, D. J., & Glutsch, N. (2020). Adapting to online teaching during COVID-19 school closure: Teacher education and teacher competence effects among early career teachers in Germany. European Journal of Teacher Education, 43(4), 608-622.

Lawrence, J. E., & Tar, U. A. (2018). Factors that influence teachers’ adoption and integration of ICT in teaching/learning process. Educational Media International, 55(1), 79-105.

Leem, J., & Sung, E. (2019). Teachers’ beliefs and technology acceptance concerning smart mobile devices for SMART education in South Korea. British Journal of Educational Technology, 50(2), 601-613.

Lewis, C., Martinez, F., Olson, H., & Chen, L. (2023). Ensuring accuracy in AI-assisted systematic reviews: A mixed-methods approach. International Journal of AI in Research, 2(4), 205-223.

Liu, H., Wang, L., & Koehler, M. J. (2019). Exploring the intention-behavior gap in the technology acceptance model: A mixed-methods study in the context of foreign-language teaching in China. British Journal of Educational Technology, 50(5), 2536-2556.

Mac Callum, K., Jeffrey, L., & Kinshuk. (2014). Factors impacting teachers’ adoption of mobile learning. Journal of Information Technology Education: Research, 13, 141-162.

Mailizar, M., Burg, D., & Maulina, S. (2021). Examining university teachers’ acceptance of learning management system (LMS) in Indonesia: A rasch analysis approach. Education and Information Technologies, 26(4), 4089-4108.

McGehee, N. (2023) Balancing the Risks and Rewards of AI Integration for Michigan Teachers. Michigan Virtual. https://michiganvirtual.org/research/publications/balancing-the-risks-and-rewards-of-ai-integration-for-michigan-teachers/

McGehee, N. (2024, June 21). AI in education: Student usage in online learning. Michigan Virtual Learning Research Institute. https://michiganvirtual.org/research/publications/ai-in-education-student-usage-in-online-learning/

Michigan Virtual. (2024). AI in Education: Exploring Trust, Challenges, and the Push for Implementation. https://michiganvirtual.org/research/publications/ai-in-education-exploring-trust-challenges-and-the-push-for-implementation/

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior, 45, 359-374.

Moran, M., Hawkes, M., & Gayar, O. E. (2010). Tablet personal computer integration in higher education: Applying the unified theory of acceptance and use technology model to understand supporting factors. Journal of educational computing research, 42(1), 79-101.

Muhaimin, M., Habibi, A., Mukminin, A., Saudagar, F., Pratama, R., Wahyuni, S., … & Indrayana, B. (2019). A sequential explanatory investigation of TPACK: Indonesian science teachers’ survey and perspective. Journal of Technology and Science Education, 9(3), 269-281.

Mun, Y. Y., & Hwang, Y. (2003). Predicting the use of web-based information systems: self-efficacy, enjoyment, learning goal orientation, and the technology acceptance model. International journal of human-computer studies, 59(4), 431-449.

Nagy, J. T. (2018). Evaluation of online video usage and learning satisfaction: An extension of the technology acceptance model. International Review of Research in Open and Distributed Learning, 19(1).

Nam, C. S., Bahn, S., & Lee, R. (2013). Acceptance of assistive technology by special education teachers: A structural equation model approach. International Journal of Human-Computer Interaction, 29(5), 365-377.

Nazaretsky, T., Cukurova, M., & Alexandron, G. (2021). An Instrument for Measuring Teachers’ Trust in AI-Based Educational Technology. LAK22: 12th International Learning Analytics and Knowledge Conference.

Nistor, N., Göğüş, A., & Lerche, T. (2013). Educational technology acceptance across national and professional cultures: a European study. Educational Technology Research and Development, 61(4), 733-749.

Nja, C. O., Idiege, K. J., Uwe, U. E., Meremikwu, A. N., Ekon, E. E., Erim, C. M., … & Cornelius-Ukpepi, B. U. (2023). Adoption of artificial intelligence in science teaching: From the vantage point of the African science teachers. Smart Learning Environments, 10(1), 42.

O’Bannon, B. W., & Thomas, K. (2014). Teacher perceptions of using mobile phones in the classroom: Age matters! Computers & Education, 74, 15-25.

Padilla-Meléndez, A., del Aguila-Obra, A. R., & Garrido-Moreno, A. (2013). Perceived playfulness, gender differences and technology acceptance model in a blended learning scenario. Computers & Education, 63, 306-317.

Park, S. Y., Nam, M. W., & Cha, S. B. (2012). University students’ behavioral intention to use mobile learning: Evaluating the technology acceptance model. British journal of educational technology, 43(4), 592-605.

Piaget, J. (1936). Origins of intelligence in the child. Routledge & Kegan Paul.

Qasem, A. A. A., & Viswanathappa, G. (2020). The integration of multiliteracies in digital learning environments: A study of teacher readiness. International Journal of Technology in Education and Science, 4(4), 265-279.

Rico-Bautista, D., Medina-Cardenas, Y., Coronel-Rojas, L. A., Cuesta-Quintero, F., Maestre-Gongora, G., & Guerrero, C. D. (2021). Smart university: key factors for an artificial intelligence adoption model. In Advances and Applications in Computer Science, Electronics and Industrial Engineering: Proceedings of CSEI 2020 (pp. 153-166). Singapore: Springer Singapore.

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: Free Press.

Salloum, S. A., Alhamad, A. Q. M., Al-Emran, M., Monem, A. A., & Shaalan, K. (2019). Exploring students’ acceptance of e-learning through the development of a comprehensive technology acceptance model. IEEE access, 7, 128445-128462.

Sánchez-Prieto, J. C., Olmos-Migueláñez, S., & García-Peñalvo, F. J. (2017). MLearning and pre-service teachers: An assessment of the behavioral intention using an expanded TAM model. Computers in Human Behavior, 72, 644-654.

Sánchez-Prieto, J.C., Cruz-Benito, J., Therón, R., & García-Peñalvo, F.J. (2019). How to Measure Teachers’ Acceptance of AI-driven Assessment in eLearning: A TAM-based Proposal. Proceedings of the Seventh International Conference on Technological Ecosystems for Enhancing Multiculturality.


]]>
A Look Back At 3 Years of Michigan Virtual Research https://michiganvirtual.org/research/publications/a-look-back-at-3-years-of-michigan-virtual-research/ Thu, 17 Oct 2024 20:08:46 +0000 https://michiganvirtual.site.strattic.io/?post_type=publication&p=89557

This publication synthesizes three years of original research from the Michigan Virtual Learning Research Institute to further our collective understanding of topics such as effective practices, mentors, professional learning, and AI.

]]>

Introduction

Since the last Research in Review series, written in 2020, Michigan Virtual has published over 30 research reports and over 30 research blogs. These publications cover topics ranging from effective practices to emerging Artificial Intelligence (AI) guidance needs. Over the past three years, they have offered valuable insight into the online teaching and learning landscape. While each publication holds unique value, there is room to move our collective understanding forward by examining how the research fits together over time and the narrative that emerges. As such, MVLRI sought to review, synthesize, and offer practical takeaways from the original research published in the past three years.

Methods

Resources in the “All Publications” and “The Digital Backpack” sections of the Michigan Virtual website were considered for inclusion. All published reports and blogs determined to include original, generalizable research were included in the synthesis. Resources were then thematically grouped. Resources within each category were reviewed again for accuracy in interpretation and to determine their relationship to other resources in the same category. Out of this process, the core findings and practical implications were identified. What is presented below is the synthesized understanding from the original research included. Resources were reviewed to form a broad understanding of the topic and determine what MVLRI has contributed and learned holistically about each theme since 2021; however, not every finding from every resource is included. All resources, including those not used in this synthesis, are available on the Michigan Virtual website.

What We Know About Professional Learning

Previous research on professional learning (PL) demonstrates that PL is a crucial aspect of educators’ development and careers as it can positively impact instructional quality (Bowman et al., 2022; Gesel et al., 2021), student outcomes (Capraro et al., 2016; Gore et al., 2021; Roth et al., 2019), and connectedness to colleagues and the field (Burrows et al., 2021). Thus, Michigan Virtual sought to understand and address educators’ motivations, needs, and preferences relating to PL, as these aspects can positively impact educators and their students. 

Meeting PL requirements was consistently identified across two Michigan Virtual reports as a strong motivator for teachers' enrollment (Cuccolo & Green, 2024; Cuccolo & DeBruler, 2023). Similarly, obtaining SCECHs (State Continuing Education Clock Hours, required for renewing certificates and licenses; Michigan Department of Education, 2020) for free or at low cost drove enrollment (Cuccolo & Green, 2024). However, non-SCECH courses had higher completion and lower drop rates than SCECH courses (Cuccolo & DeBruler, 2023). Educators may enroll in SCECH courses to meet specific needs quickly but disengage once those needs are met. Alternatively, educators may initially select many courses and then prune their selection based on interest, time constraints, or other factors.

Educators’ main goal when enrolling in a professional learning course is to improve their teaching effectiveness, and they prefer certain course design elements such as video/audio, readings, and scenarios (Cuccolo & Green, 2024). PL courses should emphasize engaging design elements that educators prefer while providing practical examples and real-world applications, making the content engaging and relevant to educators’ needs. Practice-focused course design elements and assignments are vital, enhancing skill implementation and confidence, which may positively impact student outcomes down the road. Indeed, courses should incorporate practice opportunities to help educators increase their confidence and the likelihood of skill implementation, as about half of educators plan to apply course content directly to their classrooms (Cuccolo & Green, 2024). 

Practice opportunities seemed especially relevant for social-emotional learning courses, as educators want to apply what they’ve learned in their classrooms. These findings suggest that SEL courses would be most beneficial when they are tailored to the specific needs of educators within the context of their schools and communities. Courses should also incorporate elements that foster communication and collaboration among educators, replicating essential SEL skills to help educators apply course content (Timke & DeBruler, 2022). In terms of format, online, asynchronous courses align well with educators’ busy schedules, and providing affordable courses helps make professional learning accessible (Cuccolo & Green, 2024).

Professional Learning Key Takeaways

  • Providing free or inexpensive SCECHs can keep professional learning accessible and help motivate enrollment in PL courses while offering online and asynchronous options that accommodate educators’ busy schedules.
  • Designing courses that align with educators’ preferences can help address engagement gaps. By and large, educators reported preferring video/audio materials, readings, real-world scenarios, and access to course resources. 
  • Educators appreciate when their PL provides opportunities to practice and receive feedback on the skills and concepts they are learning.
  • To whatever degree possible, PL courses should be tailored to the specific needs of educators and their school communities and incorporate opportunities for communication, collaboration, and application. 
  • Educators reported that they primarily enrolled in PL to satisfy professional learning requirements. However, they appreciated the flexibility in the specific course topic or content, highlighting the need for some level of choice in learning.

The Critical Role of Mentor Support

Michigan K-12 students taking online courses must be provided with a mentor (Michigan Department of Education, 2022). Previous research has suggested that mentors support students by nurturing them, monitoring their learning, and facilitating communication, all of which are important for student success in online courses (Borup, 2019). Importantly, having a mentor can improve online course pass rates (Roblyer et al., 2008; Lynch, 2019). 

Consistent with Borup's (2019) conceptualization of mentor responsibilities, Cuccolo and DeBruler (2023) found that mentors considered building relationships with students, monitoring student progress in their online course(s), and motivating students to fully engage with course content to be the three most crucial strategies for supporting students. Mentors should intentionally incorporate these practices into their routines to help support the success of the students they work with.

Mentors' student loads vary by years of experience and the economic category of the school where the mentor works. For example, first-year mentors had approximately 20 students assigned to them on average, similar to mentors in their fourth and fifth years. In contrast, mentors in their second or third year, and those with five or more years of experience, had over 30 students assigned to them. Similarly, mentors in schools where more than 75% of students qualified for free or reduced-price lunch had more than double the number of students assigned to them as mentors in schools where less than 25% of students qualified (Cuccolo & DeBruler, 2023). Because the number of students a mentor is assigned could impact their availability and ability to work closely with students, administrators should balance student loads among mentors and be mindful of their other responsibilities.

Mentoring Key Takeaways

  • It is crucial that administrators mindfully allocate students to mentors. They should consider the mentors’ experience, current student load, and existing responsibilities. 
  • Providing ongoing professional learning opportunities and peer support is critical, especially for mentors working in buildings with few other mentors and those with less experience. Professional learning and peer support promote understanding of the mentor role and enable them to support students effectively.
  • The gradebook, a popular tool within the SLP (Student Learning Portal), allows mentors to build relationships, monitor student progress, and motivate students to engage with course content. Reviewing the gradebook tool individually with students enables mentors to monitor student progress, encourage self-regulated learning and the development of metacognitive skills (e.g., reviewing instructor feedback), check in about any difficulties, and celebrate wins. Reviewing the gradebook with the student may also create opportunities for building rapport.

Effective Practices in Online Teaching and Learning

The COVID-19 pandemic highlighted the necessity for research-backed online and virtual learning strategies to help teachers engage students and reach disengaged ones (Harrington & DeBruler, 2021; DeBruler & Harrington, 2024). Further, pass rates are typically lower for online courses than in-person ones, indicating a need to identify practical and effective online teaching strategies (Freidhoff et al., 2024). 

Two Michigan Virtual reports published since 2020 have highlighted the considerable overlap between practices and strategies used to engage all students and to address disengaged students (Harrington & DeBruler, 2021; DeBruler & Harrington, 2024). The strategies used by educators are often multi-purpose but center around maintaining/strengthening communication, building relationships, and tailoring/personalizing approaches (Harrington & DeBruler, 2021; DeBruler & Harrington, 2024). Teachers report a strong alignment between their relationship-building strategies and those they perceive as “very effective.” These strategies include using a welcoming tone, responding promptly, providing personalized feedback, showing empathy, and clearly communicating course expectations (Cuccolo & Green, 2024). Communication, primarily through feedback, can increase student awareness of their course progress, build/maintain relationships, and motivate or encourage students to engage more fully in course content. Teachers note the role of feedback in building relationships, highlighting the importance of personalizing their feedback to students. Using students’ preferred names or including something specific learned about a student, such as a hobby, can go a long way (Cuccolo & Green, 2024). 

Teachers increased the effectiveness of their communication by quickly and consistently replying to students’ messages, using multiple communication channels, and being flexible with students’ preferred forms of communication. While teachers use various methods to communicate with students, BrightSpace (LMS), the SLP, and email are reported to be the most common (Cuccolo & Green, 2024). Teachers also recognize the importance of responding promptly and estimate that 97% of student-initiated communications typically receive a reply within 24 hours (Cuccolo & Green, 2024). In addition to these strategies, for disengaged students in particular, teachers reported drawing on the support of adults close to the student (e.g., mentors and guardians; Harrington & DeBruler, 2021; DeBruler & Harrington, 2024).

Students' enrollment timing, initial course access, and timing of assignment submissions might serve as early indicators to online teachers about which students may benefit from intervention (Zweig, 2023). Accessing a course and submitting an assignment within the first week was significantly associated with higher final course grades. Conversely, students who did not access their course or submit an assignment within the first week had significantly lower final grades than their peers. Additionally, students in schools where a high percentage of students (>50%) received free or reduced-price lunch tended to have delayed course access and assignment submissions, which was associated with lower grades. Teachers should closely monitor students' course access and assignment submissions within the first week of a course.
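If a program wanted to operationalize these early indicators, the check reduces to a few lines of code. The sketch below is a minimal illustration under assumed field names (term_start, first_access, first_submission) and a seven-day window; it is not Michigan Virtual's actual tooling.

```python
# Minimal sketch of a first-week early-warning check (hypothetical; not
# Michigan Virtual's actual system). A student is flagged for outreach if
# they neither accessed the course nor submitted an assignment within the
# first week of the term.
from datetime import date, timedelta
from typing import Optional

def needs_outreach(term_start: date,
                   first_access: Optional[date],
                   first_submission: Optional[date]) -> bool:
    week_one_end = term_start + timedelta(days=7)
    accessed = first_access is not None and first_access <= week_one_end
    submitted = first_submission is not None and first_submission <= week_one_end
    return not (accessed or submitted)

# A student who has done nothing by day 7 triggers outreach to mentors/guardians.
print(needs_outreach(date(2024, 9, 3), None, None))               # True
print(needs_outreach(date(2024, 9, 3), date(2024, 9, 5), None))   # False
```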

In addition to monitoring when students submit assignments, teachers should consider monitoring which assignments students submit. Students commonly submit assignments out of alignment with course pacing guides, perhaps based on assignment requirements or characteristics (sometimes called "cherry-picking"), but this is not necessarily advantageous for their grades. Cuccolo and DeBruler (2024) found that students who completely adhered to the pacing guide had final grades 9.5 points higher than students who deviated from the pacing guide at least once. When looking more closely at the relationship between assignment submissions and grades, researchers noted that the extent to which students submit assignments out of order is most impactful. In other words, moving around slightly within a unit will likely have minimal impact, while moving between units, and doing so frequently, will likely negatively impact final grades. While it is possible that submitting assignments out of order is part of a broader pattern of student characteristics or behaviors that influence academic achievement, it is recommended that instructors and mentors continue to monitor student progress relative to the pacing guide and encourage adherence. Course pacing guides help ensure students receive properly scaffolded content and assignments as well as appropriately timed feedback that can contribute to their academic growth.
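One plausible way to quantify "the extent to which students submit assignments out of order" is to count inversions: pairs of submissions that reverse the pacing guide's intended order. The report does not specify its metric, so the sketch below is purely illustrative, with hypothetical function and variable names.

```python
# Illustrative (hypothetical) metric for pacing-guide deviation: count
# inversions, i.e., pairs of assignments submitted in the reverse of the
# pacing guide's intended order. One plausible operationalization, not
# necessarily the measure used in the study.
def pacing_inversions(pacing_guide: list[str], submissions: list[str]) -> int:
    rank = {assignment: i for i, assignment in enumerate(pacing_guide)}
    order = [rank[a] for a in submissions if a in rank]
    # Each pair submitted in reverse of the guide's order counts once.
    return sum(
        1
        for i in range(len(order))
        for j in range(i + 1, len(order))
        if order[i] > order[j]
    )

guide = ["1.1", "1.2", "1.3", "2.1", "2.2"]
student = ["1.1", "2.1", "1.2", "1.3", "2.2"]  # jumped ahead to Unit 2 once
print(pacing_inversions(guide, student))  # 2 -> two out-of-order pairs
```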

Finally, certain course design choices encourage students to engage with course content, which may be helpful for their course performance. Zweig (2022) found that students engaged with approximately 40% of the interactive course elements available to them in their online science courses. Flashcards and quizzes were particularly popular with students. This is a promising finding, as it can help guide course design choices and boost student performance. Zweig (2022) explains that students who engaged with interactive course elements had higher unit grades. These students also earned a higher percentage of points compared to the total they attempted, meaning they were more successful in their assignment attempts. Taken together, carefully selecting and strategically placing interactive course elements coupled with monitoring when and which assignments are submitted may be an effective practice for engaging students and ensuring their success in online courses.

Effective Practices Key Takeaways

  • Teachers can strengthen communication and teacher-student relationships by responding quickly and consistently to student messages. They should consider using various methods (e.g., email, video conferencing) to cater to students’ preferences. It is also important to regularly provide progress reports/updates to students, mentors, and guardians. 
  • Teachers should
    • Use tailored, specific, and personalized feedback as a relationship-building tool to motivate students, help them identify areas for improvement, and encourage them to engage more deeply with course content.
    • Consider tailored and personalized approaches by offering multiple forms of content delivery (e.g., text, video), and give students some choice in how they engage with course material and assignments.
  • Students must get access to and start working on their online courses promptly. Online programs should monitor and support students by leveraging indicators such as enrollment timing, initial course access, and assignment submissions. Lack of course access or assignment submission within the first week should prompt outreach to students, mentors, and/or guardians. Similarly, mentors or guardians should be engaged to support students if they become disengaged. 
  • Teachers and mentors should encourage students to follow the course pacing guide via announcements, reminders, or personal communication. Advise students against submitting assignments out of order, which will likely negatively impact their final grades. Adhering to course pacing guides ensures students receive properly scaffolded content and feedback as they progress through a course.
  • Online course designers and teachers can consider incorporating interactive elements like flashcards and quizzes to boost student engagement. These elements should be placed carefully within the course, perhaps after crucial concepts, to help enhance learning and performance.

The Impact of AI on Education

Michigan Virtual has begun a series of research studies on educator and student perceptions and use of AI. The hope is that a better understanding of educators' and students' beliefs and usage patterns will allow education, training, and guidelines to be developed that meet their needs.

Artificial Intelligence (AI) is rapidly emerging as an innovative and disruptive technology within education. AI can be leveraged within educational settings to benefit students and teachers by streamlining repetitive and administrative tasks, personalizing learning experiences, providing additional tutoring, proofreading, differentiating assignments and materials, and more (Michigan Virtual, 2024; McGehee, 2023). At the same time, AI use raises ethical, privacy, and data concerns, alongside risks of perpetuating biases, misinformation, and cheating (McGehee, 2023; McGehee, 2024).

Currently, perceptions and experience using AI may vary based on one's job role/function, with building and district administrators having higher levels of trust in and experience with AI than teachers (Michigan Virtual, 2024; McGehee, 2023). Further, K-12 teachers use AI significantly less than non-K-12 educators and have the most negative perceptions of AI (McGehee, 2023). Opinions are nuanced among those who have used AI, with many users acknowledging both benefits and drawbacks (McGehee, 2023). When discussing the positive aspects of AI, many note the potential for advancing personalized learning. When discussing the potential drawbacks, many note the potential for academic dishonesty and inequitable access (i.e., gaps in knowledge, use, and access between students from low and high SES groups; McGehee, 2023).

To this end, AI use and perceptions may relate to student achievement (McGehee, 2024). By understanding these relationships, teachers and building and district administrators can leverage them to further desired student learning outcomes. Students who reported using AI and those who did not had almost identical grades, although only a small portion of students sampled reported using AI (8%, or 166 students; McGehee, 2024). Students who used AI typically did so to "explain complicated concepts or principles in simpler terms" and "conduct research or find information." Looking more closely at the data revealed that how students used AI may be essential. Students who used AI both as a tool (applying it to particular tasks to get a specific result, e.g., a calculation) and as a facilitator (using it in ways that still left the main task of learning to the student) had higher grades than non-AI users and those who used AI only as a tool. Teaching students to leverage AI effectively, particularly to promote critical thinking and creative problem-solving, may benefit students (McGehee, 2024).

The need to teach students to leverage AI effectively, coupled with the finding that only 30% of district administrators reported that their school, school board, or governing body had officially adopted AI policies or guidelines (Michigan Virtual, 2024), suggests a path forward. Eighty percent of educators feel that AI will play a "very significant" or "somewhat significant" role in education in the next five years, and building and district administrators see AI integration as a priority. Guidelines and policies should therefore incorporate research highlighting how AI use can benefit students and teachers, leverage potential benefits, address concerns and drawbacks, and incorporate diverse perspectives, given that a small but not insignificant portion of educators have no interest in and low trust of AI (Michigan Virtual, 2024).

AI Key Takeaways 

  • Many educators are already using AI both personally and professionally, and students are using AI academically. Districts must address this reality, create policies and guidelines to govern this use, and facilitate best practices. 
  • Not all educators and students hold favorable perceptions of AI. Building and district administrators should incorporate diverse perspectives on AI integration and mindfully address stakeholder (e.g., teacher, guardian, student) concerns and reluctance. 
  • There is a clear need for policies, guidance, and guidelines on AI use, as many administrators, teachers, and (perhaps to a lesser extent) students are already using these tools. Targeted professional development should be provided, particularly regarding data privacy, ethical use of AI, use of AI as a tool and facilitator of learning (to promote critical thinking and creative problem solving), and subject-specific approaches to AI integration.

References

Cuccolo, K., & DeBruler, K. (2023). Examining mentors’ navigation of online environments and use of student support practices. Michigan Virtual. https://michiganvirtual.org/research/publications/examining-mentors-navigation-of-online-environments

Cuccolo, K., & DeBruler, K. (2023). Evaluating professional learning course offerings and educator engagement. Michigan Virtual. https://michiganvirtual.org/research/publications/evaluating-professional-learning-course-offerings-and-educator-engagement/

Cuccolo, K., & DeBruler, K. (2024). Out of order, out of reach: Navigating assignment sequences for STEM success. Michigan Virtual. https://michiganvirtual.org/research/publications/out-of-order-out-of-reach-navigating-assignment-sequences-for-stem-success/

Cuccolo, K., & Green, C. (2024). Maximizing professional learning through educators’ perceptions of utility and self-efficacy in pedagogy-focused courses. Michigan Virtual. https://michiganvirtual.org/research/publications/maximizing-professional-learning/

Cuccolo, K., & Green, C. (2024). Starting strong: Understanding teacher-student communication in online courses. Michigan Virtual. https://michiganvirtual.org/research/publications/understanding-teacher-student-communication/

DeBruler, K., & Harrington, C. (2024). Key strategies for supporting disengaged and struggling students in virtual learning environments. Michigan Virtual. https://michiganvirtual.org/research/publications/key-strategies-for-supporting-disengaged-and-struggling-students-in-virtual-learning-environments/

Freidhoff, J. R., DeBruler, K., Cuccolo, K., & Green, C. (2024). Michigan's K-12 virtual learning effectiveness report 2022-23. Michigan Virtual. https://michiganvirtual.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2022-23/

Harrington, C., & DeBruler, K. (2021). Key strategies for engaging students in virtual learning environments. Michigan Virtual. https://michiganvirtual.org/research/publications/key-strategies-for-engaging-students-in-virtual-learning-environments/

McGehee, N. (2023). Balancing the risks and rewards of AI integration for Michigan teachers. Michigan Virtual. https://michiganvirtual.org/research/publications/balancing-the-risks-and-rewards-of-ai-integration-for-michigan-teachers/

McGehee, N. (2024). AI in education: Student usage in online learning. Michigan Virtual. https://michiganvirtual.org/research/publications/ai-in-education-student-usage-in-online-learning/

Michigan Virtual. (2024). AI in education: Exploring trust, challenges, and the push for implementation. https://michiganvirtual.org/research/publications/ai-in-education-exploring-trust-challenges-and-the-push-for-implementation/

Timke, E., & DeBruler, K. (2022). Educators' perceptions of online SEL professional learning courses. Michigan Virtual. https://michiganvirtual.org/research/publications/sel-pd-effectiveness-perceptions/

Zweig, J. (2022). Student engagement with interactive course elements in supplementary online science courses. Michigan Virtual. https://michiganvirtual.org/research/publications/student-engagement-with-interactive-course-elements-in-supplementary-online-science-courses/

Zweig, J. (2023). The first week in an online course: Differences across schools. Michigan Virtual. https://michiganvirtual.org/research/publications/first-weeks-in-an-online-course/

]]>
Starting Strong: Understanding Teacher-Student Communication in Online Courses https://michiganvirtual.org/research/publications/understanding-teacher-student-communication/ Fri, 27 Sep 2024 18:53:19 +0000 https://michiganvirtual.site.strattic.io/?post_type=publication&p=89351

This study focused specifically on understanding the relationship between teacher communication and relationship-building practices in the first four weeks of an online course. Pairing communication data pulled from the student learning portal, survey data, data from focus group interviews, and a review of best practices and Michigan Virtual instructional policies gave researchers a holistic view of teacher-student communication patterns and recommendations for relationship-building in an online environment.

]]>

Abstract

Positive student-teacher interactions can serve as the basis for strong relationships, which may benefit academic outcomes for students. These relationships may be particularly pivotal early on in students’ experience in a course. Understanding teachers’ beliefs and behaviors surrounding communication and relationship-building can help ensure alignment with best practices, benefiting teachers and students. Through surveys, focus groups, and a review of best practices and Michigan Virtual instructional policies, the current study found alignment between teachers’ perceptions, behaviors, and best practices. Teachers primarily communicated with students through the Student Learning Portal, Brightspace, and email to provide reminders, respond to student-initiated communication, and provide feedback. The top five strategies they used—welcoming tone, clear expectations, personalized feedback, prompt responses, and empathy—align with those they found most effective, though in a different order. Both instructional policies and teacher pedagogy should continue to emphasize best practices for communication and relationship-building.

Introduction

Student success in online courses depends on many intertwining factors relating to the student, course design, instructional pedagogy, and more (Curtis & Werth, 2015; Liu & Cavanaugh, 2011; Hosler & Arend, 2012). Cultivating student success in online courses requires a multi-pronged approach (Michigan Virtual, n.d.; Roblyer et al., 2008). Communication and relationship-building are two interconnected pedagogical elements that can play a key role in promoting student success.

Positive student-teacher interactions marked by characteristics such as trust, belonging, respect, support, and connection serve as the basis for building strong relationships (Duong et al., 2019; Kincade et al., 2020), and these relationships are associated with student engagement (Brewster & Bowen, 2004; Duong et al., 2019) and achievement (Cornelius-White, 2007; Curtis & Werth, 2015; Hamre & Pianta, 2001; Hwang et al., 2021; Li et al., 2022; Mensah & Koomson, 2020; Roorda et al., 2011). Instructors of online courses can foster rich student-teacher interactions through communication, feedback, encouragement, and promoting discussion even when they do not interact with students face-to-face (Boston et al., 2019).

Research suggests that high-quality student-teacher interactions are associated with academic achievement as reflected by state test scores (Allen et al., 2013), end-of-course grades (Hawkins et al., 2013), and perceived learning (Caspi & Blau, 2008; Joosten & Cusatis, 2019; Kumi-Yeboah et al., 2017; Richardson, 2001). Further, when students are asked to reflect on their learning experiences, student-teacher interactions and communication are often central themes (e.g., Borup et al., 2019). For example, students from traditionally marginalized groups expressed that student-teacher interactions and open communication prompted learning and bolstered their academic self-concept. Having an accessible instructor who provided feedback and support gave students more opportunities to ask questions and created discussions that promoted understanding (Kumi-Yeboah et al., 2017). Overall, students who receive more interaction with their online course instructors report being more satisfied with their experience (Turley & Graham, 2019). Finding strategies for increasing student engagement and academic success is vital, as research indicates that pass rates are typically lower for online courses than in-person ones (Freidhoff et al., 2024). The importance of teacher-student interaction, in conjunction with research highlighting the importance of student engagement at the beginning of a course (Zweig, 2023), points to a need to better understand how teachers interact with students in the initial weeks of a course and how those interactions are associated with student achievement.

The Current Study

Given the positive associations between communication, relationship-building, and student outcomes, this study aims to understand the nature of teacher-student interactions in Michigan Virtual courses. Previous research has shown engagement during the first weeks of a course to be predictive of student outcomes like final grades (Zweig, 2023). As such, examining communication and relationship-building early on (within the first four weeks) may help identify important links to student achievement and opportunities for intervention. An analysis of current MV teacher practices and training materials was conducted alongside a review of best practices to facilitate alignment between theory/research and practice.

The goals of this study produced the following research questions: 

  1. How often, by what means, and for what reasons do teachers communicate with students within the first four weeks of a course?
    • How often do students initiate communication? What percentage of student-initiated communications receive a teacher reply within 24 hours? 
    • What is the frequency of one-to-one communication? What is the frequency of one-to-many communication?
    • Is the frequency of teacher communication associated with students’ final course grades?
  2. What are teachers’ beliefs about relationship-building in a virtual learning environment? How are these beliefs reflected in the way they approach instruction in the first four weeks of a course? 
  3. What are considered best practices for online teacher-student communication? How does this align with Michigan Virtual teacher training, behaviors, and recommendations?

Methods

Researchers conducted a mixed methods study to address the research questions outlined above.

Qualitative Methods

Researchers conducted five focus groups with full-time Michigan Virtual teachers to understand the communication and relationship-building practices teachers use in the first four weeks of an online course. Teachers were grouped by content area, and researchers met with each group of 4-7 teachers for approximately 15 minutes. Each group was asked 2-3 different open-ended questions about their experiences communicating and building relationships with students in their online courses, specifically during the first four weeks.

Quantitative Methods

A survey of 19 questions assessing teachers' frequency and perceptions of communication and relationship-building practices in the first four weeks of a course was sent to all full- and part-time Michigan Virtual instructors. A total of 97 instructors responded (74 part-time and 23 full-time), with approximately 40% of the sample having between 6 and 10 years of online teaching experience. To obtain more detailed information about communication in Michigan Virtual online courses, a member of MV's technology integration team pulled data such as teachers' incoming/outgoing messages and students' course enrollment details, including final course grades, from the Student Learning Portal (SLP). Analyses were restricted to the three most highly enrolled courses (according to enrollment data from Michigan Virtual's 2022-23 annual report) in each of the core subject areas of English Language and Literature, Life and Physical Sciences, Mathematics, and Social Sciences and History (review Appendix A for a list of these 12 courses); a sketch of this restriction step appears below. This allowed researchers to analyze the relationship between teacher communication frequency and final grades. Taken together, the SLP data, survey data, and focus group data provided researchers with a holistic and rich look at relationship-building and communication practices in online courses.
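The course-restriction step might look something like the following sketch. The column names and sample rows are assumptions for illustration (a few of the enrollment counts echo Appendix A); this is not the actual SLP export schema.

```python
# Hypothetical sketch of restricting analyses to the three most highly
# enrolled courses per subject area. Column names and rows are
# illustrative, not the actual SLP export.
import pandas as pd

enrollments = pd.DataFrame({
    "subject": ["Mathematics"] * 4 + ["Life and Physical Sciences"] * 2,
    "course": ["Mathematics of Personal Finance", "Mathematics in the Workplace",
               "Geometry A", "Algebra A", "Medical Terminology", "Chemistry A"],
    "n_enrolled": [229, 145, 73, 40, 384, 48],
})

top3_per_subject = (
    enrollments.sort_values("n_enrolled", ascending=False)
               .groupby("subject")
               .head(3)   # keep the three largest courses in each subject
)
print(top3_per_subject)
```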

Results

Results from this mixed-methods study are organized and presented by research question. 

How often, by what means, and for what reasons do teachers communicate with students within the first four weeks of a course? 

Teachers’ communication tool use

According to survey data, within the first four weeks of a course, the top three communication tools used daily by teachers were BrightSpace (LMS), the SLP (Student Learning Portal), and emailing individual students. At the top of the list of communication tools teachers reported not using were ‘other’ tools, text messages, and phone calls. Table 1 depicts how often teachers use specific tools to communicate with their students in the first four weeks of a course. 

Table 1. Percentage of Communication Tool Use Among Educators

Communication Tool | Daily | 4-6 times per week | 2-3 times per week | Once a week | Less than weekly | Did not use
BrightSpace | 31.96% | 82.50% | 19.59% | 30.93% | 51.50% | 41.20%
SLP | 25.77% | 15.46% | 21.65% | 25.77% | 11.34% | 0.00%
Email individual students | 20.62% | 14.43% | 21.65% | 17.53% | 20.62% | 51.50%
Other | 13.21% | 18.90% | 37.70% | 37.70% | 0.00% | 77.36%
Office hours | 41.20% | 10.30% | 51.50% | 45.36% | 21.65% | 22.68%
Email students one-to-many | 30.90% | 51.50% | 10.31% | 40.21% | 32.99% | 82.50%
Teacher feed | 30.90% | 10.30% | 20.62% | 74.23% | 10.30% | 0.00%
Video conferencing (e.g., Zoom) | 10.30% | 0.00% | 92.80% | 17.53% | 36.08% | 36.08%
Text messages | 10.30% | 41.20% | 10.31% | 82.50% | 29.90% | 46.39%
Phone call | 10.30% | 30.90% | 10.31% | 82.50% | 36.08% | 41.24%

During focus group conversations, teachers described how they use some of these tools to encourage reciprocated communication with their students, specifically during the first few weeks of a course. For example, teachers often make introductory videos in their Teacher Feed to “let students know I am a real teacher” and record personalized video responses to students’ introductory discussion board posts to help build rapport. One teacher shared how they send students messages in the SLP “introducing myself, making sure students know various ways they can contact me, and providing a few tips for success in the course.” Several teachers noted using a different communication tool—surveys—to get to know their students better by asking things like their preferred name, pronouns, why they are taking the course, their goal in the course, a bucket list item, and “anything else they think I should know about them, which opens up opportunities for students to share important information.” Teachers use these communication tools to humanize and personalize their interactions with students. 

Frequency of teacher-student communication

Teachers (n = 45) sent approximately 124 messages on average (M = 124.84, SD = 156.76) in the first four weeks of a course. However, this number varied substantially, ranging from three to 721. Looking at the median number of messages, half of the teachers in the sample sent fewer than 72 messages within the first four weeks, and half sent more. Looking more closely at communication patterns indicates that teachers sent an average of about three messages per student (M = 2.83, SD = 1.36). The total number of messages sent by teachers varied by course enrollment size, with teachers sending more messages as enrollments increased. To review the number of messages teachers sent based on course and enrollment, refer to Figure 1 below.

Figure 1. Enrollments and Messages by Course

Looking specifically at messages teachers sent to individual students, an average of 119 such messages were sent within the first four weeks (M = 119.13, SD = 150.73), with half of the teachers sending more than 70 messages to individual students and half sending fewer. The number of messages sent to any one student ranged from one to 15, and teachers sent just under three personal/individual messages per student in the first four weeks of a course (M = 2.75, SD = 1.30). Half of the teachers sent just over two messages per student, and half sent fewer than that.
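To make the mean/median contrast in these figures concrete, the sketch below computes the same descriptive statistics on made-up per-teacher message counts; as in the reported data, a few very active teachers pull the mean well above the median.

```python
# Sketch of the descriptive statistics reported above, computed on made-up
# per-teacher message counts (not the study's data). A few heavy
# communicators skew the mean above the median.
import statistics

messages_per_teacher = [3, 20, 48, 72, 95, 130, 721]  # hypothetical totals

print(f"M = {statistics.mean(messages_per_teacher):.2f}")
print(f"Mdn = {statistics.median(messages_per_teacher)}")   # half above, half below
print(f"SD = {statistics.stdev(messages_per_teacher):.2f}")

# Per-student rate for one teacher: total messages / course enrollment.
print(f"per student = {124 / 44:.2f}")  # e.g., 124 messages, 44 students
```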

Reasons for teacher communication with students

When asked about the top three reasons teachers communicate with students during the first four weeks, providing reminders about assignments, policies, important dates, etc., was selected most often and followed closely by replying to student-initiated communication. The third most common reason teachers communicated with students was providing feedback about student learning. See Figure 2 for a full breakdown of reasons teachers communicated with students during the first four weeks of a course.

Figure 2. Reasons for Teacher Communication with Students

How often do students initiate communication? What percentage of student-initiated communications receive a teacher reply within 24 hours?

According to teachers, in the first four weeks of a course, the top three tools used daily by students were the SLP, email, and text messages. Conversely, teachers reported that attending office hours and using video conferencing were communication tools least commonly used by students. Review Table 2 below for information on how often students initiated communication with their teacher(s) via specific communication tools.

Table 2. Percentage of Communication Tool Use Among Students as Reported by Teachers

Communication Tool | Daily | 4-6 times per week | 2-3 times per week | Once a week | Less than weekly | Did not use
SLP | 23.96% | 14.58% | 25.00% | 17.71% | 15.63% | 31.30%
Email | 19.59% | 17.53% | 20.62% | 20.62% | 18.56% | 30.90%
Text message | 31.60% | 21.10% | 63.20% | 94.70% | 30.53% | 48.42%
BrightSpace | 30.90% | 72.20% | 12.37% | 11.34% | 29.90% | 36.08%
Office hours | 21.10% | 0.00% | 21.10% | 12.63% | 25.26% | 57.89%
Other | 19.60% | 0.00% | 0.00% | 19.60% | 19.60% | 94.12%
Teacher feed | 10.50% | 0.00% | 73.70% | 17.89% | 40.00% | 33.68%
Video conferencing (e.g., Zoom) | 0.00% | 21.30% | 53.20% | 13.83% | 24.47% | 54.26%
Phone call | 0.00% | 21.10% | 84.20% | 63.20% | 25.26% | 57.89%

According to teachers, the most commonly reported reason students reached out was to obtain clarification about course or assignment requirements. Questions about course content or grades were also commonly reported reasons students initiated communication. Contacting teachers for emotional or social support was not widely reported in the first four weeks. See Figure 3 for the reasons students contacted their instructors during the first four weeks. Michigan Virtual policy states that teachers should reply to any student communication within 24 hours. Most teachers indicated that this was a reasonable expectation (n = 92, 94.85%) and that they were able to meet it. Indeed, teachers estimated that, on average, 97% of student-initiated communications received a reply within 24 hours. Some teachers contextualized their responses, indicating this was a reasonable expectation barring unforeseen life circumstances or students reaching out on weekends.

Figure 3. Reasons for Student Outreach During First Four Weeks
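The 24-hour reply-rate estimate above can be computed directly from a message log. The sketch below assumes a simple log format with sent_at/replied_at timestamps; these field names are illustrative, not the SLP's actual schema.

```python
# Hypothetical computation of the share of student-initiated messages
# receiving a teacher reply within 24 hours. The log format is an
# assumption for illustration, not the SLP's actual schema.
from datetime import datetime, timedelta

message_log = [
    {"sent_at": datetime(2024, 9, 3, 10, 0), "replied_at": datetime(2024, 9, 3, 15, 30)},
    {"sent_at": datetime(2024, 9, 4, 9, 0),  "replied_at": datetime(2024, 9, 4, 20, 0)},
    {"sent_at": datetime(2024, 9, 6, 18, 0), "replied_at": datetime(2024, 9, 8, 9, 0)},  # weekend lag
]

on_time = sum(
    m["replied_at"] is not None
    and (m["replied_at"] - m["sent_at"]) <= timedelta(hours=24)
    for m in message_log
)
print(f"{on_time / len(message_log):.0%} replied within 24 hours")  # 67% here
```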

What is the frequency of one-to-one communication? What is the frequency of one-to-many communication?

Course-wide teacher-student communication was most likely to happen once a week (29.09%), whereas the frequency of communication with individual students was more distributed. Approximately 29% of teachers reported communicating with individual students daily, in contrast with 10.42% who did so less than weekly. This likely reflects the use of course-wide communication for course updates, announcements, reminders, and messages that apply to all students, whereas individual communication likely centers on providing feedback, replying to student messages, and personalized/individualized communication. Notably, all teachers reported using both types of communication, and no teachers reported a lack of communication during the first four weeks of the course. See Figure 4 for the frequency of teachers' communication with individual students and Figure 5 for the frequency of teachers' course-wide communication.

Figure 4. Frequency of Individual Teacher-Student Communication

Figure 5. Frequency of Course-Wide Teacher-Student Communication

Is the frequency of teacher communication associated with students’ final course grades?

While outliers (extreme cases) are typically removed from a dataset before analyses, all students in our sample had completed their courses, and their data represents real cases of students who have completed a semester of online learning. Because of the variation in student performance, analyses were run with and without outliers. While the descriptive information about grades changes based on the inclusion or exclusion of these outliers, the relationship between teacher-initiated messages and students’ final grades remains similar, so the data is presented with outliers included for concision. 

With outliers included, students averaged grades of 77.67% (SD = 22.91) in their courses; however, grades ranged from 0.11% to 99.76%. A positive but not statistically significant correlation existed between the number of messages teachers sent to students during the first four weeks of a course and the student’s final course grade (p > .05, tau = 0.005).
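For readers who want to reproduce this style of analysis, Kendall's tau can be computed as in the sketch below. The data are invented for illustration; only the method mirrors the analysis described.

```python
# Sketch of a Kendall's tau correlation between messages received and final
# grades, on invented data (the study's data are not public).
from scipy.stats import kendalltau

messages_received = [2, 3, 3, 4, 2, 5, 3, 4]        # per-student message counts
final_grades = [61.0, 78.5, 82.0, 74.0, 90.2, 68.3, 85.1, 77.7]

tau, p_value = kendalltau(messages_received, final_grades)
print(f"tau = {tau:.3f}, p = {p_value:.3f}")  # small tau, p > .05 -> no reliable association
```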

To further understand how communication relates to students' grades, the data were segmented into quartiles based on students' final grades. Students, regardless of grade, received a similar number of messages from their teachers. Students in the 25th percentile (those with the lowest grades) received the fewest messages (M = 3.15), while students in the 50th and 75th percentiles received 3.24 and 3.23 messages on average, respectively. The means being so closely clustered suggests very little variation in the number of messages teachers send to students, and this lack of variation may have obscured the relationship between communication and final grades.
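The quartile comparison can be reproduced with a grouped summary like the one sketched below, again on invented data; pandas' qcut handles the percentile binning.

```python
# Sketch of the quartile segmentation described above, on invented data.
# pd.qcut bins students into grade quartiles; we then compare mean messages
# received per bin.
import pandas as pd

df = pd.DataFrame({
    "final_grade": [42, 55, 63, 70, 74, 79, 83, 88, 92, 96, 98, 99],
    "messages":    [3,  3,  4,  3,  3,  4,  3,  3,  4,  3,  3,  3],
})

df["quartile"] = pd.qcut(df["final_grade"], q=4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("quartile", observed=True)["messages"].mean())
# Near-identical means across quartiles would mirror the pattern reported above.
```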

What are teachers’ beliefs about relationship-building in a virtual learning environment?

When asked about what practices were effective for building relationships in the first four weeks of a course, responding promptly (n = 89, 91.75%), using a welcoming tone (n = 85, 87.63%), and providing personalized feedback (n = 79, 81.44%) were the most commonly reported strategies that teachers believed were “very effective.” The practices teachers perceived as the least effective (the highest percentages of those strategies teachers indicated were “not effective”) were holding office hours (n = 33, 34.02%) and scheduling check-ins (n = 15, 15.46%). Review Figure 6 below for details on how effective teachers perceived all the strategies and practices included in the survey. 

Figure 6. Perceived Effectiveness of Specific Strategies & Practices

How are these beliefs reflected in the way they approach teaching in the first four weeks of a course? 

During focus group conversations, teachers acknowledged how important relationship-building is in an online course, despite it being more challenging than in a traditional face-to-face classroom. Because there is some "mystery about the person behind the screen," teachers felt that being positive and aware of their tone was very important. They explained that, depending on the perceived tone of your online communications, "you may come off differently than you would in person." Messages delivered online or through email can feel cold or formal, "missing that tone your voice would have delivered if you were face-to-face, so you have to work hard to convey your compassion." This is consistent with survey results indicating that using a welcoming tone is a practice teachers feel is very effective in relationship-building.

Teachers also indicated that providing personalized feedback—another strategy perceived as very effective when it comes to relationship-building—such as consistently using students’ preferred names in their feedback “seems to go a long way with students.” One teacher mentioned that they try to tie something personal about the student into their feedback so that students know the feedback is directed specifically toward them. Personalizing feedback in this way also helps to emphasize that the teacher listens and pays attention to the information students share. 

The survey also asked teachers how effective they perceived specific strategies and practices to be and if they used them during the first four weeks of their online course(s). As expected, there was an overlap between the strategies teachers perceived as very effective and those they reported using in the first four weeks. The top five strategies teachers reported using—using a welcoming tone (100%), communicating course expectations clearly (100%), providing personalized feedback (100%), responding promptly (100%), and showing empathy (97.94%)—are the same top five strategies (just in a different order) that teachers believe are most effective at building relationships with students during the first four weeks of a course.

What are considered best practices for online teacher-student communication? How does this align with Michigan Virtual teacher training, behaviors, and recommendations?

To better understand how Michigan Virtual teachers leverage communication and relationship-building best practices in their online courses, an interview was conducted with Dr. Shannon Smith, Michigan Virtual’s senior director of student learning, alongside a review of Michigan Virtual training materials and recommendations. Dr. Smith explained that Michigan Virtual draws on both the National Standards for Quality Online Teaching (NSQOT) and Danielson’s Framework for Teaching (FFT) and pointed to a document used to evaluate and guide Michigan Virtual online teachers that pulls from both models—A Crosswalk of the NSQ Teaching Standards and the Danielson Framework. These best practices, including those focused on relationship-building and communication, are incorporated into Michigan Virtual’s teacher training materials, behaviors, and recommendations (e.g., The Michigan Virtual Way: Expectations and Success Indicators for Michigan Virtual Instructors).

There was close alignment between best practices drawn from the NSQOT and FFT, recommendations made in the Michigan Virtual Way document, and teachers' behaviors and beliefs as assessed in the current study. First, most student-initiated communications received a reply within 24 hours; indeed, responding promptly was the strategy educators most commonly endorsed as "very effective." In their communications with students, teachers draw on NSQOT and FFT principles by ensuring their communication is focused on supporting students' academic engagement and success and by utilizing various communication methods (e.g., email, SLP, text). Feedback, which was emphasized across best-practice frameworks, was also perceived as crucial by teachers, as evidenced by 81.44% of teachers believing it to be a "very effective" practice. Feedback was the third most commonly reported reason teachers reached out to students and was highlighted in the focus groups as pivotal for student success. Table 3 below illustrates the alignment between what the National Standards for Quality Online Teaching, Danielson's Framework for Teaching, and Michigan Virtual consider to be best practices related to teacher-student communication, relationship-building, and personalized feedback.

Table 3. Alignment Between Danielson’s FFT, the NSQOT, and Michigan Virtual Best Practices for Communication and Relationship-Building

Danielson's Framework for Teaching (FFT):

3a: Communicating About Purpose and Content

Elements of Success:

  • Purpose for learning and criteria for success
  • Specific expectations
  • Explanations of content
  • Use of academic language

While any communication with or between students has a direct connection to many of the components of learning environments, communication related to the purposes of learning, the expectations for activities, and the content itself are essential aspects of instruction that support (or hinder) students' intellectual engagement and academic success.

National Standards for Quality Online Teaching (NSQOT):

B1: The online teacher uses digital pedagogical tools that support communication, productivity, collaboration, analysis, presentation, research, content delivery, and interaction.

D4: The online teacher establishes relationships through timely and encouraging communication using various formats.

Regardless of who the online teacher is communicating with, effective communication methods are necessary for successful two-way communication.

Michigan Virtual:

  • Return communications within 24 hours of receipt, Monday-Friday.
  • The instructor MUST provide a Welcome Letter to all students (including guardians and mentors) outlining clear expectations for class participation.
  • The instructor makes initial contact with students within the first five days of class.
  • The instructor is expected to reach out beyond email or messages to the mentor and/or guardian by phone if a student has not engaged consistently in the course after the first month of enrollment.

Danielson's Framework for Teaching (FFT):

3d: Using Assessment for Learning

Elements of Success:

  • Clear standards for success
  • Monitoring student understanding
  • Timely, constructive feedback

National Standards for Quality Online Teaching (NSQOT):

D5: The online teacher helps learners reach content mastery through instruction and quality feedback using various formats.

The online teacher provides actionable, specific, and timely feedback.

Michigan Virtual:

  • Score and provide feedback on student-submitted assignments within 72 hours (96 hours for ELA and AP courses) of submission, Monday-Friday.
  • Assignments should receive specific, detailed, and individualized feedback.
  • Feedback should include the use of scoring rubrics where available but must also include a comment on areas of strength and/or areas in need of improvement.
  • Individualized feedback is professional, positive, personal, and encouraging.

Conclusions

During the first four weeks of a course, teachers primarily use Brightspace, the Student Learning Portal (SLP), and individual emails to communicate with students. Many teachers noted that the SLP is particularly effective, as students must log in to the SLP before accessing Brightspace and their course(s), making messages there more visible.

On average, teachers (n = 45) sent about 119 individual messages—just under three per student—during this period. In a virtual setting, teachers emphasize the importance of crafting messages that convey a welcoming, compassionate tone to foster positive relationships. Teachers reported strategies such as using tone-checking tools (e.g., Grammarly) and incorporating personalized feedback to create a connection. Best practices for cultivating positive relationships and communicating effectively include:

  • Responding promptly to students.
  • Using a welcoming tone.
  • Consistently providing specific, constructive, and timely feedback to students. Incorporating students’ personal details (e.g., hobbies, first names) can help tailor/personalize feedback to the student.

Students most commonly use the SLP, email, and text messages during the first four weeks of a course.  Because students’ communication preferences vary, it is important for teachers to be open and responsive to what seems to work for each student. However, teachers may prioritize communication via the SLP and email, which both parties widely use.

Students typically reach out to clarify course requirements, ask about content, or inquire about grades. Teachers can prepare FAQ documents and resources to streamline responses to these common concerns. When students ask specific questions, teachers may gently encourage proper pacing, time management, and self-reflection, because helping students strengthen or establish metacognitive skills can help them succeed with online learning (Xu et al., 2023; Zion et al., 2015). Contacting teachers for emotional or social support was not widely reported in the first four weeks. This may point to the need for consistent and respectful communication over time to build strong teacher-student relationships (Duong et al., 2019; Kincade et al., 2020).

Although a positive relationship was observed between the number of messages sent and student grades, it was not statistically significant. This may be due to the study's focus on the first four weeks, when teachers have limited data to identify struggling students. Additionally, the uniformity in the number of messages sent by teachers may have obscured the relationship between grades and communication; statistical significance often depends on factors like sample size and variability in the data. Indeed, across all quartiles of grades, students received a similar number of messages from their teachers, and this uniformity may have made it difficult to detect an effect. Michigan Virtual teachers may already be engaging in best practices for communication with their students, leaving little variation in the number of messages sent across students with differing levels of course performance. In other words, students who would likely benefit from communication-focused interventions are probably already receiving that communication, leaving little room for improvement.

Despite the lack of statistical significance, the importance of communication for relationship-building remains clear. Teachers should continue using best practices, such as personalized, timely feedback and communication, as these strategies are supported by research and teacher experience. Michigan Virtual’s professional learning portal offers a series of courses specific to online teaching and learning, written by Michigan Virtual teachers, that focus on these best-practice strategies. The Level 1 series consists of eight courses designed for educators new to online teaching, including Communicating in Online Classrooms and Grading and Feedback. The Level 2 series consists of eight similar courses designed for educators with online teaching experience. These courses may provide both new and seasoned online teachers with strategies for communicating effectively and building relationships with their online students.

Effective teacher-student communication is crucial in online courses. Teachers should:

  • Leverage the SLP and email as primary communication tools.
  • Use tone-checking tools and tailor feedback with personal details.
  • Prepare FAQs and resources to address common student questions.
  • Foster metacognitive skills to aid student success in online learning.

Continued focus on timely and personalized communication should be part of teacher training and professional development during the first weeks and throughout the course. While the study did not find a significant link between communication and grades, the practical importance of communication for building relationships remains vital for student success. Future research should explore the role of student-initiated communication and the reciprocal nature of teacher-student communication, measured across the full length of a course, to better understand its impact on academic outcomes in online courses.

Appendix A: Courses Included in SLP Data

| NCES Subject Area | Course | N Enrollment |
| English Language and Literature | Mythology and Folklore: Legendary | 165 |
| English Language and Literature | American Literature A – English 11-12 | 59 |
| English Language and Literature | American Literature B – English 11-12 | 43 |
| Life and Physical Sciences | Medical Terminology | 384 |
| Life and Physical Sciences | Chemistry A | 48 |
| Life and Physical Sciences | Chemistry B | 31 |
| Mathematics | Mathematics of Personal Finance | 229 |
| Mathematics | Mathematics in the Workplace | 145 |
| Mathematics | Geometry A | 73 |
| Social Sciences and History | Criminology | 310 |
| Social Sciences and History | Economics | 203 |
| Social Sciences and History | Civics | 191 |

References

Borup, J., Chambers, C. B., & Stimson, R. (2019). K-12 student perceptions of online teacher and on-site facilitator support in supplemental online courses. Online Learning, 23(4), 253-280. 

Brake, A. (2020). Right from the start: Critical classroom practices for building teacher–student trust in the first 10 weeks of ninth grade. The Urban Review, 52, 277-298. https://doi.org/10.1007/s11256-019-00528-z

Brewster, A. B., & Bowen, G. L. (2004). Teacher support and the school engagement of Latino middle and high school students at risk of school failure. Child and Adolescent Social Work Journal, 21, 47-67.

Cornelius-White, J. (2007). Learner-centered teacher-student relationships are effective: A meta-analysis. Review of Educational Research, 77(1), 113-143.

Corry, M., Ianacone, R., & Stella, J. (2014). Understanding online teacher best practices: A thematic analysis to improve learning. E-Learning and Digital Media, 11(6), 593-607.

Cuccolo, K. & DeBruler, K. (2024). Out of Order, Out of Reach: Navigating Assignment Sequences for STEM Success. Michigan Virtual. https://michiganvirtual.org/research/publications/out-of-order-out-of-reach-navigating-assignment-sequences-for-stem-success/

Curtis, H., & Werth, L. (2015). Fostering student success and engagement in a K-12 online school. Journal of Online Learning Research, 1(2), 163-190.

DeBruler, K. & Harrington, C. (2024). Key Strategies for Supporting Disengaged and Struggling Students in Virtual Learning Environments. Michigan Virtual. https://michiganvirtual.org/research/publications/key-strategies-for-supporting-disengaged-and-struggling-students-in-virtual-learning-environments/

Duong, M. T., Pullmann, M. D., Buntain-Ricklefs, J., Lee, K., Benjamin, K. S., Nguyen, L., & Cook, C. R. (2019). Brief teacher training improves student behavior and student-teacher relationships in middle school. School Psychology, 34(2), 212.

Freidhoff, J. R., DeBruler, K., Cuccolo, K., & Green, C. (2024). Michigan’s K-12 virtual learning effectiveness report 2022-23. Michigan Virtual. https://michiganvirtual.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2022-23/

Hamre, B. K., & Pianta, R. C. (2001). Early teacher-child relationships and the trajectory of children’s school outcomes through eighth grade. Child Development, 72(2), 625-638.

Harrington, C. & DeBruler, K. (2021). Key strategies for engaging students in virtual learning environments. Michigan Virtual University. https://michiganvirtual.org/research/publications/key-strategies-for-engaging-students-in-virtual-learning-environments/

Hosler, K. A., & Arend, B. D. (2012). The importance of course design, feedback, and facilitation: Student perceptions of the relationship between teaching presence and cognitive presence. Educational Media International, 49(3), 217-229.

Hwang, N., Kisida, B., & Koedel, C. (2021). A familiar face: Student-teacher rematches and student achievement. Economics of Education Review, 85, 102194.

Heilporn, G., Lakhal, S., & Bélisle, M. (2021). An examination of teachers’ strategies to foster student engagement in blended learning in higher education. International Journal of Educational Technology in Higher Education, 18, 1-25.

Jou, Y. T., Mariñas, K. A., & Saflor, C. S. (2022). Assessing cognitive factors of modular distance learning of K-12 students amidst the COVID-19 pandemic towards academic achievements and satisfaction. Behavioral Sciences, 12(7), 200.

Joosten, T., & Cusatis, R. (2019). A cross-institutional study of instructional characteristics and student outcomes: Are quality indicators of online courses able to predict student success? Online Learning, 23(4), 354-378.

Kincade, L., Cook, C., & Goerdt, A. (2020). Meta-analysis and common practice elements of universal approaches to improving student-teacher relationships. Review of Educational Research, 90(5), 710-748.

Lavy, S., & Naama-Ghanayim, E. (2020). Why care about caring? Linking teachers’ caring and sense of meaning at work with students’ self-esteem, well-being, and school engagement. Teaching and Teacher Education, 91, 103046.

Leitão, N., & Waugh, R. F. (2007). Students’ views of teacher-student relationships in primary school. Annual International Educational Research.

Li, X., Bergin, C., & Olsen, A. A. (2022). Positive teacher-student relationships may lead to better teaching. Learning and Instruction, 80, 101581.

Liu, F., & Cavanaugh, C. (2011). High enrollment course success factors in virtual school: Factors influencing student academic achievement. International Journal on E-learning, 10(4), 393-418.

Martin, A. J., & Collie, R. J. (2019). Teacher-student relationships and students’ engagement in high school: Does the number of negative and positive relationships with teachers matter? Journal of Educational Psychology, 111(5), 861.

Martin, A. J., & Dowson, M. (2009). Interpersonal relationships, motivation, engagement, and achievement: Yields for theory, current issues, and educational practice. Review of Educational Research, 79(1), 327-365.

Mensah, B., & Koomson, E. (2020). Linking teacher-student relationship to academic achievement of senior high school students. Social Education Research, 102-108.

Michigan Virtual. (n.d.). Student Guide to Online Learning. Retrieved from https://michiganvirtual.org/resources/guides/student-guide/

Miller, K. E. (2021). A light in students’ lives: K-12 teachers’ experiences (re)building caring relationships during remote learning. Online Learning, 25(1), 115-134.

Prewett, S. L., Bergin, D. A., & Huang, F. L. (2019). Student and teacher perceptions on student-teacher relationship quality: A middle school perspective. School Psychology International, 40(1), 66-87.

Roblyer, M. D., Davis, L., Mills, S. C., Marshall, J., & Pape, L. (2008). Toward practical procedures for predicting and promoting success in virtual school students. American Journal of Distance Education, 22(2), 90–109.

Roorda, D. L., Koomen, H. M., Spilt, J. L., & Oort, F. J. (2011). The influence of affective teacher-student relationships on students’ school engagement and achievement: A meta-analytic approach. Review of Educational Research, 81(4), 493-529.

Straub, E. O. (2024, January 15). Giving good online feedback. University of Michigan. https://onlineteaching.umich.edu/articles/giving-good-online-feedback/

Turley, C., & Graham, C. (2019). Interaction, student satisfaction, and teacher time investment in online high school courses. Journal of Online Learning Research, 5(2), 169-198.

Van Leeuwen, A., & Janssen, J. (2019). A systematic review of teacher guidance during collaborative learning in primary and secondary education. Educational Research Review, 27, 71-89.

Xu, Z., Zhao, Y., Zhang, B., Liew, J., & Kogut, A. (2023). A meta-analysis of the efficacy of self-regulated learning interventions on academic achievement in online and blended environments in K-12 and higher education. Behaviour & Information Technology, 42(16), 2911-2931. https://doi.org/10.1080/0144929X.2022.2151935

Zion, M., Adler, I., & Mevarech, Z. (2015). The effect of individual and social metacognitive support on students’ metacognitive performances in an online discussion. Journal of Educational Computing Research, 52(1), 50-87. https://doi.org/10.1177/0735633114568855

Zweig, J. (2023). The first week in an online course: Differences across schools. Michigan Virtual. https://michiganvirtual.org/research/publications/first-weeks-in-an-online-course/

Maximizing Professional Learning through Educators’ Perceptions of Utility and Self-Efficacy in Pedagogy-Focused Courses https://michiganvirtual.org/research/publications/maximizing-professional-learning/ Mon, 15 Jul 2024 15:56:12 +0000 https://michiganvirtual.site.strattic.io/?post_type=publication&p=87835

Educators’ perceptions of the utility of information and beliefs about their ability to leverage what they’ve learned are important aspects of their experiences in professional learning (PL) courses, as they are associated with engagement and implementation of course content. A survey sent out to educational professionals who recently completed a pedagogy-focused PL course from Michigan Virtual revealed that just under half of respondents plan to implement what they’ve learned in their classrooms and strongly believe in their ability to be reflective practitioners and implement specific strategies in their teaching pedagogy.


Introduction

What is Professional Learning?

Professional learning (PL) or professional development (PD) typically focuses on developing teachers’ knowledge, skills, or beliefs (e.g., self-efficacy, or the belief in one’s abilities; Gesel et al., 2021). PL enhances teachers’ knowledge, pedagogical strategies, beliefs, or other characteristics that may relate to the quality of their teaching (Bowman et al., 2022; Gesel et al., 2021). In some instances, PL may also include mandatory compliance courses, aligning with certain state or federal standards (e.g., Bloodborne Pathogens). 

Professional Learning Matters: Engagement and Teaching Outcomes

Educators’ engagement in professional learning is pivotal, as it has the potential to expand their pedagogy. While research examining the impact of PL on teaching practices and student learning is still emerging, some promising findings exist (e.g., Roth et al., 2019). Indeed, research has suggested PL can positively impact student achievement (Blank & Alas, 2010; Yoon et al., 2007). Interviews with teachers about their experiences revealed that teachers who were highly engaged in PL reported improved outcomes in the classroom; in turn, this classroom success motivated them to further engage in growing and expanding their pedagogy (Ji, 2021). As such, the structure of PL must engage educators, and the form PL takes matters: students whose teachers participated in practice-focused PL outperformed peers whose teachers participated in content-focused PL on a content knowledge test (Taylor et al., 2017). Overall, there is a close relationship between teachers’ engagement in PL and the impact of PL on teaching outcomes.

So what helps foster teachers’ engagement in professional learning? In her blog, Street Data And Empathy: Revealing What Educators Truly Want From Professional Learning, Michigan Virtual professional learning specialist Anne Perez found that educators wanted asynchronous, expert-driven PL incorporating research-based strategies (Perez, 2023). Educators want choices in how they receive instruction and opportunities to apply the knowledge. Across all career stages, educators agree that PL must be application-based and relevant to their roles (Masuda et al., 2013). Professional learning that allows educators to see how to obtain desired teaching outcomes is crucial (Ji, 2021). Relatedly, highlighting the value or utility of the skills for future situations can foster interest (i.e., utility value; Kale & Akcaoglu, 2018). Indeed, teachers’ willingness to engage in PL is closely tied to its perceived value or importance (Masuda et al., 2013). For example, Zhang & Liu (2019) noted that teachers’ perceptions of the utility of PL communities predicted their engagement with online PL.

The Role of Utility Value and Self-Efficacy in Professional Learning

Looking at more specific practices, PL positively influenced middle and high school teachers’ classroom technology integration by increasing their perceptions of its utility (Bowman et al., 2022). In other words, when teachers perceived value in using technology through exposure during PL, they were more likely to integrate it into their teaching practices. As such, professional learning targeting utility value may be especially effective at increasing specific instructional skills. In addition to increasing teachers’ perceptions of the value of the skills being taught, it is important to improve their self-efficacy (i.e., belief in one’s ability) surrounding those skills through practice. Increasing teachers’ self-efficacy, particularly by providing practice opportunities, may be an important way to encourage instructors to persist in practicing new or challenging skills (Gesel et al., 2021; Palermo & Thomson, 2019).

Indeed, self-efficacy has been noted to predict pedagogical changes to instruction and assessments used in the classroom (Palermo & Thomson, 2019). For example, teachers with high levels of self-efficacy had more positive attitudes toward, and higher levels of implementation of, a new PE curriculum for junior high school students in Greece; these teachers also intended to continue implementing new lesson plans in the future (Gorozidis & Papaioannou, 2011). Teachers’ self-efficacy may also be related to student achievement: a study of sustained professional learning to improve students’ science achievement found that teachers’ self-efficacy and the number of PL hours positively predicted fourth- and sixth-graders’ performance on a state-standardized science test (Lumpe et al., 2012). Taken together, self-efficacy seems to be important for fostering and maintaining pedagogical changes.

Finally, increasing teachers’ self-efficacy is important because it has been associated with satisfaction, job retention, lowered feelings of depersonalization and emotional exhaustion, increased feelings of professional accomplishment, and improved student outcomes (as cited in Crawford et al., 2021; Evers et al., 2002). All in all, educators’ expectations (“Can I do this?”) and attitudes (“Is this useful?”) are important pieces of their experiences with PL. The current study aimed to assess educators’ engagement in PL courses offered at Michigan Virtual, focusing specifically on feelings of self-efficacy and utility value after taking pedagogically focused PL courses.

The Current Study

A previous report examined educators’ engagement patterns and course offerings through Michigan Virtual, finding that educators were generally satisfied with their professional learning (PL) courses, which they pursued to meet specific requirements. Audio/visual elements were reported as the most engaging and helpful for learning. This companion piece aims to deepen our understanding of educators’ goals for enrolling in PL courses through Michigan Virtual and how the course content meets their needs. By understanding educators’ feelings of utility value and self-efficacy, we can better understand how they plan to apply what they’ve learned and ensure that courses are designed to facilitate optimal learning outcomes. This is particularly important as the perception of utility value and self-efficacy during professional learning can foster positive outcomes for educators and their students.

The goals of the current study led to the following specific research questions:

  • What are educators’ goals when enrolling in a course? 
  • What do educators hope to achieve or get out of their courses?
  • What course design elements (e.g., videos, readings, discussion boards) do educators find most valuable?
  • How do educators plan to apply course content to their current roles?
  • After completing a course, to what extent do educators believe they can engage in certain pedagogical behaviors? 
  • After completing a course, to what extent do educators feel they have demonstrated their knowledge and skills and can apply course concepts in their roles?

Methods 

Educators who completed pedagogy-focused professional learning courses at Michigan Virtual within a 90-day period (December 26th, 2023 to March 19th, 2024) were contacted via email (n = 3083) to take a brief survey about their experience. The survey was developed based on a literature review of established teacher self-efficacy and utility value measures, as well as expert input. The survey consisted of 12 questions: 11 close-ended and one open-ended. For a list of included courses, see Appendix A.

Results

Sample / Data Overview

Excluding those who completed less than 50% of the survey (n = 38, 17.67%) resulted in a total sample size of n = 177. Of the 177 respondents, the two most populous courses were ‘Differentiated Instruction: Maximizing Learning for All’ (n = 25, 14.12%) and ‘Social-Emotional Learning: Introduction to SEL’ (n = 20, 11.30%).
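
As a minimal sketch of this exclusion rule (hypothetical column names and data, not the authors’ code):

```python
# Minimal sketch: drop respondents who completed less than 50% of the survey.
import pandas as pd

responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "pct_complete": [100, 40, 75, 10],  # hypothetical completion percentages
})
analytic_sample = responses[responses["pct_complete"] >= 50]
print(len(analytic_sample))  # 2 of 4 toy respondents retained
```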

Most educators indicated their primary role was ‘Teacher’ (n = 124, 70.06%), while a smaller proportion of respondents were ‘Other’ (n = 15, 8.47%), ‘School Support Staff’ (n = 10, 5.65%), ‘Administrator’ (n = 8, 4.52%), ‘Educational Coach’ (n = 7, 3.95%), ‘Paraprofessional/Educator Support Staff’ (n = 7, 3.95%), ‘Counselor’ (n = 3, 1.69%), or ‘Specialist’ (n = 3, 1.69%). ‘Other’ responses included childcare providers and assistant professors/teachers. Because educators could skip questions, instances where the total number of responses is less than 177 are noted. See Figure 1 below. 

Figure 1. Educators’ Primary Role

What are educators’ goals when enrolling in a PL course?

Educators’ primary goal when enrolling in PL was obtaining ‘free or inexpensive SCECHs’ (State Continuing Education Clock Hours; n = 45, 25.42%), closely followed by ‘fulfilling professional development requirements’ (n = 42, 23.73%). Figure 2 below shows all the primary reasons for enrolling in PL courses through Michigan Virtual. In the survey, we also asked about additional reasons for enrollment.

The most common additional or secondary reason for enrollment was that the course content ‘addresses specific classroom or professional needs’ (n = 64, 37.43%). Among those who did not select it as the primary reason, ‘free or inexpensive SCECH’ (n = 64, 37.43%) was commonly chosen as a secondary reason for enrollment. The total number of responses to this question was 171.

Figure 2. Primary Reason for Enrollment

What do educators hope to achieve or get out of their PL courses?

Educators were given a list of goals for their experiences with their professional learning course and asked to select all that applied (see Figure 3 below). Because educators could choose multiple goals, the total number of responses exceeds the number of educators who responded (n = 177); the data presented reflect the number of times each option was selected. ‘Understanding how to teach more effectively’ was the most commonly selected goal, chosen 114 times (69.09%). Conversely, having ‘time and space to think’ was the least commonly selected goal, chosen only 57 times (34.55%).

Figure 3. Goals for Professional Learning Experience

What course design elements (e.g., videos, readings, discussion boards) do educators find most valuable?

Educators were given a pre-defined list of course design elements and asked to choose all options they believed were useful or valuable to their experience in the course (see Figure 4 below). Because educators could choose multiple course design elements, the total number of responses exceeds the number of educators who responded to this question (n = 176). All course elements were selected as being ‘useful or valuable’ at least once, although the frequency with which elements were chosen varied. ‘Video/Audio’ elements were selected most often (n = 123) while ‘Podcasts’ were selected the fewest times (n = 8). 

Educators were then asked which of their previously selected course design elements were the most useful or valuable. The top two course design elements educators chose as most useful or valuable were ‘Video/Audio’ (n = 55) and ‘Readings’ (n = 33). 

Figure 4. Course Design Elements Indicated as Useful or Valuable in the Course

How do educators plan to apply course content to their current roles? 

The survey included one open-ended question asking educators how they plan to use the course content in their current roles. After coding their responses, just over half (n = 60, 54.05%) of the responses indicated an intention to ‘apply the information and skills directly to their classroom practices and pedagogy.’ Within these responses, educators highlighted how courses helped them move toward student-centered learning practices by focusing on personalization and providing choice and voice. For example, one educator reported the course content will help them include “more variety” when helping students learn concepts and “to differentiate instruction.” Educators also reported moving toward mastery/competency-based approaches:

“Instead of whole group instruction, I have grouped students by ability and offered different ways for students to show mastery.”

The remainder of the responses were split into a few categories shown in Table 1 below.

Table 1. Plan for Applying Course Content

| Theme | n (%) | Example |
| Apply Directly to Classroom | 60 (54.05%) | “This course showed me that making the content relevant to the lives of students will allow them to engage.” |
| Relationship-building and Communication Practices | 16 (14.41%) | “To build better relationships with my students.” |
| Build Community or Share PL Learnings | 13 (11.71%) | “Explaining the process to my teachers & showing them how to use that knowledge in the classroom.” |
| Understand/Aware of Social Emotional Needs | 10 (9.01%) | “Help create an effective and safe learning environment for all students.” |
| Reflective Practice | 8 (7.21%) | “I will use what I learned in this course to reflect upon how to approach the second round of assessment in my new and current place of employment.” |
| Teaching Hiatus | 4 (3.60%) | “I am taking a year off and am only now applying for a new position.” |

Note: total n = 111
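
The percentages in Table 1 are each theme’s count over the 111 coded responses; a quick verification sketch (our own illustration, not the authors’ code):

```python
# Recompute Table 1 percentages from the reported theme counts (total n = 111).
theme_counts = {
    "Apply Directly to Classroom": 60,
    "Relationship-building and Communication Practices": 16,
    "Build Community or Share PL Learnings": 13,
    "Understand/Aware of Social Emotional Needs": 10,
    "Reflective Practice": 8,
    "Teaching Hiatus": 4,
}
total = sum(theme_counts.values())  # 111
for theme, count in theme_counts.items():
    print(f"{theme}: {count} ({count / total:.2%})")  # e.g., 60 -> 54.05%
```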

Just under 15% of responses emphasized applying course content to ‘improve relationship-building and communication with students, guardians, and colleagues’ (n = 16). Sharing what they learned was also common, as 11.71% of responses (n = 13) indicated a plan to take what they’ve learned back to their teachers and districts to encourage professional learning and the use of best practices. ‘Understanding the importance of social-emotional health’ was also a common takeaway from course content (n = 10, 9.01%), as highlighted by one educator who noted,

“SEL has a very direct impact on my role with behavior regulation, modification, and support within students. Making a deeper connection with students based on their background and interests will be a core part of my position as an educator.”

A smaller subset of responses (n = 8, 7.21%) noted leveraging course content to become more ‘reflective about their pedagogy.’ A few educators noted that they did not have concrete plans for implementing the knowledge and skills gained from their engagement with PL as they were not currently teaching. These responses were categorized as being on a ‘Teaching Hiatus’ (n = 4, 3.60%). 

After completing a PL course, to what extent do educators believe they can engage in certain pedagogical behaviors?

Educators reported their beliefs about their abilities on two sets of pedagogical behaviors: working directly with students (student-focused pedagogy) and being reflective practitioners (teacher-focused pedagogy). 

Student-Related Pedagogy Behaviors

After completing their professional learning course, educators were asked about their beliefs in their abilities to engage in several student-focused pedagogy behaviors. Educators were most likely to report being ‘quite a bit’ confident in their ability to ‘motivate students who show low interest in school work’ (35.23%). Regarding beliefs about ‘getting students to believe they can do well in school,’ educators were equally likely to believe ‘a great deal’ and ‘quite a bit’ (32.95%) in their skills. Thirty-two percent of educators believed ‘quite a bit’ in their ability to ‘help students value learning.’ Educators’ beliefs about their abilities to ‘assist families in helping their children do well in school’ were a bit more split; however, 29.14% believed in their abilities ‘quite a bit.’

Educators also generally believed ‘quite a bit’ in their abilities across several assessment-focused pedagogy measures. Thirty-three percent of teachers believed ‘quite a bit’ that they could ‘craft good questions for students,’ and 32.00% believed ‘quite a bit’ in their ability to ‘provide an alternative explanation or example when students are confused.’ Beliefs surrounding ‘using a variety of assessment strategies’ were more spread out, but 29.14% believed in their skills ‘quite a bit.’ Notably, 36.57% of educators believed ‘a great deal’ in their ability to ‘implement alternative strategies in the classroom.’ After completing their respective PL courses, very few participants (~1-6%) did not believe in their abilities ‘at all.’ Figure 5 and Table 2 show the percentage of educators who rated their self-efficacy on various student-focused pedagogy behaviors.  

Figure 5. Self-Efficacy for Student-Focused Pedagogy

Table 2. Self-Efficacy for Student-Focused Pedagogy

| Behavior | A Great Deal | Quite a Bit | Some | Very Little | Not at All | Total n |
| Motivate | 27.84% | 35.23% | 23.86% | 5.68% | 1.14% | 176 |
| Do Well | 32.95% | 32.95% | 24.43% | 4.55% | 1.14% | 176 |
| Value Learning | 28.00% | 32.00% | 28.00% | 6.29% | 1.14% | 175 |
| Assist Families | 22.29% | 29.14% | 27.43% | 9.14% | 6.29% | 175 |
| Craft Good Questions | 30.29% | 33.14% | 23.43% | 5.71% | 1.71% | 175 |
| Assessment | 28.57% | 29.14% | 26.86% | 7.43% | 2.86% | 175 |
| Alternative Explanation | 30.86% | 32.00% | 24.00% | 6.86% | 1.71% | 175 |
| Alternative Strategies | 36.57% | 28.57% | 24.57% | 3.43% | 2.86% | 175 |

Teacher Related Pedagogy Behaviors

Educators were asked about their beliefs in their abilities to engage in multiple teacher-related pedagogy behaviors after taking their professional learning course. Overall, educators were ‘a great deal’ confident in their skills post-professional learning. Indeed, 47.59% of educators believed ‘a great deal’ in their ability to ‘reflect on their own classroom strategies and practices,’ while approximately 40% reported ‘a great deal’ of confidence in ‘reflecting on students’ learning and progress.’ 

Similarly, educators were confident in their ability to be growth-oriented teachers, as 44.85% believed ‘a great deal’ in their ability to ‘identify areas for growth and improvement in their teaching practices’ and 36.20% felt ‘a great deal’ of confidence in their ability to ‘seek out feedback about teaching or instructional approaches.’ Most educators also believed ‘a great deal’ (39.52%) or ‘quite a bit’ (38.92%) in their ability to ‘engage in meaningful professional learning/development.’ Finally, 41.92% of educators reported believing ‘a great deal’ in their ability to ‘consider the importance of understanding students’ cultural backgrounds and experiences.’ Figure 6 and Table 3 show the percentage of educators who rated their self-efficacy on teacher-focused pedagogy behaviors.

Figure 6. Self-Efficacy for Teacher-Focused Pedagogy Behaviors

Table 3. Self-Efficacy for Teacher-Focused Pedagogy Behaviors

| Behavior | A Great Deal | Quite a Bit | Some | Very Little | Not at All | Total n |
| Reflect Own Practices | 47.59% | 36.75% | 12.05% | 3.61% | 0.00% | 166 |
| Reflect Student Learning | 40.61% | 36.97% | 20.00% | 1.82% | 0.61% | 165 |
| Growth | 44.85% | 33.94% | 20.61% | 0.61% | 0.00% | 165 |
| Feedback | 36.20% | 28.83% | 30.06% | 3.07% | 1.84% | 163 |
| Meaningful Learning | 39.52% | 38.92% | 17.96% | 2.99% | 0.60% | 167 |
| Cultural Background | 41.92% | 31.14% | 22.75% | 2.99% | 1.20% | 167 |

After completing a PL course, to what extent do educators feel they have demonstrated their knowledge and skills and can apply course concepts to their roles? 

After completing their respective PL courses, most participants believed ‘quite a bit’ in their ability to demonstrate and apply their skills and knowledge. Indeed, over half (56.55%) of educators believed ‘quite a bit’ that their ‘knowledge of the content in this course has been accurately assessed’ and that they ‘grasped the content covered in this course’ (60.36%). Similarly, just over half of educators reported ‘quite a bit’ of confidence in their ability to ‘apply what they’ve learned in this course to their current role’ (55.03%) and ‘incorporate what they’ve learned in this course into lesson plans within the next six months’ (50.60%). Finally, just under half (48.24%) believed ‘a great deal’ in the ‘importance of their role as an educator.’ Very few, if any, participants reported not believing in their ability to use or demonstrate skills and knowledge at all (0-2%). Figure 7 and Table 4 show the percentage of educators who rated the utility value regarding knowledge and skills derived from concepts covered in their course.

Figure 7. Utility Value

Table 4. Utility Value

| Value | A Great Deal | Quite a Bit | Some | Very Little | Not at All | Total n |
| Knowledge Assessed | 21.43% | 56.55% | 20.83% | 1.19% | 0.00% | 169 |
| Grasped Content | 24.85% | 60.36% | 14.79% | 0.00% | 0.00% | 169 |
| Current Role | 20.71% | 55.03% | 20.71% | 2.96% | 0.59% | 169 |
| Lesson Plan | 25.60% | 50.60% | 19.64% | 2.38% | 1.79% | 169 |
| Importance | 48.24% | 31.76% | 18.24% | 1.18% | 0.59% | 170 |

Conclusions

Educators indicated their primary goal for enrolling in professional learning courses through Michigan Virtual was to ‘obtain free or inexpensive SCECHs,’ highlighting the importance of providing accessible course offerings. Indeed, a previous analysis indicated that PL courses offered through Michigan Virtual cost about $38.00 on average; however, half of the courses offered cost less than $5.00 (Cuccolo & DeBruler, 2023).

Educators’ secondary goal for enrolling in professional learning courses was to ‘meet professional development requirements.’ The structure of most PL course offerings (online, asynchronous) likely aligns with educators’ preferences, allowing them to meet professional development requirements in a way that works for them and their unique schedules (Perez, 2023).

The most reported goal for engaging in pedagogy-focused professional learning was to ‘teach more effectively.’ Given research suggesting that educators benefit most from practice-focused PL (Taylor et al., 2017), this finding underscores the importance of giving educators a chance to practice the skills taught, emphasizing those skills’ usefulness, and building confidence around skill implementation, as these factors may influence teachers’ use of and persistence with the skills, as well as student outcomes (Ji, 2021; Kale & Akcaoglu, 2018; Masuda et al., 2013; Zhang & Liu, 2019).

The top three most useful and valuable course elements were ‘video/audio,’ ‘readings,’ and ‘scenarios,’ underscoring educators’ desire for engaging content and design elements that provide “real world” or practical examples (Cuccolo & DeBruler, 2023). Scenarios may help educators envision the skills or knowledge in their classrooms or how they may implement what they’re learning in similar contexts. To maximize outcomes, PL courses should continue to draw educators in with engaging design elements, while providing plenty of opportunities to practice. 

About half of the educators sampled had plans to use the course content ‘directly in their classrooms.’ However, just over 13% did not have plans to leverage course content in any way. While a portion of that was due to respondents not currently teaching, it is important to emphasize the utility of course content and provide opportunities for practice so that what educators learn can reach students and colleagues.

Most respondents believed ‘quite a bit’ in their abilities to ‘apply what they’ve learned to their work in the classroom with students.’ While this is a positive finding, as self-efficacy has been associated with student achievement and increased job satisfaction (Crawford et al., 2021; Evers et al., 2002; Gorozidis & Papaioannou, 2011; Lumpe et al., 2012; Palermo & Thomson, 2019), we should continue to strive toward increasing educators’ beliefs in their abilities to further the benefits of PL for both students and teachers. After completing their PL courses, educators believed ‘a great deal’ in their abilities to be ‘reflective practitioners.’ PL courses should continue to provide opportunities for reflection and emphasize its importance.

Consistent with the proportion of respondents who planned to apply course concepts directly in their classrooms, 55% of educators reported being ‘quite a bit’ confident in their ability to ‘apply course concepts.’ Because research suggests that self-efficacy (belief in one’s abilities) is related to pedagogical change, there may be overlap between those who felt confident applying course concepts and those who planned to do so. This underscores the importance of increasing educators’ self-efficacy through opportunities for practice and feedback. Most educators also indicated feeling ‘quite a bit’ confident in their understanding of course content.

Importantly, about 48% of educators believed ‘a great deal’ in the ‘importance of their role.’ Previous research has indicated that teachers’ perceptions of the value of teaching (both personally and for society) were associated with unique dimensions of teaching quality, such as classroom management and instructional clarity (Ouwehand et al., 2022); thus, continuing to build educators’ perceptions of their value may have important implications for their pedagogy.

References 

Bowman, M. A., Vongkulluksn, V. W., Jiang, Z., & Xie, K. (2022). Teachers’ exposure to professional development and the quality of their instructional technology use: The mediating role of teachers’ value and ability beliefs. Journal of Research on Technology in Education, 54(2), 188-204. https://doi.org/10.1080/15391523.2020.1830895

Cuccolo, K., & DeBruler, K. (2023). Evaluating Professional Learning Course Offerings and Educator Engagement. Michigan Virtual. https://michiganvirtual.org/research/publications/evaluating-professional-learning-course-offerings-and-educator-engagement/

Darling-Hammond, L., Hyler, M. E., & Gardner, M. (2017). Effective teacher professional development. Palo Alto, CA: Learning Policy Institute. https://learningpolicyinstitute.org/product/teacher-prof-dev

Evers, W. J., Brouwers, A., & Tomic, W. (2002). Burnout and self‐efficacy: A study on teachers’ beliefs when implementing an innovative educational system in the Netherlands. British Journal of Educational Psychology, 72(2), 227-243. https://doi.org/10.1348/000709902158865

Gesel, S. A., LeJeune, L. M., Chow, J. C., Sinclair, A. C., & Lemons, C. J. (2021). A meta-analysis of the impact of professional development on teachers’ knowledge, skill, and self-efficacy in data-based decision-making. Journal of Learning Disabilities, 54(4), 269-283. https://doi.org/10.1177/0022219420970196 

Gorozidis, G., & Papaioannou, A. (2011). Teachers’ self-efficacy, achievement goals, attitudes and intentions to implement the new Greek physical education curriculum. European Physical Education Review, 17(2), 231-253. https://doi.org/10.1177/1356336X11413654 

Kale, U., & Akcaoglu, M. (2018). The role of relevance in future teachers’ utility value and interest toward technology. Educational Technology Research and Development, 66(2), 283-311. https://doi.org/10.1007/s11423-017-9547-9

Lumpe, A., Czerniak, C., Haney, J., & Beltyukova, S. (2012). Beliefs about teaching science: The relationship between elementary teachers’ participation in professional development and student achievement. International Journal of Science Education, 34(2), 153-166. https://doi.org/10.1080/09500693.2010.551222

Masuda, A. M., Ebersole, M. M., & Barrett, D. (2013). A qualitative inquiry: Teachers’ attitudes and willingness to engage in professional development experiences at different career stages. Delta Kappa Gamma Bulletin, 79(2), 6-14.

Ouwehand, K. H., Xu, K. M., Meeuwisse, M., Severiens, S. E., & Wijnia, L. (2022). Impact of school population composition, workload, and teachers’ utility values on teaching quality: Insights from the Dutch TALIS-2018 data. Frontiers in Education, 7, 815795. https://www.frontiersin.org/articles/10.3389/feduc.2022.815795/full

Palermo, C., & Thomson, M. M. (2019). Large-scale assessment as professional development: Teachers’ motivations, ability beliefs, and values. Teacher Development, 23(2), 192-212. https://doi.org/10.1080/13664530.2018.1536612

Perez, A. (2023, July 27). Street Data And Empathy: Revealing What Educators Truly Want From Professional Learning. [Blog]. Michigan Virtual. Retrieved from https://michiganvirtual.org/blog/street-data-and-empathy

Roth, K. J., Wilson, C. D., Taylor, J. A., Stuhlsatz, M. A., & Hvidsten, C. (2019). Comparing the effects of analysis-of-practice and content-based professional development on teacher and student outcomes in science. American Educational Research Journal, 56(4), 1217-1253. https://doi.org/10.3102/0002831218814759

Taylor, J. A., Roth, K., Wilson, C. D., Stuhlsatz, M. A., & Tipton, E. (2017). The effect of an analysis-of-practice, videocase-based, teacher professional development program on elementary students’ science achievement. Journal of Research on Educational Effectiveness, 10(2), 241-271. https://doi.org/10.1080/19345747.2016.1147628

Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. (2007). Reviewing the evidence on how teacher professional development affects student achievement (Issues & Answers Report, REL 2007–No. 033). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. Retrieved from http://ies.ed.gov/ncee/edlabs 

Zhang, S., & Liu, Q. (2019). Investigating the relationships among teachers’ motivational beliefs, motivational regulation, and their learning engagement in online professional learning communities. Computers & Education, 134, 145-155. https://doi.org/10.1016/j.compedu.2019.02.013

Appendix A: Course Names
5D/D+ Classroom Environment & Culture
5D/D+ Student Engagement
Changing Minds to Address Poverty – Achievement Mindset
Changing Minds to Address Poverty – Engagement Mindset
Changing Minds to Address Poverty – Enrichment Mindset
Changing Minds to Address Poverty – Mindsets Matter (Introduction)
Changing Minds to Address Poverty – Positivity Mindset
Changing Minds to Address Poverty – Relational Mindset
Changing Minds to Address Poverty in the Classroom – Graduation Mindset
Changing Minds to Address Poverty in the Classroom – Rich Classroom Climate Mindset
Differentiated Instruction: Maximizing Learning for All
Discover Sign Language
Educator Evaluation in MI: Measurement of Student Growth
Educator Evaluation in MI: Preparing for Formative Review
Elementary Career Awareness eResources in MeL
Equity in Online Learning for Multilingual Students
Essential Instructional Practices in Early Literacy: K-3 Essential 1
Essential Instructional Practices in Early Literacy: K-3 Essential 2
Essential Instructional Practices in Early Literacy: K-3 Essential 3
Essential Instructional Practices in Early Literacy: K-3 Essential 4
Essential Instructional Practices in Early Literacy: K-3 Essential 5
Essential Instructional Practices in Early Literacy: K-3 Essential 6
Essential Instructional Practices in Early Literacy: K-3 Essential 7
Essential Instructional Practices in Early Literacy: K-3 Essential 9
Essential Instructional Practices in Early Literacy: Pre-K Essential 1
Essential Instructional Practices in Early Literacy: Pre-K Essential 2
Essential Instructional Practices in Early Literacy: Pre-K Essential 5
Essential Instructional Practices in Early Literacy: Pre-K Essential 7
Essential Instructional Practices in Early Literacy: Pre-K Essential 9
Essential Instructional Practices in Early Literacy: School-wide and Center-wide Practices
Fostering Student Agency Through Positive Relationships
Grammar Refresher
Grammar Refresher II
Inquiry-Based Learning in Secondary Mathematics Education
Inquiry-Based Learning in Secondary Science Education
Intro to Computer Science for Middle School Educators
Intro to Online Course Facilitation: Practical Knowledge
Intro to Universal Design for Learning: Action & Expression
Introduction to Blended Learning for School Leaders
Introduction to Early Childhood Standards of Quality for B-K
Introduction to Phenomenal Science
Modern Classrooms Project Essentials
Online National Standards 2: Course Content & Design
Online National Standards 3: Assessment
Online Teacher’s Guide: Announcements
OTL Level 1 – Course Content
OTL Level 1 – Discussion Boards
OTL Level 1 – Grading and Feedback
OTL Level 1 – Supporting Exceptional Students
Rethinking Classroom Practices with ChatGPT
Social-Emotional Learning: Adult SEL and Self-Care
Social-Emotional Learning: Creating a Professional Culture Based on SEL
Social-Emotional Learning: Embedding SEL Schoolwide
Social-Emotional Learning: Equity Elaborations
Social-Emotional Learning: Integrating SEL within MTSS
Social-Emotional Learning: Introduction to SEL
Social-Emotional Learning: Trauma-Informed Support
Teaching Preschool: A Year of Inspiring Lessons
The Creative Classroom
TRAILS Social and Emotional Learning (SEL) Curriculum
Whole Child & Continuous Improvement: A Deeper Understanding (MI Only)
AI in Education: Student Usage in Online Learning https://michiganvirtual.org/research/publications/ai-in-education-student-usage-in-online-learning/ Fri, 21 Jun 2024 15:16:51 +0000 https://michiganvirtual.site.strattic.io/?post_type=publication&p=87751

The integration of AI in education has the potential to revolutionize the learning experience for students. AI-powered tools and platforms can provide personalized learning experiences tailored to individual students' needs, learning styles, and abilities.


Introduction

The rapid advancement of artificial intelligence (AI) technology has permeated various aspects of modern society, including the field of education. As AI systems become increasingly sophisticated, their potential applications in the classroom setting have garnered significant attention from educators, researchers, and policymakers alike. Large language models (LLMs), such as ChatGPT or Claude, are types of AI that can interact with users responsively across a wide variety of applications and subjects, including teaching and learning. One of the critical areas of interest is the usage of AI by students for educational purposes, which presents both opportunities and challenges.

The integration of AI in education has the potential to revolutionize the learning experience for students. AI-powered tools and platforms can provide personalized learning experiences tailored to individual students’ needs, learning styles, and abilities (Zawacki-Richter et al., 2019). For example, intelligent tutoring systems, which have been around for much longer than the new LLMs, can adapt to a student’s level of understanding and provide targeted feedback and support, fostering a more effective and engaging learning process (Beck, Stern, & Haugsjaa, 1996; Kulik & Fletcher, 2016).

A recent study by the Walton Family Foundation (2024) focusing on parents, teachers, and students found that, in the past year, AI has become deeply integrated into the education system, with increasing awareness among teachers, parents, and students and a roughly 50% usage rate across all three groups. Despite a rise in negative perceptions, the general sentiment among these groups remains positive, particularly among those who have personally used AI and see its beneficial applications in educational settings.

According to a RAND study (2024), as of fall 2023, 18 percent of K–12 teachers regularly used AI for teaching purposes, and another 15 percent had experimented with AI at least once. AI usage was more prevalent among middle and high school teachers, especially those teaching English language arts or social studies. Most teachers who integrated AI into their teaching employed virtual learning platforms, adaptive learning systems, and chatbots on a weekly basis. The primary applications of these AI tools included customizing instructional content to meet students’ needs and generating educational materials.

By the conclusion of the 2023–2024 academic year, 60 percent of school districts aim to have provided AI training for teachers, with urban districts being the least likely to implement such training. In interviews, educational leaders expressed a stronger emphasis on encouraging AI adoption among teachers rather than formulating policies for student use, citing the potential of AI to streamline teachers’ tasks.

Most other current research on the effectiveness of AI in education focuses on adult learners and generally leans positive with respect to achievement and perceptions (Kumar & Raman, 2022; Zheng et al., 2023). These studies also show that adult students believe AI is best used by students and teachers to assist with assignment creation or completion, as well as for tutoring systems or research assistants, as opposed to automated assessment use cases.

However, most K-12 teachers, parents, and students feel that their schools are not adequately addressing AI. They report a lack of policies, insufficient teacher training, and unmet demands for preparing students for AI-related careers. This absence of formal school policies means that AI is often used without official authorization, leaving students, parents, and teachers to navigate its use independently. There is a strong preference among all stakeholders for policies that explicitly support and thoughtfully integrate AI into education.

Emerging literature indicates that students have varying opinions on the usage of AI tools and that AI has the potential to assist students in learning and ultimately increase achievement, regardless of whether the learners are adults or children (Martínez, Batanero, Cerero, & León, 2023; Trisoni et al., 2023).

Furthermore, AI can assist students in various tasks, such as information retrieval, research, writing, and problem-solving. Language models like ChatGPT, developed by OpenAI, have demonstrated remarkable capabilities in generating human-like text, potentially aiding students in their writing assignments and research projects (Bender et al., 2021; Zheng, Niu, Zhong, & Gyasi, 2023). Additionally, AI-powered virtual assistants can provide on-demand support and guidance, acting as digital tutors or study companions (Winkler & Söllner, 2018; Chen, Jensen, Albert, Gupta, & Lee, 2023).

However, AI use by students also raises ethical concerns and potential risks. One of the primary concerns is the potential for academic dishonesty, as students may be tempted to pass off AI-generated content as their own, raising issues of plagiarism and cheating (Driessen & Gallant, 2022; McGehee, 2023). There is also a risk of AI systems perpetuating biases and misinformation, as these systems are trained on existing data that may reflect societal biases or contain inaccuracies (Mehrabi et al., 2021).

Moreover, the increasing reliance on AI tools may impact students’ critical thinking and problem-solving skills, as they become more dependent on these systems for tasks that traditionally required human reasoning and creativity (Luckin et al., 2016). Additionally, there are concerns about the potential for AI to widen the digital divide, as access to advanced AI tools and resources may be limited for certain socioeconomic groups, potentially exacerbating existing educational inequalities (Selwyn, 2019).

Significance of the Study 

This study holds significant importance for several reasons. It is the second study by Michigan Virtual in the field of artificial intelligence and education; it serves as a companion piece to earlier work with educators and seeks to inform Michigan Virtual’s current and future work, as a thought leader, in the space of artificial intelligence and education.

First, it addresses a rapidly evolving technological landscape that is reshaping education. As AI advances and becomes more integrated into various aspects of society, it is crucial to understand its implications for the education sector, particularly from the perspective of students, who are the primary consumers of educational services.

Second, this study intends to fill a gap in the literature by examining how students utilize AI (use cases), what they think about AI (perceptions), their frequency of AI usage, and how these variables, individually or grouped, relate to demographic characteristics and the central student outcome of achievement, or grades.

Third, this study’s findings have the potential to inform policymakers, educational institutions, and educators about the current state of AI adoption among students. By understanding students’ motivations, perceptions, and experiences with AI tools, stakeholders can develop strategies and policies that promote responsible and effective AI usage in educational settings. This knowledge can guide the development of guidelines, best practices, and training programs to ensure that AI is leveraged in a manner that enhances learning outcomes while mitigating potential risks and ethical concerns.

Finally, the insights gained from this study can contribute to the broader discourse on the role of AI in education and its implications for pedagogical approaches, curriculum design, and the future of teaching and learning. As AI becomes more integrated into educational processes, it is crucial to critically examine its impact on traditional teaching methods, assessment strategies, and the overall learning experience. The study’s findings can inform discussions and decision-making processes related to the effective integration of AI in a manner that complements and enhances human-centric educational practices.

In summary, this study’s significance lies in its potential to inform policies, practices, and decision-making processes related to the responsible and equitable integration of AI in education. Current research indicates that AI tools may have multiple benefits, including positive impacts on achievement.

Research Questions 

This study used several research questions to fulfill the purpose of understanding online student usage habits of AI, opinions regarding AI, and any relationships those key factors may have with outcome and demographic variables such as achievement, grade level, or course subject area. 

  1. What are students’ AI Usage Habits?
  2. Are there differences in student achievement based on AI usage?
  3. Are there differences in student perceptions of AI?
  4. What differences exist in student usage of AI?

Methodology 

To address the research questions and fulfill the purpose of this study, a causal-comparative approach was used, meaning that no treatment was applied, no groups were randomized, and all of the data utilized already existed within Michigan Virtual databases. 

Data utilized were collected over a period of 1-2 months in early 2024 and consisted of LMS data and End of Course survey data.

Variables and Instrumentation

The data for this study were composed of variables created from two different instruments and are described in detail below. Each data source was anonymized and matched by using unique student ID codes.
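
A minimal sketch of this matching step, with hypothetical column names and toy data (not the authors’ code), is below:

```python
# Minimal sketch: match anonymized LMS records to End of Course survey
# responses on a unique student ID code (column names are assumptions).
import pandas as pd

lms = pd.DataFrame({"student_id": ["a1", "b2", "c3"],
                    "course_grade": [88.0, 72.5, 95.0]})
survey = pd.DataFrame({"student_id": ["a1", "c3"],
                       "ai_usage": ["Yes", "No"]})

# An inner join keeps only students present in both sources.
matched = lms.merge(survey, on="student_id", how="inner")
print(matched)
```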

LMS and Publicly Available Data 

Data from the LMS and publicly available district information was matched with End of Course Survey data, and included the following information on students:

  • Course Grades
  • Number of Current Enrollments at Michigan Virtual
  • IEP Status
  • Course Type (Remedial, AP, etc)
  • Course Subject
  • Public District Information – district size, free and reduced lunch proportions
    • Locale – defined by population density in four categories 
    • SES – percentage of students in a district that qualify for free or reduced lunch, divided into four categories. This is not an individual student’s socioeconomic status, but rather a summary of the district’s status that the student belongs to.
      • The thresholds for the categories are as follows:
        • Low – Below 25%
        • Moderate – Between 25 and 50%
        • High – 50% to 74%
        • Very High – 75% and above

All of these data points were used as variables in this study. The number of Current Enrollments, IEP Status, and Course Type are all variables that historically have been used as covariates and moderating variables when predicting student achievement with Michigan Virtual students, and thus were also used in this study when achievement was the dependent variable in question.
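
The SES bands above follow a simple threshold rule; here is a minimal sketch of that rule (the function name and input format are assumptions for illustration):

```python
# Minimal sketch of the district SES bands from the thresholds listed above.
def ses_category(pct_free_reduced_lunch: float) -> str:
    """Map a district's free/reduced lunch percentage to an SES band."""
    if pct_free_reduced_lunch < 25:
        return "Low"
    elif pct_free_reduced_lunch < 50:
        return "Moderate"
    elif pct_free_reduced_lunch < 75:
        return "High"
    return "Very High"

print(ses_category(18.0))  # Low
print(ses_category(62.5))  # High
```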

End of Course Survey

Data from student End of Course surveys were matched with LMS data, and included the following information on students:

  • Satisfaction 
  • Ease of Use (Predictor Variables)
    • Comprised of multiple items regarding the ease of use of the learning platform 
  • Teacher Responsiveness and Care (Predictor Variables)
    • Comprised of multiple items regarding teacher/student communication 
  • Prior Experience and Effort (Predictor Variables)
    • Comprised of multiple items that summarize a student’s prior experience with online learning and general effort put towards learning.
  • Tech Issues (Predictor Variables)
    • Comprised of multiple items that indicate any difficulties or problems students may have had in the course from a technical standpoint, and if/how they were resolved.
  • AI Usage, AI Use Cases, and AI Perceptions

In this study, “Predictor Variables” refers to variables included in certain analyses for Michigan Virtual because they have previously been identified as important in predicting student satisfaction and success.

While all of these variables were included in the exploratory phase of this study, not all information from the End of Course survey was included in the final analysis due to relevance or usefulness in model construction. The most important items extracted from the survey were the key variables of AI usage, AI Use Cases, and AI Perceptions. These are described in detail below.

  • AI Usage
    • This variable was captured by asking all students whether they have utilized AI during the completion of their online Michigan Virtual courses. Answers were binary Yes/No.
    • Students were clearly told that this would be kept confidential and that it would in no way penalize them or affect their grades.
  • AI Use Cases
    • This variable set consisted of multiple select use cases for AI tools, and was only answered by students that answered “Yes” to the AI Usage item.
    • Choices included:
      • Summarizing information
      • Conducting research/finding information
      • Writing and editing assistant
      • Explaining complicated concepts or principles in simpler terms
      • Tutor/Teacher
      • Create study guides or sample test questions
      • Other (write-in)
    • During the analysis, it became clear that each of the choices could be categorized into one of two distinct groups: use as a tool and use as a facilitator. These two categories were created and used in subsequent analyses; they are discussed at length later in the study and described briefly below, and a minimal coding sketch follows this list.
      • Use as a facilitator – included selections in which AI supported students in still taking on the main task of learning, rather than completing a specific task for them.
      • Use as a tool – included selections where students used AI for a very specific task to get a specific result: calculation, information retrieval, editing, summarization, etc.
  • AI Perceptions
    • This variable was captured by asking all students to share how they perceive the usefulness of AI tools in learning, recorded on a 5-point Likert-type scale ranging from Not Useful at All (1) to Extremely Useful (5), with a neutral midpoint (3).
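The minimal coding sketch referenced above shows one way the facilitator/tool binning could be implemented. The category assignments follow the classification reported later in Table 2; the function name and data shapes are assumptions for illustration, not the study’s actual code.

```python
# Survey choices grouped per the facilitator/tool split described above.
FACILITATOR = {
    "To explain complicated concepts or principles in simpler terms",
    "To create study guides or sample test questions",
    "As tutor / teacher",
}
TOOL = {
    "To conduct research/find information",
    "To summarize information",
    "As a writing and editing assistant",
}

def bin_use_cases(selections):
    """Collapse a student's multi-select responses into tool / facilitator / both."""
    has_facilitator = any(s in FACILITATOR for s in selections)
    has_tool = any(s in TOOL for s in selections)
    if has_facilitator and has_tool:
        return "both"
    return "facilitator" if has_facilitator else "tool"

# Example: summarizing (tool) plus tutoring (facilitator) bins as "both".
print(bin_use_cases(["To summarize information", "As tutor / teacher"]))
```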

Analysis Methods

This was a largely exploratory study addressing multiple research questions, each of which used a variety of data analysis techniques. To simplify the explanation, Table 1 matches analytical methods and key variables to each research question.

Table 1. Analysis Methods

Research Question | Analytical Methods | Key Variables
What are students’ AI Usage Habits? | Descriptive Statistics | AI Usage; AI Perceptions; AI Use Cases
Are there differences in student achievement based on AI usage? | ANCOVAs; Pearson R Correlations; Linear Regression | Achievement; AI Usage (1); AI Use Cases (2); Demographics
Are there differences in student perceptions on AI? | ANCOVAs; Pearson R Correlations | AI Perceptions; AI Usage (1); AI Use Cases (2); Demographics
What differences exist between student usage of AI? | ANCOVAs; Pearson R Correlations | AI Usage; AI Perceptions; AI Use Cases; Demographics

Participants

This study included 2,154 students enrolled in Michigan Virtual courses in the fall of 2023 who completed an End of Course survey in early 2024. 

Of the sample, approximately 50% of students were enrolled in school districts with low proportions of free and reduced lunch students, roughly 25% in districts with moderate or high proportions, and about 1% in districts with very high proportions of students in free and reduced lunch programs.

Most students lived in suburban areas (50%) or rural town areas (35%), while around 10% resided in urban city districts.

Figure 1 below shows the distribution of students across their locale types and the approximate percentage of impoverished students in their district.

Figure 1. Student Distribution by Locale and Low SES Percentage

Over half of the students in the sample were in 11th (28%) or 12th (44.5%) grade, while 9th (7%) and 10th (17%) graders composed about a quarter of the sample. Middle school grades collectively comprised about 4% of students, and only one elementary school student was in the sample. The vast majority of students were thus in high school, weighted toward the upper grade levels.

Outside of classes in the Other category, most of the students in the sample were enrolled in World Languages courses (22.5%), with Social Studies (14.7%), Science (12.4%), and Math (8.0%) together comprising over a third of student enrollments. A detailed breakdown can be seen below in Figure 2.

Figure 2. Subject Area Distribution 

Furthermore, the cross-tabulation in Figure 3 shows the distribution of students’ grade levels across subjects.

Figure 3. Student Subject Area Distribution by Grade Level

Across grade levels, World Languages courses had the largest representation, with 12th and 11th graders also displaying comparable enrollments in Social Studies, Science, and Other courses.

Results

The results of the several analyses are discussed individually by research question below and then synthesized in the next section. This section is limited to a presentation and interpretation of data.

Research Question 1: What are students’ AI Usage Habits?

This research question utilized descriptive statistics and cross-tabulations to analyze data from end-of-course survey items that addressed the following:

  • AI usage – Yes/No response
  • AI tools used – multiple-select response of AI tools
  • AI use cases – multiple-response choices on reasons why students use AI
  • AI perceptions – 5-point Likert scale response on favorability of AI tools

These items were cross tabulated with demographic data to uncover any meaningful trends.

Results indicated that only around 8% (n = 166) of the 2,154 students reported using Artificial Intelligence tools for their courses, and ChatGPT was the most frequently used tool (77%).

Use cases for the AI tools amongst the students varied much more. As seen below in Table 2, the most frequent use of AI tools was, “To explain complicated concepts or principles in simpler terms,” followed closely by, “To conduct research or find information.” 

The original 7 categories were then binned into the classifications of “facilitator” and “tool” for easier understanding and grouping.

Table 2. AI Use Case Bins

Use Case | N | Category | Percent of Total (%)
To explain complicated concepts or principles in simpler terms | 92 | Facilitator | 24.9%
To conduct research/find information | 77 | Tool | 20.9%
To summarize information | 65 | Tool | 17.6%
To create study guides or sample test questions | 46 | Facilitator | 12.5%
As a writing and editing assistant | 38 | Tool | 10.3%
As tutor / teacher | 35 | Facilitator | 9.5%
Other | 16 | — | 4.3%

Approximately 47% of students used AI tools as a “facilitator” and 48% used AI as a “tool.” Importantly, no students reported using AI as a facilitator only; facilitator use always occurred in conjunction with “tool” usage.

This binning was used to categorize students in subsequent analyses in order to better compare AI use cases across several dependent variables.

Research Question 2: Are there differences in student achievement based on AI usage?

This research question utilized LMS data and end-of-course survey data in ANCOVA models that addressed the question in two ways: one model examined actual usage of AI tools, and a second examined use cases among those who did utilize AI.

Achievement at Michigan Virtual has traditionally been influenced by a student’s course load and IEP status; therefore, these were used as covariates in the model. During the analyses, it was found that students from districts with disproportionately higher rates of free and reduced lunch eligibility earned lower grades. This led to the inclusion of district SES as a covariate, with all analyses re-run to extract that variance. Its relationship with the dependent variables is discussed further in the findings sections of this study.

The average grade percentage of all students in the dataset was 82.23, which was comparable to the population average established in Michigan Virtual’s yearly effectiveness report (Freidhoff, DeBruler, Cuccolo, & Green, 2024).

Model 1 – General Model

The general ANCOVA model (R2 = .470) included the scale-measure dependent variable of student achievement, covarying on student IEP status, number of courses, and product line. Independent variables in the model included AI Usage (yes/no), AI Perceptions, Subject, Grade Level, SES, and Locale (population density).
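A minimal sketch of an ANCOVA with this structure, using statsmodels, is shown below. The DataFrame df and its column names are assumptions standing in for the study’s variables; this mirrors the model’s structure, not its actual implementation.

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: a matched student-level DataFrame (e.g., `matched` from the earlier sketch).
# C() marks categorical terms; continuous covariates enter as-is.
model = smf.ols(
    "grade ~ C(iep) + course_load + C(product_line)"     # covariates
    " + C(ai_usage) + C(ai_perception) + C(subject)"     # independent variables
    " + C(grade_level) + C(ses) + C(locale)",
    data=df,
).fit()

# Type II sums of squares give an F test for each term's main effect.
print(sm.stats.anova_lm(model, typ=2))
print(f"R-squared: {model.rsquared:.3f}")
```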

The model indicated a main effect of SES (p < .05, F = 2.939), with no other main effects or interaction effects. As discussed above, students enrolled in districts with higher percentages of free and reduced lunch students tended to achieve lower grades than their counterparts. This can be seen below in Table 3.

Table 3. Average Grades by Low SES Percentage

Low SES Percentage | Mean Grade | Std. Error
Low | 82.494 | 1.484
Moderate | 80.986 | 1.792
High | 80.429 | 1.703
Very High | 58.606 | 5.701

Students in districts with low percentages of free and reduced lunch students scored highest, and scores decreased as those percentages increased; students in Very High districts scored much lower than all other groups.

This led to the revision of the model to include SES as a covariate, as discussed above (R2 = .523). The revised ANCOVA produced no significant main effects or interactions, meaning that once SES was accounted for, no variable in the model had a significant impact on students’ grades.

Students who reported using AI (82.153) and students who reported not using AI (82.373) had almost identical grades. In addition, in correlation analyses and in regression models that included AI use as a predictor alongside Michigan Virtual’s previously curated predictor variables, AI use had no significant relationship with achievement and was not a significant predictor of it.

While no single variable had an impact on student grades, within-group observations indicated significant variation in grades (p < .05) within the variable of AI Perceptions. For all students, regardless of whether they reported using AI, those who held less extreme viewpoints on AI (2, 3, or 4 on the five-point scale) scored higher than those who held highly unfavorable or highly favorable viewpoints (1 or 5). This can be seen in Table 4 and Figure 4 below.

Table 4. AI Perceptions and Average Grades

AI Perception | Mean Grade | Std. Error
Very Unfavorable | 83.289 | 3.065
Unfavorable | 86.328 | 2.847
Neutral | 84.516 | 1.881
Favorable | 81.536 | 2.220
Very Favorable | 76.107 | 2.607

This suggested a possible nonlinear relationship between the variables. Curve estimation results indicated a possible quadratic relationship (parabolic shape); while the relationship was significant (p < .05), the correlation was weak (.117).
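A curve estimation of this kind can be sketched by adding a squared term to the regression; as before, df and the column names are hypothetical.

```python
import statsmodels.formula.api as smf

# ai_perception is the 1-5 Likert score; grade is the final course percentage.
linear = smf.ols("grade ~ ai_perception", data=df).fit()
quadratic = smf.ols("grade ~ ai_perception + I(ai_perception ** 2)", data=df).fit()

# A significant negative coefficient on the squared term is consistent with
# the inverted-U (parabolic) pattern, where mid-scale perceptions correspond
# to the highest grades.
print(quadratic.params)
print(f"linear R2 = {linear.rsquared:.3f}, quadratic R2 = {quadratic.rsquared:.3f}")
```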

Figure 4. Average Grades by AI Perception

This type of relationship and curve was also present when AI Usage was included: students who used AI and students who did not achieved similar grades across the perception categories, indicating that the relationship between students’ perceptions of AI and achievement is not driven by whether a student uses AI. This can be seen below in Figure 5.

Figure 5. Average Grades by AI Perceptions and Usage

Model 2 – AI users only

This ANCOVA model (R2 = .907) was restricted to AI users. It used the same dependent variable of achievement and covariates of SES, product line, IEP status, and course load; the only difference was that the major independent variable was the AI Use Case variable, which divided AI users into three groups: those who used AI as a tool, those who used AI as a facilitator, and those who used both.

The ANCOVA results revealed a significant main effect of usage type on student achievement (p = .012, F = 7.110), with no other main effects. In other words, how students utilized AI was, by itself, associated with differences in their grades.

Students reporting using AI only as a tool had average grades of 76.5 (std. deviation 20.72), while students who reported using AI as both a tool and facilitator had average grades of 83.9 (std. deviation 16.23). In addition, a Pearson R correlation measuring the strength of the relationship between these two variables indicated a small but significant direct correlation (r = .291, p < .05).
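Because this correlation pairs a binary group indicator with a continuous grade, it is a point-biserial correlation, which is computed identically to Pearson’s r. A sketch with assumed column names:

```python
from scipy import stats

# Restrict to AI users; code usage type as 0 = tool only, 1 = tool and facilitator.
users = df[df["ai_usage"] == "Yes"].copy()
users["both_flag"] = (users["use_case_bin"] == "both").astype(int)

r, p = stats.pearsonr(users["both_flag"], users["grade"])
print(f"r = {r:.3f}, p = {p:.4f}")  # the study reports r = .291, p < .05
```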

There was also a two-way interaction between the AI Use Case variable and subject area. Students in math, science, foreign language, and “other” courses scored much higher than their counterparts when utilizing AI in both manners, while the inverse was true for ELA and Health & PE courses; notably, there were no AI users in Visual and Performing Arts. This can be seen below in Figure 6.

Figure 6. Average Grades by Subject and AI Use Case

According to these analyses, using AI as both a facilitator and a tool was associated with significantly higher scores in Math and Science (STEM subjects), while this was not the case for the other subjects.

Final Comparisons

After examining grades of AI users and non-AI users, and then grades within the AI users, a final comparison was conducted between the higher-achieving group of AI users (those who utilized AI as both a facilitator and a tool rather than just as a tool) and non-AI users.

Results indicated that students who utilized AI as both a tool and a facilitator, on average, outperformed non-AI users. The main effect (p < .05, F = 3.94) of AI use as both a facilitator and tool corresponded to an average grade percentage of 83.9, compared to 82.4 for non-users. This main effect was present across all subject areas and demographic groups.

Research Question 3: Are there differences in student perceptions on AI?

This research question utilized LMS data and end-of-course survey data in ANCOVA models. The independent variables in these models were subject, grade level, and locale, while the covariates were a student’s IEP status, SES status, course load, current grade, and product line. The covariates were chosen for the amount of variance they explained in model iterations and were controlled for accordingly, as the research focused on the variance explained by the other variables.

The dependent variable in this research question was a single item on the end-of-course student survey that asked about students’ perceptions regarding their favorability of AI tools on a 5-point Likert scale. A distribution of the dependent variable is shown below in Figure 7.

Figure 7. AI Perceptions Distribution

The large majority of students were neutral in their perceptions of AI, with a slight skew to the positive side.

Model 1 – General Model

This ANCOVA model incorporated the aforementioned covariates and five independent variables: grade level, subject area, locale, SES status, and AI usage (R2 = .797).

The variables of AI Usage (p < .001, F = 38.92) and Subject Area (p < .05, F = 1.61) had main effects on the dependent variable of AI perception.

Students who utilized AI had much more favorable perceptions of AI tools, reporting an average of around 4 (slightly favorable), while non-users reported an average of 2.8, slightly below the neutral response. This is a large and significant difference.

Additionally, Pearson R values indicated a significant but moderate direct relationship between the variables (p < .001, r = .333).

Students in English Language Arts, Visual and Performing Arts, and Health and Physical Education courses rated AI tools lower than students in Math, Science, World Languages, Social Studies, and Other courses. These differences can be seen below in Table 5.

Table 5. Average AI Perceptions by Subject

Subject | Mean | Std. Error
English Language Arts | 2.925 | .144
Health & Physical Education | 2.665 | .212
Mathematics | 3.253 | .124
Other | 3.244 | .106
Science | 3.458 | .105
Social Studies | 3.269 | .097
Visual & Performing Arts | 2.903 | .215
World Languages | 3.143 | .084

These two variables also had a two-way interaction (p = .001, F = 3.67), showing differences between users and non-users across multiple subjects. These differences can be seen below in Table 6, with significant differences indicated by an asterisk (*). Across almost every subject, AI users reported significantly (p < .05) more favorable opinions of AI than non-users.

Table 6. Average AI Perceptions by Subject and AI Usage

Subject | AI Usage | Average AI Perception | Std. Error
English Language Arts* | No | 2.570 | .159
English Language Arts* | Yes | 4.192 | .339
Health & Physical Education | No | 2.515 | .219
Health & Physical Education | Yes | 3.494 | .63
Mathematics* | No | 2.965 | .139
Mathematics* | Yes | 3.851 | .247
Other | No | 3.133 | .113
Other | Yes | 3.491 | .232
Science* | No | 3.055 | .110
Science* | Yes | 4.512 | .236
Social Studies* | No | 2.916 | .107
Social Studies* | Yes | 4.052 | .204
Visual & Performing Arts | No | 2.903 | .215
Visual & Performing Arts | Yes | None | —
World Languages* | No | 2.809 | .086
World Languages* | Yes | 4.161 | .212

Model 2 – AI users only

This model utilized the same covariates and independent variables as Model 1 for RQ3, except that the AI Usage (yes/no) variable was exchanged for the AI Use Case variable, which categorized users into three bins: those who utilized AI as a tool, as a facilitator, or both.

Results from the model (R2 = .781) indicated no main effects of any independent variable on perceptions, nor any interactions among independent variables with the dependent outcome. There was no significant difference in perceptions of AI with regard to specific use cases amongst AI users.

Research Question 4: What differences exist between student usage of AI?

This research question utilized LMS data and end-of-course survey data in a general ANCOVA model and an ANCOVA model with AI users only. The general model used the single item of AI Usage (yes/no response) as the dependent variable, with a subsequent model using the binned AI Use Case variable as the dependent variable.

Findings from research question 1 indicated that only about 8% of students reported utilizing AI.

Model 1 – ANCOVA – General Model

The general model (R2 = .479) utilized AI Usage as the dependent variable, four covariates (IEP, Product Line, Course Load, and Current Grade/Ability), and five independent variables: AI Perception, subject, grade level, locale, and SES.

There was a main effect of AI perception on the binary dependent variable of AI Usage (p < .001, F = 11.120). Students with more favorable opinions utilized AI more, as shown below in Figure 8. This is consistent with the findings from the previous research question; Pearson R values indicated a significant but moderate direct relationship between the variables (p < .001, r = .333).

Figure 8. AI Perception and Utilization

Students who held Very Favorable opinions about AI reported utilizing it almost 50% of the time on average, while students with Very Unfavorable opinions reported utilizing AI only 1% of the time on average; reported utilization was roughly fifty times higher in the top bin than in the bottom.

Model 2 – ANCOVA – AI users only

This model (R2 = .642) utilized the binned AI Use Case variable as the dependent variable, collapsing the original multiple-response items into a single variable with three categories: AI used as a tool, AI used as a facilitator, or both. All other independent variables and covariates from the general model were the same.

This model found no significant differences between groups, nor any interactions among independent variables, meaning there were no main effects of the independent variables on the ways in which students were utilizing AI tools.

Synthesis 

Finding 1 (Usage)

Student usage of AI tools was minimal: slightly less than 10% of students self-reported utilizing AI tools for their online learning courses. Though this figure seems low, measures were taken to inform all students that indicating their AI use would not penalize them in any way.

The most widely utilized tool by students at the time of this study was ChatGPT.

All students who used AI indicated at least one “tool” use case, while over two-thirds of AI users stated that they utilized AI both as a tool and as a facilitator of learning; no users utilized AI only as a facilitator.

The usage of AI tools alone had little to no relationship with student achievement. However, the ways in which students utilized AI did show differences in achievement by use-case type. Students who utilized AI as both a facilitator and a tool had better achievement outcomes than students who utilized AI as only a tool, though this pattern was not consistent across all subjects, and correlation data indicate a small but significant relationship (r = .291, p < .05) between usage type and grades.

Students in STEM subjects (including Computer Science from the “other” category of courses) who utilized AI in both manners outperformed their counterparts who used it only as a tool; this was not the case in all other subjects. In addition, students who utilized AI as both a facilitator and a tool outperformed non-AI users by a margin of 1.5 points across all subjects, on average.

Finding 2 (Perceptions) 

Students, on average, tended to hold neutral rather than extreme views regarding AI, and those with extreme views tended to have worse outcomes. There may be a nonlinear relationship between achievement and AI perceptions, but this study revealed only a slight, though significant, relationship of this kind.

The relationship between usage of and perceptions around AI is significant: students who use AI tools tend to have more favorable perceptions of them, and vice versa, a pattern supported by both hypothesis testing and correlational data. However, amongst AI users, how students utilized AI made no difference in how they perceived it.

Key Points

  • Using artificial intelligence alone does not have a strong relationship with student achievement.
    • Simply using AI alone doesn’t make students perform better or worse
    • Many students aren’t utilizing AI for school at all, or at least are not reporting such usage.
  • The way in which students use AI tools matters
    • Students who use AI in multifaceted, facilitative use cases in addition to lower-level use cases have, on average, significantly better grades than students who strictly utilize AI as a tool for lower-level use cases, though this varies by subject.
      • This is strongly present in STEM subjects, but not as clear for others.
      • Students in English courses who used AI “as a tool” only scored higher than those who used it for both, which may mean that “tool” usage is more effective for ELA assignments, such as writing.
    • Students who use AI as a tool and a facilitator outperform non-AI user counterparts on average.
  • AI perceptions differ across and within subject areas.
    • STEM subjects, social studies, and foreign languages had higher perceptions than other subjects. 
  • Students who have less extreme perceptions of AI tend to have better student outcomes. There is a very small but significant quadratic relationship between the variables that should be investigated further.
  • Students who utilize AI more have better perceptions regarding its usefulness.
    • This was mostly consistent across subject areas.
  • SES and locales as non-factors
    • While a student’s SES was shown to have a main effect on student achievement, its variance was appropriately accounted for by its inclusion as a covariate in subsequent models with achievement as the dependent variable. This is a common factor in achievement research, which has repeatedly found that low-SES students usually achieve lower than their higher-SES counterparts. Models that did not use achievement as the dependent variable included SES as an independent variable to see whether its effect was still present; results indicated that it was not.
    • A student’s locale was not a significant factor in any of the models, so no models needed to covary on its contribution to variance in the dependent variables.

Conclusions

This section will discuss the results of the study alongside existing literature, as well as provide recommendations for policymakers and future research on the topic of student AI usage.

Multiple tables at the end of this section will organize recommendations with resources for practitioners and policymakers, but a comprehensive list of resources curated by Michigan Virtual can be found here.

Implications for Practice 

This section discusses how the findings of this study fit into the larger context of education by comparing and contrasting results with the existing literature and examining where they fit in current practice.

Encourage multifaceted and facilitative use of AI tools: Educators should promote the use of AI technologies not just as a tool but also as a facilitator in the learning process. Students who utilized AI as both a facilitator and a tool had better achievement outcomes compared to those who used it solely as a tool. This approach is supported by Luckin et al. (2016), as it focuses on utilizing AI as an assistant that promotes critical thinking and creative problem solving rather than shortcutting students to answers. Additionally, the distribution of use cases found in this study is consistent with findings from other large-scale studies (Walton Family Foundation, 2024), compounding evidence that these use-case patterns are accurate.

Therefore, measures should be taken to provide guidance on how to effectively integrate AI as a facilitator of learning, at the very least including the categories in this study: explaining complicated concepts or principles in simpler terms, creating study guides or sample test questions, and using AI as a tutor/teacher.

Address extreme perceptions of AI: The study found that students with extreme views (either positive or negative) regarding AI tend to have worse outcomes. This is consistent with findings showing that student perceptions are varied (Martínez et al., 2023; Diliberti et al., 2024; Trisoni et al., 2023; Walton Family Foundation, 2024). However, while perceptions were varied, they tended to be normally distributed and mostly neutral, which differs from the positive-leaning perceptions among students found by the Walton Family Foundation (2024) study. According to this study, while opinions may be varied, education should aim to cultivate a balanced and informed understanding of AI among students to promote better outcomes.

Equity in AI Usage: This study did not indicate any disparity in AI usage for students of lower socioeconomic status or by locale; there were no interactions between SES and other independent variables related to AI usage or perceptions. This is consistent with findings regarding locale in the Walton Family Foundation (2024) study. This is evidence that students in online learning environments, regardless of socioeconomic status, likely have access to some form of AI tools, whether or not they choose to utilize them.

Subject-specific utilization and opinions of AI: As with adults (McGehee, 2023), this study indicates that students who use AI more tend to have better opinions of it, and vice versa. Perceptions vary across subject areas, with students in STEM subjects tending to have more favorable perceptions of AI and higher scores among those who utilized AI. This is consistent with the idea presented by Zawacki-Richter et al. (2019) that AI may be useful in promoting the development of “hard” skills like critical thinking and problem solving that are central to STEM.

Age, grade level, and environment: This study’s sample consisted of students in asynchronous online learning environments, primarily in high school grades. The results therefore cannot be generalized to dissimilar populations; students in different environments and grade levels may report different results.

However, at the time of this study, this sample is one of the largest recorded dealing with student AI usage, and many of the results are similar to other studies, with the exception that this sample reported lower student AI usage overall compared to the Walton Family Foundation (2024) study.

Future Research: More research is needed in both of the focus areas of this study. While the dataset was large, the data are surface level and do not answer many of the contextual questions the findings raised. The sample is also limited to a population of secondary school students in asynchronous, online learning environments.

Future studies would benefit from better instrumentation and data collection protocols to expand on the findings of this study to understand how students utilize AI, what they think about AI, and what relationships those things have with outcomes that concern stakeholders. Below are specific examples:

  • Conduct further research on usage habits, specifically amongst different types of learning environments.
  • Investigate specific use cases of AI in specific subjects with controlled variables and environments.
  • Qualitative research regarding student AI perceptions and use cases.
  • Utilization of AI and high stakes testing.
  • Effectiveness of AI integration into coursework from both teacher and student perspectives.
  • Prevalence of cheating using AI.
  • Investigate changes in usage rates and usage patterns over time.
    • AI usage is largely in its infancy at the time of this study, and these findings are likely to change as it becomes more prevalent.

Recommendations

For Policymakers:

  1. Develop guidelines and provide training for effective AI integration: Since the way students use AI tools matters, policymakers should develop guidelines and provide training for teachers on how to integrate AI tools into the curriculum effectively. These guidelines should emphasize the importance of using AI in multifaceted and facilitative ways, rather than just as a tool for lower-level tasks.
  2. Encourage subject-specific approaches to AI integration: The findings suggest that the impact of AI usage may differ across subject areas, with STEM subjects benefiting from facilitative AI usage, while ELA and PE courses might require different approaches. Policymakers should encourage subject-specific strategies for AI integration and provide resources accordingly.
  3. Promote AI literacy and balanced perceptions: Students with less extreme perceptions of AI tend to have better outcomes. Policymakers should develop initiatives to promote AI literacy and foster balanced perceptions of AI among students. This could involve incorporating AI education into curricula or organizing awareness campaigns.
  4. Support research on AI usage and perceptions: Further research is needed to understand the quadratic relationship between AI perceptions and student outcomes. Policymakers should allocate funding and resources for continued research in this area to inform evidence-based practices.
  5. Ensure equitable access to AI tools: While SES and locale were not significant factors affecting AI usage or perceptions, policymakers should still ensure that all students, regardless of their socioeconomic status or location, have equitable access to AI tools and resources. This study is a single source of evidence to support the idea that AI access may not be limited to the wealthy, but it is still a valid concern, as socioeconomic status has historically been a risk factor.
Policy Recommendation | Resources
Develop guidelines and provide training for effective AI tool integration | Michigan Virtual’s AI Integration Framework; Demystifying AI by All4Ed; Michigan Virtual’s K-12 AI Guidelines for Districts; YouTube Video: District Planning; Michigan Virtual Workshops
Encourage subject-specific approaches to AI integration | Michigan Virtual Workshops
Promote AI literacy and balanced perceptions | Michigan Virtual’s AI Integration Framework; Demystifying AI by All4Ed
Support research on AI usage and perceptions | Partner with local universities and research organizations, like Michigan Virtual!
Ensure equitable access to AI tools | Michigan Virtual AI Planning Guide

For Teachers:

  1. Integrate AI tools in multifaceted and facilitative ways: Teachers should strive to incorporate AI tools not just as tools for lower-level tasks but also as facilitators for learning. This could involve using AI for tasks like personalized feedback, interactive simulations, or collaborative projects. Guidance for how to avoid student reliance on AI and encourage facilitative use can be found here.
  2. Adapt AI integration strategies based on subject area: Teachers should be aware that the impact of AI usage may differ across subject areas. They should adapt their strategies for AI integration based on the specific needs and requirements of their subject area.
    1. ELA teachers should especially adapt assignments that encourage higher-order skills so that simple AI “tool-use” will not result in plagiarism, dishonesty, or a lack of quality learning (or lower achievement). 
    2. STEM teachers should promote the utilization of AI as a facilitator and differentiator to support their students’ learning.
  3. Address student perceptions and promote balanced views: Teachers should actively address student perceptions and misconceptions about AI. They should aim to foster balanced and informed views by providing accurate information, addressing concerns, and highlighting both the potential benefits and limitations of AI.
  4. Collaborate with colleagues and seek professional development: Teachers should collaborate with colleagues, particularly those teaching different subject areas, to share best practices and learn from each other’s experiences with AI integration. They should also seek professional development opportunities to enhance their skills in effectively using AI tools in the classroom.
  5. Provide guidance and support for AI tool usage: Since many students are not utilizing AI tools or reporting usage, teachers should provide guidance and support to encourage appropriate and effective AI tool usage among students.
Teacher Recommendation | Resources
Integrate AI tools in multifaceted and facilitative ways | Michigan Virtual Student AI Use Cases; YouTube Series: Practical AI for Instructors and Students; YouTube Video: Using AI for Instructional Design; Teaching with AI (Open AI)
Adapt AI integration strategies based on subject area | Michigan Virtual Student AI Use Cases; AI in the Classroom
Address student perceptions and promote balanced views | Next Level Labs Report: Demystify AI; NSF AI Literacy Article
Collaborate with colleagues and seek professional development | Michigan Virtual Student AI Use Cases; YouTube Series: Practical AI for Instructors and Students; YouTube Video: Using AI for Instructional Design; Teaching with AI (Open AI)
Provide guidance and support for AI tool usage | YouTube Series: Practical AI for Instructors and Students; Next Level Labs Report; YouTube Video: ChatGPT Prompt Writing; AI 101 for Teachers; Michigan Virtual Courses

References 

Beck, J., Stern, M., & Haugsjaa, E. (1996). Applications of AI in Education. XRDS: Crossroads, The ACM Magazine for Students, 3(1), 11-15.

Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (pp. 610-623).

Chen, Y., Jensen, S., Albert, L. J., Gupta, S., & Lee, T. (2023). Artificial intelligence (AI) student assistants in the classroom: Designing chatbots to support student success. Information Systems Frontiers, 25(1), 161-182.

Diliberti, M. K., Schwartz, H. L., Doan, S., Shapiro, A., Rainey, L. R., & Lake, R. J. (2024). Using artificial intelligence tools in K–12 classrooms. RAND Corporation. https://www.rand.org/pubs/research_reports/RRA956-21.html

Driessen, G., & Gallant, T. B. (2022). Academic integrity in an age of AI-generated content. Communications of the ACM, 65(7), 44-49.

Freidhoff, J. R., DeBruler, K., Cuccolo, K., & Green, C. (2024). Michigan’s K-12 virtual learning effectiveness report 2022-23. Michigan Virtual. https://michiganvirtual.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2022-23/

Kulik, J. A., & Fletcher, J. D. (2016). Effectiveness of intelligent tutoring systems: a meta-analytic review. Review of Educational Research, 86(1), 42-78.

Kumar, V. R., & Raman, R. (2022). Student Perceptions on Artificial Intelligence (AI) in higher education. In 2022 IEEE Integrated STEM Education Conference (ISEC) (pp. 450-454). IEEE.

Luckin, R., Holmes, W., Griffiths, M., & Forcier, L. B. (2016). Intelligence Unleashed: An argument for AI in Education. https://www.pearson.com/content/dam/corporate/global/pearson-dot-com/files/innovation/Intelligence-Unleashed-Publication.pdf

Martínez, I. G., Batanero, J. M. F., Cerero, J. F., & León, S. P. (2023). Analysing the impact of artificial intelligence and computational sciences on student performance: Systematic review and meta-analysis. NAER: Journal of New Approaches in Educational Research, 12(1), 171-197.

Mehrabi, N., Morstatter, F., Saxena, N., Lerman, K., & Galstyan, A. (2021). A survey on bias and fairness in machine learning. ACM Computing Surveys (CSUR), 54(6), 1-35.

Selwyn, N. (2019). Should robots replace teachers? AI and the future of education. John Wiley & Sons.

Trisoni, R., Ardiani, I., Herawati, S., Mudinillah, A., Maimori, R., Khairat, A., … & Nazliati, N. (2023, November). The Effect of Artificial Intelligence in Improving Student Achievement in High Schools. In International Conference on Social Science and Education (ICoeSSE 2023) (pp. 546-557). Atlantis Press.

Walton Family Foundation. (2024). AI Chatbots in Schools. https://www.waltonfamilyfoundation.org/learning/the-value-of-ai-in-todays-classrooms

Winkler, R., & Söllner, M. (2018). Unleashing the Potential of Chatbots in Education: A State-Of-The-Art Analysis. In Academy of Management Annual Meeting (AOM).

Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education – where are the educators?. International Journal of Educational Technology in Higher Education, 16(1), 1-27.

Zheng, L., Niu, J., Zhong, L., & Gyasi, J. F. (2023). The effectiveness of artificial intelligence on learning achievement and learning perception: A meta-analysis. Interactive Learning Environments, 31(9), 5650-5664.

]]>
AI in Education: Exploring Trust, Challenges, and the Push for Implementation https://michiganvirtual.org/research/publications/ai-in-education-exploring-trust-challenges-and-the-push-for-implementation/ Wed, 19 Jun 2024 20:19:41 +0000 https://michiganvirtual.site.strattic.io/?post_type=publication&p=87684

This 2024 Michigan Virtual study of over 1,000 educators reveals varying attitudes toward AI in K-12 education. While administrators showed higher trust and prioritization of AI (trust levels ~58/100), teachers demonstrated lower trust (43.7/100). About 50% of educators already use AI professionally, despite only 30% of districts having AI policies. Main concerns included student misuse, privacy, and ethics, while opportunities were seen in content creation and personalized learning. With 80% of educators expecting AI to significantly impact education within 5 years, the study emphasizes the need for professional development and clear implementation guidelines.

]]>

Executive Summary

In March 2024, Michigan Virtual’s AI Lab convened the first AI Statewide Workgroup of education associations in Michigan to identify and coordinate important AI-related trends, challenges, and opportunities facing Michigan school districts.  The workgroup includes top leaders from 14 organizations including the Michigan Education Association, Michigan Association of School Boards, Michigan Association of Superintendents & Administrators, Michigan Association of Secondary School Principals, Michigan Elementary & Middle School Principals, Michigan Association for Computer Users in Learning, Michigan Department of Education, and more.  Following the recommendations from Michigan Virtual’s Planning Guide for AI: A Framework for School Districts, the AI Statewide Workgroup released the Sample Guidance on Staff Use of Generative AI for K-12 School Districts, and coordinated the distribution of a survey to educators across the state to better understand their needs at this moment in time regarding artificial intelligence.

This survey looked at how educators in Michigan are using AI and what they think about implementing AI in schools. Over 1,000 educators in various roles, from classrooms to districts to support organizations, were surveyed. The survey results make clear that additional work is needed to support awareness, research, training, and the honoring of concerns in the field. The results also revealed that many educators are ready to engage in this challenge, given the right supports.

Introduction

Artificial Intelligence (AI) has rapidly ascended as one of the most disruptive and transformative technologies in modern times. In the realm of education, AI holds the unprecedented potential to revolutionize the educational experience for educators and students. The integration of AI into educational systems has the power to create efficiencies, personalize learning experiences, and transform teaching methodologies.

Despite its promising potential, the successful implementation of AI-driven tools and practices in education is not without its challenges. It requires meticulous planning and strategic alignment with a school district’s educational goals, values, and priorities. The dynamic nature of AI technology and its rapid evolution in educational applications necessitate careful and ongoing research to ensure its effective and meaningful integration. Recognizing this, The AI Lab at Michigan Virtual partnered with over a dozen prominent education groups across the state of Michigan on an initiative to enhance our understanding, promote ethical AI use, and identify professional development needs in schools.

By exploring practical insights and strategies, this research aims to equip educators with the tools necessary to navigate the complexities of integrating AI in their districts. As AI technology continues to mature, the educational landscape will undoubtedly undergo significant changes, making it imperative for educators at all levels to stay informed and prepared for this inevitable transformation.

What Exactly is AI?

We use the term artificial intelligence (AI) throughout, but what exactly do we mean when we say “AI”? According to Michigan Virtual’s Planning Guide for AI: A Framework for School Districts, “artificial intelligence refers to computer systems and programs that possess the ability to perform tasks that typically require human intelligence. These systems are designed to simulate intelligent behavior, such as understanding natural language, recognizing patterns, making decisions, and learning from experience.” 

Additionally, when we use the term “AI” we refer to both of the following:

  • Artificial Intelligence (AI): A branch of computer science that involves the development of intelligent systems capable of performing tasks that typically require human intelligence. AI enables machines to learn from experience, adapt to new data, and make decisions based on patterns and algorithms.
  • Generative AI: This type of AI encompasses algorithms and models designed to produce new content—be it text, images, or video—by learning from vast amounts of existing data.

Current Research Study

This research study was developed based on conversations with, and input from, the AI Statewide Workgroup and guided by the following research questions:

  1. Are educators using AI in their professional roles, and if so, how? 
  2. Do educators trust AI systems? What are their primary concerns about AI? 
  3. How do educators envision the future of AI? 

To answer these research questions, a survey was developed to assess educators’ experiences, challenges, and perceptions of AI in Michigan schools. The survey included demographic questions regarding educators’ roles, ages/levels served, and educational environment; questions assessing trust in AI and AI use; and questions about AI implementation in schools, including potential uses and concerns. The survey consisted of a set of core questions that every educator received as well as specific questions based on educators’ identified roles. Role-specific questions were for the following groups: teacher, building principal/assistant principal (labeled throughout as building administrators), superintendent/assistant superintendent, school board member, curriculum director, and technology director (collectively labeled throughout as district administrators). The questions for all three groups were similar but worded to apply specifically to each role, i.e., “Are you using AI in your classroom/building/school?”

The survey was shared with, and distributed by, the AI Statewide Workgroup to their membership, generating 1,055 unique responses in approximately 2 weeks. Please note that tables may not add to 1,055 as respondents were not required to answer questions. Additionally, some tables may add to more than 1,055 as respondents could select more than one option. Survey responses were collected in Qualtrics and analyzed in Excel. ChatGPT was used to assist in analyzing and summarizing open-ended question responses. 

Findings

Educator Demographics 

Table 1 below details the participating professional education association memberships. Eight professional organizations each had over 100 participant responses within the survey, representing a diverse range of educator perspectives. 

Table 1. Count of Professional Education Association Membership

Professional Education Association | Count
MEA (Michigan Education Association) | 404
MACUL (Michigan Association for Computer Users in Learning) | 235
REMC Association of Michigan (Regional Educational Media Center Association of Michigan) | 217
MASSP (Michigan Association of Secondary School Principals) | 180
MASB (Michigan Association of School Boards) | 139
MANS (Michigan Association of Non-public Schools) | 132
MASA (Michigan Association of Superintendents and Administrators) | 123
MSBO (Michigan School Business Officials) | 123
MEMPSA (Michigan Elementary & Middle School Principals Association) | 68
MAISA (Michigan Association of Intermediate School Administrators) | 55
Other | 38
MASL (Michigan Association of School Librarians) | 27
MI-ASCD (Michigan Association for Supervision and Curriculum Development) | 27
MAPSA (Michigan Association of Public School Academies) | 25
AFT (American Federation of Teachers) | 21

As part of the demographic questions, educators were asked their primary role. Table 2 below provides details of their responses. Teachers represented the largest group of educators, followed by building principals/assistant principals, collectively providing diverse insight into AI perceptions and use in school buildings and classrooms. Additionally, there were nearly 200 respondents in district-level leadership roles, again providing diverse perspectives on district-level AI perceptions, needs, and concerns. 

Table 2. Count of Primary Educational Roles

Primary Educational Role | Count
District Level |
    School Board Member | 62
    Superintendent / Asst. Superintendent | 39
    School Business Official / CFO | 35
    Technology Director | 32
    Curriculum Director | 13
    Human Resources | 3
Building Level |
    Teacher | 362
    Building Principal / Asst. Principal | 139
    Coach / Consultant | 60
    Special Education Provider | 29
    Library / Media Specialist | 25
    School Support Staff | 15
    Educational Support Professional | 7
K-12 Peripheral Supports |
    College Faculty | 16
    ISD Administrator | 10
    Higher Ed Support Professional | 2
    Other (Please specify) | 57

Educators were also asked to identify their primary educational environment. As detailed in Figure 1 below, well over half of educators were from public districts, with a considerable number from non-public schools/districts and ISDs/RESAs. 

Figure 1. Primary Educational Environment

If educators indicated they were teachers or building principals/assistant principals, they were asked to identify the grade levels they currently taught or the building levels they currently served. Figure 2 below details these responses. Elementary, middle, and high school teachers were well represented, with smaller numbers of respondents in adult education, post-secondary/college, preschool, and career and technical education.

Figure 2. Grade Levels Taught and Building Level Served

Additionally, educators were asked, where would you turn (or have you turned) for information on artificial intelligence (AI)? Figure 3 below provides a summary of their responses. Among the most popular sources educators turned to for information on AI were colleagues or friends and conference presentations/workshops, with over 16% of educators reportedly using either or both. This finding aligns with previous research in which educators reported a preference for informal peer mentoring over professional development options such as conferences, webinars, and online courses. Far fewer educators reported using sources like social media, popular media, or students for information on AI.

Figure 3. Sources of Information on AI

Are Educators Using AI in Their Professional Roles and if So, How? 

Educators were asked if they were using AI in their classroom (teachers), or if teachers were using AI in their school (building administrators and district administrators). 

The responses, detailed in Figure 4 below, indicate that building administrators report that teachers in their schools are using AI in some capacity in their classrooms on a much larger scale than reported by teachers or district administrators. Nearly 70% of building administrators indicated that teachers in their buildings were using AI in their classrooms, while less than 30% of teachers reported such use. The reason for this discrepancy is not known. It could be that these figures accurately reflect the context in which these two groups work or perhaps that building administrators are overestimating AI use in their schools. Another possibility is that teachers are underestimating or not identifying AI use as a distinct activity. 

Figure 4. AI Use by Role Type

For district administrators who reported that AI was not being used in their schools, those indicating that they were exploring future use of AI were nearly double those who had no future plans to use AI (45.1% compared to 21.2%). Given this data, building and district administrators appear more open or resigned to the inevitability of AI use in their schools. 

This trend was not the same for teachers. Of those not using AI in their classrooms, only 31.8% reported that they were exploring future use, while 43% indicated they had no plans for using AI in their classrooms. More so than the other two groups, teachers seemed fairly evenly split between those who were either using AI or planned to and those who had no plans to use AI. 

Teachers who indicated that they were using AI were asked to explain how they were using AI in their classrooms. These responses are summarized in Table 3 below. Responses were grouped into categories and examples of each, from responding teachers, are provided for context. 

Table 3 highlights various applications of AI in the classroom as reported by teachers, reflecting its diverse role in supporting educational practices. Teachers reported using AI for lesson planning, interactive learning, resource enhancement, and professional development, illustrating its broad utility. Teachers also reported using AI tools to assist in assessment and feedback, proofreading, and creative writing, and to provide targeted support to both teachers and students. These uses suggest that AI can be a multifaceted and valuable tool in education, capable of addressing various needs and improving efficiency, though successful integration requires trust, training, and attention to ethical implications.

Table 3. Teachers’ Use of AI in the Classroom

Category | Example 1 | Example 2 | Example 3
Lesson Planning & Curriculum Development | Creating standards-based writing prompts, rubrics, and clear instructions. | Developing differentiated lessons to provide tiers of support to students. | Creating decodable passages and sentences for phonics skills.
Interactive and Engaging Learning | Introducing AI tools to students and teaching prompt engineering. | Demonstrating the pros and cons of AI compared to traditional search tools. | Integrating AI tools into lessons for interactive tutoring and conversation.
Enhancing Classroom Resources | Generating content for differentiated instruction. | Translating texts and changing reading levels to meet student needs. | Developing lesson materials and extension activities.
Professional Development and Modeling | Modeling AI use in lesson planning and communication for students. | Teaching machine learning and generative AI units in computer science classes. | Educating students on the ethical use of AI and how to edit AI-generated content.
Assessment & Feedback | Providing targeted, rubric-driven feedback on student writing with tools like Class Companion. | Using AI to respond to students’ work submissions and provide feedback. | Creating practice questions and assessment prompts.
Proofreading and Writing Support | Teaching students how to use AI for thesis generation, research assistance, and example summaries. | Encouraging the use of Grammarly and grammar practice software like Quill. | Proofreading student writing.
Creative Writing and Content Generation | AI-assisted writing structure, support, and feedback on fluency. | AI “interviews” with characters/authors to generate ideas for writing. | Creating visual prompts for oral exams and practice exams.
Communication and Administrative Tasks | Generating emails, letters of recommendation, announcements, and newsletters. | Using AI to change wording to professional language in communications with parents. | Creating content and language objectives for lessons.
Specialized Uses and Niche Applications | Creating individualized learning pathways and analyzing student progress. | Integrating AI in specific subjects like French language learning by updating old texts to modern language. | Using AI for specific tasks like creating letters of recommendation and Quizizz assignments.

To understand how educators are—or are not—using AI, it’s important to understand what barriers exist concerning AI implementation. Educators were asked, what barriers, if any, are keeping you from using artificial intelligence (AI) in your professional role, or using AI with your students? 

Table 4 below provides a summary of the types of barriers identified by educators alongside examples of each in practice. Among the most prevalent barriers identified are logistical barriers such as lack of training and time constraints, institutional barriers, and ethical barriers such as ethical and privacy concerns, negative perceptions, and lack of trust in AI. In many ways, logistical barriers are easier to overcome as they have clear solutions. If educators feel that they have an overall lack of training on AI, the solution is to provide more training. Ethical barriers, on the other hand, have less clear solutions and take long-term, targeted interventions. Given this, simply addressing logistical barriers may not produce the lasting, institutional change buildings or district leaders desire. 

Table 4. Barriers to AI Use in Classrooms

Category | Example 1 | Example 2 | Example 3
Lack of Training | Need more training before using it in the classroom. | Lack of understanding of how to integrate AI into teaching. | Professional development required to understand AI’s uses and pitfalls.
Ethical and Privacy Concerns | Concerns about data privacy (FERPA, IEP/504s, HIPAA). | Worry about inaccurate or biased information. | Ethical concerns about the databases AI is based on.
Institutional Barriers | District blocks many AI programs and tools. | Approval is required from the district to use AI. | Lack of district policy or guidelines for AI use.
Negative Perceptions and Stigma | Belief that AI use is cheating or lazy. | Concerns about AI reducing students’ critical thinking and creativity. | Stigma and fear around AI from colleagues and students.
Time Constraints | Lack of time to learn and apply AI tools. | Time needed to revamp assignments to incorporate AI. | Other priorities taking precedence over learning about AI.
Lack of Trust in AI | Distrust in AI’s reliability and accuracy. | Belief that AI is not a trustworthy or honest tool. | Concerns that AI will not replace human teaching effectively.
Resistance to Change | Preference for traditional teaching methods. | Belief that students should develop skills without AI assistance. | View that AI is against personal values and beliefs.

Do Educators Trust AI Systems? What Are Their Primary Concerns About AI?

All educators who participated in the survey were asked to rate their level of trust in AI from 0 (no trust) to 100 (complete trust), with 50 indicating a moderate level of trust. The mean score for all educators was 49.4, indicating a moderate level of trust in AI overall.

Figure 5 below details the level of trust by role relative to the average level of trust reported by all educators. Curriculum directors (61.1), coaches/consultants (59.7), school business officials/CFOs (58.3), and superintendents/assistant superintendents (57.2) on average reported the highest levels of trust in AI. Conversely, educational support professionals (35.5), school support staff (40.3), and library/media specialists (43.0) had the lowest trust. It is important to note, however, that these roles accounted for only 47 out of 1,055 responses. Teachers had the fifth-lowest trust in AI, with a mean of 43.7.

Figure 5. Level of Trust in AI by Role

We also looked at the level of trust by experience with AI. Figure 6 below details the results. Unsurprisingly, educators who have not used AI and do not plan to had the lowest level of trust in AI. Conversely, those who have used AI in their professional role had the highest levels of trust. The direction of this relationship is not clear; however, a sizable number of educators have very low trust in AI and have not used it in any capacity, nor do they plan to. Additionally, a fairly large number of educators are unsure whether they have used AI, suggesting that any AI integration plan should begin with training focused on awareness and understanding of AI systems.

Figure 6. Level of Trust in AI by AI Experience

Finally, we looked at the level of trust in AI alongside the level of priority educators placed on AI integration, shown in Figure 7. Educators were asked to indicate, on the same 0 to 100 scale, what level of priority they place on artificial intelligence (AI) integration in education today. Teachers indicated the lowest level of priority for AI integration, well below both building and district administrators, who each had mean scores above 50. The scatterplot (Figure 7) shows a positive relationship between trust and priority: as trust increases, so does the priority placed on integrating AI. Teachers reported both the lowest levels of trust in AI and the lowest priority for AI integration. Building and district administrators started at similar levels of trust and prioritization; however, as district administrators’ trust increased, the priority they placed on AI integration slightly surpassed that of building administrators. Educators with a high level of trust in AI may place increased priority on integration because of the perceived benefits of use, while those with lower levels of trust may hesitate to prioritize integration due to perceived drawbacks and concerns.

Figure 7. Trust in AI and Listing AI as a Priority

Educators were asked to select all the areas in which they were concerned about AI integration. Figure 8 provides a detailed look at their responses. More than 10% of educators reported concerns around inappropriate student use, overdependence on technology, privacy and data security, ethical considerations with AI content and curation, and potential bias in AI. Educators, while still concerned, reported being less concerned overall about the cost of implementation, equity and accessibility, replacement of human educators, and lack of effective AI education tools. 

There was also the option to add additional concerns not listed in the original question. Overwhelmingly, educators indicated that they were concerned about the intellectual and academic impact of AI. For example, they noted concerns regarding AI encouraging intellectual laziness, negative impacts on students’ critical thinking, increased risk of cheating, and accusations of cheating. Educators also reported concerns about the reliability and accuracy of AI results, noting that AI often produces incorrect or inappropriate information, and students might believe all AI-generated content is factual, leading to misinformation.

This data suggests that educators’ concerns are varied, but the most pressing tend to be ethical (inappropriate student use, cheating, etc.) rather than logistical (cost, tool availability, etc.). For schools and districts looking to explore AI use and integration, understanding educators’ specific concerns will be crucial in bolstering trust and confidence in AI.

Figure 8. Concerns Regarding AI

How Do Educators Envision the Future of AI? 

As discussed above, building and district administrators place the priority of AI integration at around 60 out of 100 (57.4 and 59.3, respectively). Nearly 80% of district administrators and nearly 90% of building administrators report that they are currently using AI or exploring future use. Taken together, this suggests that administrators, and to a much smaller extent teachers, regard AI implementation as something of an inevitability, and therefore a priority that will need to be addressed in the future, regardless of their level of trust in AI.

Figure 9. Promising Uses of AI

We also asked all educators in what areas they think AI can be a useful tool; the results are detailed in Figure 9 above. Educators seemed most optimistic about AI assisting with content creation and curation, personalized and differentiated learning, and accessibility/assistive technology. Teachers seemed less optimistic about AI assisting with some of the more “interpersonal” aspects of teaching, such as virtual tutoring or assistance programs, instruction, and community/parent communication and engagement.

To understand how to better support teachers and schools in AI implementation, educators were asked, “In what areas do you need support for integrating artificial intelligence (AI) into your school/district?” Figure 10 below details their responses. At the top of the list was a need for support around professional development, followed by technical expertise/training, two areas that are closely aligned. Educators also identified a need for assistance with AI policy, specifically guidance around data privacy and security, as well as draft AI policies and guidelines.

Figure 10. Areas of Support Needed Regarding AI Integration

Key Takeaways

Building and District Administrators Have High Trust in AI and Deeper Experience with AI

Building and district administrators have higher levels of trust in AI and consider AI integration to be a higher priority than teachers. These administrators also have much more experience using AI both personally and professionally than teachers. As such, administrators need to be mindful and patient as teachers may not automatically “buy into” their vision for AI integration. Education leaders can use their experience and vision to lead their buildings and districts but need to be understanding of stakeholder concerns and reluctance towards AI. 

Educators Are Using AI in Their Buildings and Classrooms, Regardless of Official District Policy

Only approximately 30% of district administrators reported that their school, school board, or governing body officially adopted AI policy or guidelines; however, over 50% of educators who responded to this survey reported using AI in their professional role (an additional 15% reported using it personally but not professionally). Educators (not all, but many) are “ahead” of their districts and using AI in their classrooms and schools. Whether or not districts want to pursue AI integration, this data suggests a real need for clear AI policies and guidelines to guide the use that is already taking place. 

There Exists a Group of AI Skeptics That Cannot Be Ignored

There is a small but meaningful group of educators who have little to no interest in AI, low trust in it, and are not actively seeking information about it. Six percent of educators reported that they are not looking for support on AI integration and will not need it in the future, 20% have not used AI and do not plan to, and approximately 10% do not think AI will be used significantly in classrooms in the next 5 years. This group of educators, while not the majority, may hold serious concerns about AI integration and/or be largely apathetic to the potential uses and implications of AI.

Discussions Around AI Are Just Getting Started

A vast majority of educators, over 80%, feel that AI will play a “very significant” or “somewhat significant” role in education in the next 5 years. However, given current experience and use trends, there is a large gap between how educators are using AI now and where they expect to be in the future. Encouragingly, educators reported a strong need for support integrating AI into their schools and districts in the areas of professional development/expertise, data privacy, and draft policy and guidelines, among others.

This data also suggests that not everyone is entering the discussion around AI integration with the same experience, perceptions, and concerns. Education leaders will need to assess where stakeholders are and ensure diverse voices are represented—not just those most enthusiastic about AI. One way to ensure diverse perspectives is to include the larger stakeholder community, as educators in this survey indicated that they valued those voices. Over 50% of educators felt that both students and the larger community should be involved in discussions around AI “always” or “often,” while only approximately 10% of educators felt that these groups should be included “rarely” or “never.” Figure 11 shows a detailed breakdown of how often educators felt students and the school community should be involved in discussions around AI.

Figure 11. Perceptions of How Often Students and Community Members Should Be Involved in Discussions Regarding AI

Final Thoughts

If there is to be one takeaway from this research, it is that there is a vast array of perceptions, concerns, experiences, needs, and levels of trust around AI. Educators see many areas of promise for AI integration but also many areas of concern. Many educators are using AI personally or professionally, but districts may not yet be “caught up” to this reality. Educators have seen educational technology tools come and go for decades now, and while there are a small number who believe AI will not produce radical changes to education, many more are expecting AI to transform it. 

This survey is only the start of understanding Michigan educators’ perceptions of AI—many more discussions are needed. While this research focused on educational professionals, key stakeholders such as students and parents were not included; however, they will need to be as districts and schools move forward with AI integration. 

We started this report by stating that AI integration requires meticulous planning and strategic alignment with a school district’s educational goals, values, and priorities. This still holds true. AI use is already happening in schools, and educators expect this use to increase significantly in the near future. Educators are hungry for more guidance, clarity, and information on AI and how to integrate it into their practice, but they are also cautious. While the process will not be easy, it is imperative that schools and districts address AI in some capacity because leaders, teachers, and students are already using AI, oftentimes without any guardrails in place. School districts can establish these guardrails to equip educators with the tools necessary to navigate the complexities of integrating AI in their districts while empowering both their staff and students.

Based on this evidence, the AI Lab within MVLRI will continue to provide support and leadership to the AI Statewide Workgroup and others around the following strategies:

  • Increase AI awareness and communications
  • Pursue ongoing research on AI classroom literacy
  • Expand professional learning on AI tools 
  • Validate and consider potential concerns
Key Strategies for Supporting Disengaged and Struggling Students in Virtual Learning Environments https://michiganvirtual.org/research/publications/key-strategies-for-supporting-disengaged-and-struggling-students-in-virtual-learning-environments/ Mon, 03 Jun 2024 18:34:22 +0000 https://michiganvirtual.site.strattic.io/?post_type=publication&p=87032

This study delves into effective strategies used by virtual programs and teachers to support disengaged and struggling students, noting significant overlap with strategies for all students and highlighting the critical roles of parental and mentor support, intervention, and communication. The report covers strategies employed by experienced virtual teachers, programmatic supports, professional development sources, and considerations for supporting virtual learners.


Abstract

This study identifies effective strategies that virtual teachers and programs employ with disengaged and struggling students. Virtual teachers reported using a variety of strategies with struggling students; among the most often used were providing frequent and specific feedback and leveraging the support of adults close to the student. Virtual educators also provided advice for new virtual teachers and discussed the challenges they face online. The report concludes by highlighting programmatic considerations for effectively supporting virtual learners.

Introduction and Need for the Study

During and following the shift to emergency remote instruction during the COVID-19 pandemic, researchers at Michigan Virtual identified that schools and districts that had already implemented effective virtual teaching and learning practices before the COVID-19 outbreak experienced greater degrees of success in this transition than their counterparts. Teachers and school leaders of established virtual schools and programs had already invested time and energy in the development of effective pedagogical skills needed to help students achieve success in virtual or remote learning environments. Following this observation, researchers set out to identify these effective practices used frequently with teachers and students. 

The first study, conducted by researchers at Michigan Virtual Learning Research Institute (MVLRI), further aimed to provide promising practices for teachers and school administrators new to teaching and leading in a virtual or remote learning environment to understand how they could better engage students.

The present study is a continuation of that work, digging deeper to identify effective strategies that virtual programs and teachers employ with disengaged and struggling students. Unsurprisingly, there is considerable overlap between strategies used for all students and those frequently used with disengaged students–with a few caveats, including but not limited to the pronounced role of parental and mentor support and intervention as well as the profound effectiveness of communication. The following report discusses strategies that experienced virtual teachers use with struggling or disengaged students, programmatic supports for said students, sources of professional development for virtual teachers, and concludes by highlighting programmatic considerations for effectively supporting virtual learners. 

Methodology

This qualitative study utilized an online survey to collect data from 296 virtual educators (269 teachers and 27 supervising administrators) representing 10 statewide virtual schools or programs with considerable experience delivering virtual courses and serving thousands of students annually. As with our previous reports, it is important to note that the participants in this study were employed by virtual schools with well-established virtual learning programs, professional learning processes, and teacher supervision practices developed and refined over several years. The findings of this study represent an immense collection of knowledge and experience related to virtual teaching and learning across the United States.

The online survey was developed in the summer of 2023, and data were collected in the fall and winter of 2023. The data were compiled and analyzed throughout the spring of 2024; responses that came in after the survey collection cut-off point were excluded from the analysis. The resulting report was made publicly available to all schools and districts in the summer of 2024.

A vast majority of educators (74% of teachers and administrators) served students at the high school level (grades 9-12). Approximately half of the educators surveyed had 10 or more years of experience as a teacher or administrator, and an additional 23% had between 6 and 10 years of experience. A majority of educators (79%) were part-time teachers, 12% were full-time, and 9% were administrators. Most educators (88%) reported that they primarily provided asynchronous instruction. 

Limitations of the Study

As with the previous reports in this series, the findings of this study represent the perceptions of teachers and supervising administrators of well-established statewide virtual schools and programs. While the study intends to share promising practices with teachers and administrators new to teaching and leading within virtual learning environments, the practices are not generalizable to all schools in the United States as the participants of this study are working within mature virtual learning programs that have formal structures and supports for teachers and administrators to serve students and families in virtual learning environments.

Discussion of the Findings

Student Engagement Strategies 

Educators reported using various strategies in their online courses to support disengaged or struggling learners. Thirteen of the 22 strategies listed were each used by over 55% of educators, and all but two were used by at least a quarter of educators. Similar to previous reports, the most common strategy was providing frequent and specific feedback. In our report on engaging students in virtual learning environments, we discuss how feedback serves the dual purposes of being the primary method of communication and relationship-building and supporting students’ academic progress.

Two strategies that more than 60% of educators reported using involve communicating with and leveraging the support of adults close to the student. Nearly 70% of educators reported that, when faced with disengaged or struggling students, they communicate with the student’s on-site mentor, whereas roughly 60% indicated that, in these cases, they encourage parental involvement. This is intuitive, as even the most engaged and effective virtual teachers may have difficulty reaching severely disengaged students given the lack of physical proximity. Engaging another adult who is close to the student can help to “bridge this gap.” Table 1 provides a breakdown of the student engagement strategies used by virtual educators.

Table 1. Student Engagement Strategies Used by Virtual Educators

Strategy | Count of Educators | % of Educators
I provide frequent and specific feedback | 233 | 78.7%
I communicate with the student’s on-site mentor | 204 | 68.9%
I make myself available to students through scheduled office hours or “drop-in” hours | 200 | 67.6%
I provide clear instructions for assignments | 191 | 64.5%
I provide supplementary materials (online tutorials, instructional videos, additional reading materials) | 185 | 62.5%
I provide supplementary visual aids (graphics, diagrams, videos) | 182 | 61.5%
I encourage parental involvement | 180 | 60.8%
I provide clear course expectations | 175 | 59.1%
I celebrate small student victories | 174 | 58.8%
I offer regular asynchronous check-ins | 165 | 55.7%
I try to cultivate a strong interpersonal relationship with the student | 163 | 55.1%
I try to coordinate additional support for students | 160 | 54.1%
I offer 1:1 support such as tutoring | 154 | 52.0%
I utilize student progress monitoring tools within the LMS | 138 | 46.6%
I use real-world examples | 127 | 42.9%
I provide differentiating/individualizing instruction | 117 | 39.5%
I offer regular synchronous check-ins | 111 | 37.5%
I provide personalized remediation opportunities | 104 | 35.1%
I provide frequent opportunities for formative assessments | 95 | 32.1%
I develop personalized learning plans | 73 | 24.7%
I provide opportunities for self-reflection | 64 | 21.6%
I use gamification (points, badges, leaderboards) | 20 | 6.8%

Educators were also given the opportunity to report strategies they used with disengaged and struggling students that were not listed on the survey, and they described a variety of additional approaches. Several responded that they used audio/video recordings within their courses and to message students. One educator noted,

“[I] Add video explanation for students upon request or as I see they need a different method for gaining information or more detailed explanations.”

Others mentioned texting students regarding course progress and grade updates or using external software or websites to “gamify” or incentivize coursework. Texting came up frequently, as with this educator, who stated,

“I am sure to touch base via SMS (text message) with all my students weekly.”

Several educators reported offering group tutoring sessions, “meet and greets” open to all students, and connecting with parents weekly. As one educator noted,

“I provide opportunities just to socialize and build our community of learners (sneaking in some connection to the content too).”

Educators also highlighted the significance of involving parents/guardians in the process, ensuring they are aware of student progress, areas of improvement, and the teacher’s availability for support. Many educators also took this opportunity to discuss their personalized approach to engaging students through building supportive teacher-student relationships and learning about the student as an individual, beyond academics. Educators reported specific actions such as celebrating student victories, offering encouragement, and communicating through the students’ preferred channels with one educator stating, 

“[I] show them I am a real person through a cheery attitude when on camera, personalized well-worded feedback, and text emoticons in personal communications to make up for lack of tone.”

Overall, the responses underscored the multifaceted nature of engaging students virtually, emphasizing the importance of tailored approaches, consistent communication, and integrating personal touches to foster a supportive learning environment. These strategies go beyond conventional methods, focusing on building rapport, providing timely feedback, and creating opportunities for interaction and participation, ultimately aiming to enhance student motivation and success in virtual settings.

Table 2. Most Effective Student Engagement Strategies Used by Educators

Strategy | Count of Educators | % of Educators
I provide frequent and specific feedback | 140 | 47.3%
I communicate with the student’s on-site mentor | 71 | 24.0%
I provide supplementary materials (online tutorials, instructional videos, additional reading materials) | 68 | 23.0%
I encourage parental involvement | 66 | 22.3%
I offer regular asynchronous check-ins | 62 | 20.9%
I make myself available to students through scheduled office hours or “drop-in” hours | 51 | 17.2%
I provide supplementary visual aids (graphics, diagrams, videos) | 47 | 15.9%
I try to cultivate a strong interpersonal relationship with the student | 45 | 15.2%
I provide clear instructions for assignments | 43 | 14.5%
I offer 1:1 support such as tutoring | 43 | 14.5%
I provide clear course expectations | 31 | 10.5%
I celebrate small student victories | 30 | 10.1%
I try to coordinate additional support for students | 28 | 9.5%
I use real-world examples | 22 | 7.4%
I offer regular synchronous check-ins | 21 | 7.1%
I utilize student progress monitoring tools within the LMS | 21 | 7.1%
I provide frequent opportunities for formative assessments | 14 | 4.7%
I develop personalized learning plans | 13 | 4.4%
I provide personalized remediation opportunities | 13 | 4.4%
I provide differentiating/individualizing instruction | 12 | 4.1%
I use gamification (points, badges, leaderboards) | 4 | 1.4%
I provide opportunities for self-reflection | 3 | 1.0%

Educators were also asked to select the three strategies they found most effective. As Table 2 shows, the strategies educators found most effective closely mirrored the ones they used most frequently (Table 1). Overwhelmingly, educators identified frequent and specific feedback as a key strategy for supporting disengaged or struggling students, with nearly half (47%) selecting it. By comparison, communicating with a student’s on-site mentor (24%) and providing supplementary materials (23%) were selected about half as often.

Student Engagement Strategies for New Virtual Teachers

Educators were asked what strategies they would recommend to new virtual teachers working with disengaged or struggling students. Time and again, educators highlighted the importance of communication in keeping students engaged and on track in their virtual courses. They stressed the importance of establishing open channels of communication, including using preferred methods of communication and providing frequent opportunities for check-ins or progress updates. As one educator stated,

“Keep communicating and building relationships with your students. Once they believe you are there for them even at a distance, they are more likely to do the work and take pride in their accomplishments.”

For new virtual teachers, veteran virtual educators also recommend focusing on building teacher-student relationships (for a detailed breakdown, see our other report on key strategies for engaging students in virtual learning environments) and demonstrating genuine care for the student’s progress. Educators recommended personalized feedback as a key strategy for building these relationships, ensuring that students receive specific, detailed, positive feedback on their assignments. One educator noted,

“Building relationships from the beginning is crucial, as is setting clear expectations from the beginning about how the experience should look from the student’s perspective. The amount of time they should expect to spend in the course as well as strategies that set them up for success and what they should do when they need help should be spelled out for students and parents from the beginning.”

Finally, educators discussed the importance of flexibility. They reported on the utility of offering alternative assignments, options to redo or make up assignments, and flexible schedules to accommodate diverse student needs. This also includes providing a variety of learning materials, such as text, videos, graphics, and hands-on activities, as discussed by these educators,

“Practice “rigid flexibility”….meaning be ready to change course set up, content delivery, strategies, etc… as soon as you see data/evidence that shows your set up is not producing the desired student outcomes.”

“Be patient and really try to get to know your students. It will pay off in the long run. Set time aside each day for you to get your grading and communicating done with students. Be flexible. Not all students are going to have the same access to the internet as others. Some have dedicated hours of the day at their in-person school where they work. Others work from home.”

Challenges of Virtual Teaching

Educators were also given the opportunity to discuss the most prominent challenges they faced when transitioning to teaching virtually. The first notable challenge revolved around the difficulty of connecting with disengaged or failing students in a virtual environment. Despite their best efforts, teachers expressed frustration over the lack of face-to-face interaction, which prevents them from reading body language, easily establishing personal connections, and quickly regaining student attention. Educators reported that this absence of physical presence also makes it challenging to identify the specific reasons for student disengagement, which makes it especially difficult to tailor interventions. This sentiment was discussed by these educators, who stated,

“Not seeing the students every day and being a part of their lives. No matter what amount of connecting you try to do in the virtual environment, it just isn’t the same.”

“The biggest challenge is making connections with students so they know you care and they are motivated to work.”

Further, educators highlighted the challenge of establishing effective communication with parents and guardians. They described unresponsive students and parents, incorrect contact information, and an overall lack of parental involvement. Overall, the consensus among educators seems to be that the transition to virtual teaching presents substantial hurdles in establishing connections, effective communication, and student engagement, particularly with those who are disengaged or struggling academically due to the physical separation between teachers and students. Educators noted, 

“Being able to make that personal connection with schools, facilitators, parents, and students without them being right in front of you.”

“Not seeing your students regularly, some students are hard to reach, some parents are hard to reach.”

Programmatic Strategies for Student Support and Engagement 

To provide a more robust understanding of student engagement and support from multiple levels, administrators were asked to discuss procedures their programs follow for students who are disengaged or failing their virtual course(s). Responses tended to fall into one of the following three categories: emphasizing communication and collaboration, early identification of struggling students, and encouraging parental involvement. 

Emphasizing Communication

Programs reported involving school facilitators, counselors, administrators, and site coordinators/mentors to address student disengagement. Further, administrators discussed the importance of clear communication channels, such as using Learning Management System (LMS) messaging systems, telephone calls, web conferencing tools, and personal visits to ensure that teachers, students, and parents/guardians remain connected and informed. This multi-faceted communication strategy aims to provide ongoing support, clarify expectations, and promptly address any issues related to course progress.

Early Identification of Struggling Students

Second, administrators highlighted the early identification of struggling students. Regular monitoring of student progress allows teachers and administrators to identify signs of disengagement or academic challenges early on. Once identified, intervention plans and individualized learning strategies can be developed to provide personalized support. These plans may involve additional resources, alternative learning methods, tutoring services, or flexible learning options tailored to meet the unique needs of each student. By intervening early and offering targeted support, programs aim to prevent further disengagement and facilitate student success in virtual learning environments.

Encouraging Parental Involvement

Third, many programs emphasize the involvement of parents/guardians in the educational process. Encouraging parental engagement, providing regular progress reports, and seeking parental support in fostering student engagement were commonly reported practices. Programs also reported establishing structured procedures for follow-up actions including meetings with students, parents/guardians, and school coordinators/mentors. This comprehensive approach ensures that all stakeholders are actively involved in addressing student disengagement, fostering a collaborative effort to support students in virtual courses effectively.

Sources of Educators’ Professional Learning 

Educators were also asked to identify the professional development sources they frequently utilize and those they perceive as the most effective. The data in Table 3 reveals a wide array of development opportunities available to educators, with a preference for optional opportunities offered by their virtual school or program, as noted by 70.6% of respondents. Similarly, conferences and mandatory school-provided programs were also heavily utilized, indicating a robust network of resources available to educators.

Table 3. Sources of Educators’ Professional Learning

Source of Professional Development | Count of Educators | % of Educators
Optional opportunities provided by my virtual school/program | 209 | 70.6%
Conferences (in-person or virtual) | 146 | 49.3%
Mandatory opportunities provided by my virtual school/program | 140 | 47.3%
Webinars provided by educational organizations | 129 | 43.6%
Online courses provided by educational organizations | 127 | 42.9%
Informal peer mentoring with colleagues | 109 | 36.8%
Formal peer mentoring with colleagues | 71 | 24.0%
Graduate coursework through a college or university | 37 | 12.5%
Undergraduate coursework through a college or university | 36 | 12.2%
Social media | 29 | 9.8%
Other | 7 | 2.4%

The perceived effectiveness of these professional development sources, as detailed in Table 4, closely aligns with their usage rates (as shown above in Table 3). Over 50% of educators endorsed optional school-provided opportunities as the most effective, underscoring their critical role in virtual educator support. Conferences and informal peer mentoring were also highlighted for their substantial impact, with 44.6% and 36.5% of educators respectively finding these methods beneficial. These findings emphasize the value of immediate, applicable, and peer-supported learning opportunities in the virtual teaching context.

However, traditional and formal educational pathways like graduate and undergraduate courses were less frequented and ranked lower in perceived effectiveness. Only a small fraction of educators pursued these routes, and even fewer regarded them as among the most effective. This trend suggests a preference for more direct and practical professional development options that provide immediate benefits in the virtual classroom environment.

Overall, the survey illustrates a clear preference for diverse and accessible professional development opportunities that are not only readily applicable but also foster a sense of community and collaboration among virtual educators. These methods, particularly those that encourage active participation and peer interaction, are crucial for effectively engaging and re-engaging students in the virtual educational landscape. This approach not only enhances the teaching capabilities of educators but also significantly contributes to the overall success of students in virtual settings. For more on virtual educator professional development, please see our report on key strategies for supporting teachers in virtual learning environments.  

Table 4. Most Effective Sources of Teachers’ Professional Development

Professional Development Source | Count of Educators | % of Educators
Optional opportunities provided by my virtual school/program | 149 | 50.3%
Conferences (in-person or virtual) | 132 | 44.6%
Informal peer mentoring with colleagues | 108 | 36.5%
Online courses provided by educational organizations | 83 | 28.0%
Mandatory opportunities provided by my virtual school/program | 74 | 25.0%
Webinars provided by educational organizations | 72 | 24.3%
Formal peer mentoring with colleagues | 49 | 16.6%
Graduate coursework through a college or university | 17 | 5.7%
Social media | 8 | 2.7%
Other | 8 | 2.7%
Undergraduate coursework through a college or university | 4 | 1.4%

Key Takeaways & Conclusion

Our original aim with this research was to provide promising practices for teachers and school administrators new to teaching and leading in a virtual or remote learning environment to understand how they could better engage students. After two surveys, with nearly 2,000 responses from virtual teachers and administrators, the following practices emerged as crucial considerations for virtual teaching and virtual program administration. 

  • Prioritize Teacher-Student Relationships: The most effective way to engage students and keep them engaged is to focus on and provide support for developing teacher-student relationships. 
  • Establish Clear Communication Channels: Use LMS messaging systems, telephone calls, web conferencing tools, and personal visits to maintain regular communication among teachers, students, parents, and administrators.
  • Early Identification and Intervention: Regularly monitor student progress to identify signs of disengagement or academic struggles early on. Develop intervention plans and individualized learning strategies to provide personalized support.
  • Parental Involvement: Encourage parental engagement by providing regular progress reports, seeking parental support in fostering student engagement, and involving parents in decision-making processes.
  • Structured Follow-Up Procedures: Establish structured procedures for follow-up actions, including meetings with students, parents/guardians, and school liaisons to address student disengagement promptly.
  • Provide Additional Support Services: Offer tutoring sessions, peer mentoring programs, teacher-led virtual support hours, and other resources to assist students who need extra help.
  • Flexible Learning Options: Consider adjusting pacing, providing alternative assignments, or offering varied assessment methods to accommodate different learning needs and support student engagement.
  • Professional Development for Teachers: Provide ongoing professional development to equip teachers with the skills and resources necessary to effectively support virtual learners and identify signs of disengagement.
  • Collaborative Approach: Promote collaboration among stakeholders, including school facilitators, counselors, administrators, and site coordinators, to ensure a comprehensive and unified effort in supporting student success.

Considering these practices alongside critical elements such as effective and accessible course design, efficient program operations, and supportive teaching, programs can create a supportive and engaging environment for students in virtual courses, helping them overcome challenges, stay motivated, and achieve academic success.

Out of Order, Out of Reach: Navigating Assignment Sequences for STEM Success https://michiganvirtual.org/research/publications/out-of-order-out-of-reach-navigating-assignment-sequences-for-stem-success/ Mon, 01 Apr 2024 17:46:31 +0000 https://michiganvirtual.site.strattic.io/?post_type=publication&p=86110

Pacing, or the timing of students’ assignment submissions, has been shown to have an important relationship to course performance. Less is known about how the submission order or sequencing of assignment submissions relates to course performance. This study found that the order in which students submitted assignments in their online STEM courses is related to their final grades, with students who submitted all assignments in line with pacing guide recommendations outperforming peers who did not. Indeed, students’ final grades decreased as deviations from the pacing guide increased.


Abstract

Research shows that pacing has an important relationship with online course performance; however, most work has focused on the timing—not the order—of students’ assignment submissions. The current study examined the relationship between the order of students’ assignments and their final course grades in online STEM classes. Using course pacing guides as a benchmark, students’ assignment submissions were categorized as either “in sequence” or “out of sequence.” Then, students were categorized as either moving through their courses “in sequence” or “out of sequence.” Most students were categorized as moving “out of sequence” (~93%) and submitted around 38% of their assignments out of order. As such, going out of sequence was common among students, but done somewhat sparingly within the courses themselves. While this “out of sequence” behavior was common, it was not necessarily advantageous for students’ final grades. On average, students who completely adhered to the pacing guide had final grades 9.5 points higher than students who deviated from the pacing guide at least once. A small but statistically significant negative correlation was observed between the proportion of assignments submitted out of order, the extent to which a student submitted an assignment out of order, and final grades. In other words, as students become increasingly out of order, final grades decrease. Taken together, pacing continues to represent a student behavior that may have important implications for course performance. Instructors and mentors should continue to monitor student pacing, and communication about course progression is encouraged. Future work should focus on examining student submission patterns from multiple perspectives to better understand their relationship to achievement. 

Introduction

Michigan is seeing a rise in student engagement with online learning. The number of K-12 students who took at least one virtual course doubled from 7% in 2017-2018 to 14% in 2021-2022 (Freidhoff, 2019; 2023). As online learning continues to grow in popularity, it is essential to set students up for success as the current virtual pass rate is around 69% (for context, the pass rate for non-virtual coursework was 71%).

Studies have shown that pacing, which refers to how students progress through a course, is crucial for student success (DeBruler, 2021; Michigan Virtual Learning Research Institute, 2019). For example, submitting an assignment within the first week is correlated with students’ final grade, suggesting that this may be an early indicator of students’ engagement with course material (Zweig, 2023). Generally, students who are consistently on-pace throughout the course are more likely to be successful than those who aren’t (DeBruler, 2021). Similarly, students who struggle with pacing (as indicated by cramming assignment submissions at the end of the course) tend to perform more poorly than those who consistently pace out their submissions (DeBruler, 2021; Michigan Virtual Learning Research Institute, 2019).  

Certain Michigan Virtual courses, such as those for core subject areas and electives, do not have assignment deadlines, meaning students may submit any assignment at any time during the enrollment window. Because of this structure, students can progress through assignments in any order they would like, giving students flexibility about when and where learning occurs. To help provide guidance and structure, Michigan Virtual provides pacing guides that show what assignments and activities students should complete in a particular week or sequence. In other words, pacing guides provide clear expectations of students’ course progression and serve as a benchmark for students to evaluate their course progress. While a complete discussion of the ideal blend of structure and flexibility is beyond the scope of this report, providing some guidance around scheduling and routines can help students stay on track (Martin & Whitmer, 2016). 

Although not mandatory, following the pacing guides can help students manage a course’s workload, especially when courses do not have firm deadlines. Research suggests that disengagement from assignments and improper pacing can negatively impact student achievement (DeBruler, 2021; Michigan Virtual Learning Research Institute, 2019; Soffer & Cohen, 2019; Wu et al., 2023; Zweig, 2023). While the frequency and consistency of course activity contribute to academic achievement, more research is needed to know how the order or sequence in which students engage with material is associated with course performance.

This study examined how students’ engagement with course assignments related to their course performance, focusing on understanding how the sequencing of students’ assignment submissions was associated with overall performance (final grade). Identifying practices that promote or limit student progress is important because it could inform policies, instructional design principles, or LMS configurations that may improve outcomes. Understanding how users move through a course can allow for more informed planning and decision-making as well as the development of better student support structures. 

Methods

Data & Sample Overview

Data on graded course item submissions and course performance (final grade) was pulled from the learning management system, BrightSpace, for spring 2023 enrollments (n = 8,810 students). The dataset was filtered to exclusively contain students enrolled in STEM courses, a list of which is available in Appendix A (n = 1,818). Analyses focused on high school-level STEM courses because course content and assignments are scaffolded by design, making them well-suited for investigating the role assignment sequencing plays in course performance. 

Only students who completed their courses were included in the dataset (n = 1,732). Because the analysis focused on how assignment submissions related to final grades, students missing more than 50% of course assignments were excluded to ensure accuracy (n = 1,481). Students enrolled in multiple courses during the spring 2023 semester (i.e., duplicates) were also removed from the dataset (n = 1,341).

After all data cleaning, 1,308 students were retained in the dataset. Please note that for this report, “assignments” refers to any graded item within a course. Students in the sample were not first-time online learners; all enrollments had completed at least one course and approximately three courses on average.
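
To make these filtering steps concrete, the sketch below shows how they might look in pandas. This is a minimal illustration under assumed names, not the study’s actual pipeline; the `enrollments` frame and its `course`, `completed`, `pct_missing`, and `student_id` columns are hypothetical.

```python
import pandas as pd

def clean_sample(enrollments: pd.DataFrame, stem_courses: list) -> pd.DataFrame:
    """Apply the filtering steps described above to a hypothetical enrollment table."""
    df = enrollments[enrollments["course"].isin(stem_courses)]  # keep STEM courses only
    df = df[df["completed"]]                                    # keep course completers
    df = df[df["pct_missing"] <= 50]                            # drop students missing >50% of assignments
    # remove students enrolled in multiple courses during the term (duplicates)
    df = df[~df["student_id"].duplicated(keep=False)]
    return df
```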

Analysis

To gain insights into students’ submission patterns in STEM courses, a benchmarking variable called ‘User Driven’ was created. This numerical value examines the order in which students submit assignments relative to the provided pacing guide. Specifically, this value compares the student’s current assignment submission with the one immediately preceding it. If the value for the current submission is one greater than the previous assignment, it is deemed “in sequence”; otherwise, it is considered “out of sequence.” 

For example, if a student submits assignment 9 immediately following assignment 4, it is considered out of sequence, as 9 is not one greater than 4. It is important to note that, under this benchmarking variable, a student’s submission is only deemed “in sequence” relative to the assignment immediately before it. Refer to Table 1 for an illustrative data layout. This method was crafted to evaluate the extent to which a student adheres to the pacing guide and to understand the implications of deviations on their final grade.

It may be viewed as a relatively “strict” interpretation of pacing as it requires the student to move through the course in sequential order and narrowly defines in/out of sequence based on a singular assignment. However, this conceptualization will serve as a starting place for understanding the impact of assignment sequencing on final grades. Future analyses could look at pacing more holistically by investigating how students’ return to earlier content (after deviating from the pacing guide) influences their final grades.

Based on the order of their assignment submissions, students were assigned to one of two groups: “in-sequence” (if they submitted all assignments in line with the pacing guide) or “out of sequence” (if they submitted at least one assignment out of line with the pacing guide).
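
The flagging logic can be illustrated with a short sketch. This is not the study’s code; it assumes each student’s submissions are represented as the list of pacing-guide positions in the order they were actually submitted (the “Student Submission Order” values in Table 1 below).

```python
def user_driven_flags(submitted_positions):
    """Flag each submission as in sequence (0) or out of sequence (1) by
    comparing its pacing-guide position to the submission just before it.
    The first submission has no predecessor, so it is flagged 0, as in Table 1."""
    flags = [0]
    for prev, curr in zip(submitted_positions, submitted_positions[1:]):
        flags.append(0 if curr == prev + 1 else 1)
    return flags

# The student in Table 1 submitted positions 1, 2, 3, 4, 9, 6, 7:
print(user_driven_flags([1, 2, 3, 4, 9, 6, 7]))  # [0, 0, 0, 0, 1, 1, 0]
```

Under this definition, a student lands in the “out of sequence” group if any of their flags is 1.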

Next, a ‘Proportion of Assignments Completed Out of Order’ variable was created. This variable calculates the number of assignments a student completed out of order relative to the total number of assignments they completed in the course. 

Finally, a variable was created to understand the extent to which students deviated from sequentially navigating the course. The ‘Average Magnitude’ variable represents the average difference between consecutive assignment submissions for each student. For example, if a student submitted Assignment 4 and then Assignment 9, the magnitude would be 5.
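
Continuing the sketch above, both summary variables might be computed per student as follows. This is again a hypothetical illustration, reusing `user_driven_flags` from the previous sketch rather than the study’s actual code.

```python
def out_of_order_summary(submitted_positions):
    """Return (% of submissions out of order, average magnitude), where
    magnitude is the absolute jump between consecutive submissions."""
    flags = user_driven_flags(submitted_positions)
    prop_out_of_order = 100 * sum(flags) / len(flags)
    jumps = [abs(curr - prev)
             for prev, curr in zip(submitted_positions, submitted_positions[1:])]
    avg_magnitude = sum(jumps) / len(jumps)
    return prop_out_of_order, avg_magnitude

# For the Table 1 student: roughly 28.6% out of order, average magnitude 2.0
print(out_of_order_summary([1, 2, 3, 4, 9, 6, 7]))
```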

Appendix B contains a list of all the variables created and referenced above and in subsequent sections of this report and may be helpful for the reader to reference. 

Assignment | Course Design Order | Student Submission Order | User Driven* | Magnitude
0.1 Introduction to the Discussion Board | 1 | 1 | 0 | NA
1.1 Quiz: Review of Functions | 2 | 2 | 0 | 1
1.2 Quiz: Algebraic Functions | 3 | 3 | 0 | 1
1.3 Quiz: Exponential Functions | 4 | 4 | 0 | 1
1.4 Quiz: Trigonometric Functions | 5 | 9 | 1 | 5
1.5 Quiz: Composition and Inverse Functions | 6 | 6 | 1 | 3
1.6 Quiz: Logarithmic Functions | 7 | 7 | 0 | 1
*User Driven: 0 = in sequence; 1 = out of sequence.
Table 1. Example Data Layout

Results

Is going out of sequence a common course behavior?

What percentage of online course enrollments go out of sequence?

Going out of sequence appeared to be the norm among the enrollments sampled, as approximately 93% went out of sequence at least once. See Table 2 for totals. As such, students are more likely to deviate from the pacing guide than not, at least in STEM courses.

Student Behavior | % (n)
In-sequence | 7.03% (92)
Out-of-sequence | 92.97% (1,216)
Table 2. Student Sequence Behaviors

What is the average number of assignments submitted out of sequence?

The number of assignments submitted out of order across the entire sample ranged from zero to sixty, with 17.5 assignments submitted out of order on average (SD = 11.57). Table 3 details the descriptive data. The median value was 16, indicating that half of the students in the dataset submitted fewer than 16 assignments out of order and half submitted more. For context, on average, 38.15% of course assignments were submitted out of order (SD = 22.52).

Variable | Mean | SD | Median | Min | Max
Final Grade | 80.36 | 16.44 | 85.54 | 16.75 | 100
Number of Current & Previous Online Courses | 2.73 | 2.20 | 2 | 1 | 15
Prop. of Assignments Completed Out of Order | 38.15 | 22.52 | 38.80 | 0 | 93.75
Average Magnitude | 2.31 | 1.06 | 1.77 | 1 | 10.32
Total # of Course Assignments | 48.43 | 12.12 | 50 | 28 | 71
Table 3. Descriptive Information

While some students submitted all assignments in their intended order, others submitted almost 94% of course assignments out of sequence. Half of the students submitted less than 38.8% of assignments out of order, while the other half submitted more than 38.8%. Overall, these results suggest that students turn in most of their assignments in the intended order, but there is variation among individual students. 

Investigating students’ assignment completion strategies and how individual differences contribute to course navigation may explain the variation in student submission patterns. When students deviated from the pacing guide and submitted assignments out of sequence, the extent to which they did so was relatively small. The average deviation was 2.31 assignments (SD = 1.06). As such, students were typically about two assignments “off” from the intended order. 

What is the relationship between course progression and students’ overall course performance?

What is the relationship between course progression and final course grade?

Students’ number of current and previous online courses had a small but statistically significant relationship with final grade, proportion of assignments completed out of order, and magnitude, suggesting that experience may factor into student performance, but only to a small degree.

The proportion of assignments completed out of order and magnitude had a statistically significant and negative relationship with final course grade, albeit with a small effect. The correlation coefficient for each variable was approximately -0.2, which was considered small.

This suggests that the proportion of assignments completed out of order and magnitude move in the opposing direction of the final course grade. For example, as a student’s final grade increased, the proportion of assignments completed out of order decreased (and vice versa). Similarly, as the magnitude increases, the final course grade decreases (and vice versa). 
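
For illustration, a correlational check of this kind might look like the toy sketch below. The values are invented for demonstration and are not the study’s data; the column names follow the per-student summary variables described above.

```python
import pandas as pd

# One row per student; toy values for illustration only (not the study's data)
students = pd.DataFrame({
    "final_grade":       [95, 88, 82, 74, 66, 60, 90, 71],
    "prop_out_of_order": [5, 20, 35, 50, 55, 70, 15, 45],
    "avg_magnitude":     [1.0, 1.5, 2.2, 3.0, 3.4, 4.5, 1.2, 2.8],
})

# Pearson correlations of each pacing variable with final grade
print(students.corr(method="pearson")["final_grade"])
```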

While the current data does not allow for causal inferences, the negative relationship between completing assignments out of sequence and final grade suggests that completing assignments out of sequence may be part of a broader pattern of learner behaviors and characteristics that influence academic performance. For example, characteristics like cramming and poor time management are negatively associated with learning and may influence a student’s assignment submission behaviors (DeBruler, 2021; Hartwig & Malain, 2022; Malekian et al., 2020; Michigan Virtual Learning Research Institute, 2019). 

Are there differences in final course grades between students who go out of sequence and those who do not?

There was a difference of 9.5 points in the final grades of students who went out of sequence and those who stayed in sequence. Students who moved in sequence averaged a final grade of 89.2, outperforming their peers who went out of sequence, who averaged a final grade of 79.7. There was slightly more variation in the final grades of students who went out of sequence, with a median final grade of 84.7%. Table 4 shows the descriptive statistics for final grades among enrollments that went in sequence and those that did not.

Behavior | M (SD) | Median
In-sequence | 89.2 (10.9) | 92.3
Out-of-sequence | 79.7 (16.6) | 84.7
Table 4. Student Behavior and Final Grade

To gain a more nuanced understanding of how assignment submission patterns related to students’ final grades, the data was segmented into quartiles based on the proportion of completed assignments submitted out of order and average magnitude (values dividing the data into four equal groups). See Table 5 for quartile breakdowns. 
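
As an illustration of this segmentation, `pandas.qcut` can split a per-student summary frame into four equal-sized groups and compare mean final grades across them. The snippet below reuses `pd` and the toy `students` frame from the correlation sketch above and is illustrative only.

```python
# Split students into quartiles on each pacing variable and compare the
# mean final grade within each quartile (cf. Table 5)
for var in ["prop_out_of_order", "avg_magnitude"]:
    quartiles = pd.qcut(students[var], q=4, labels=["1st", "2nd", "3rd", "4th"])
    print(students.groupby(quartiles, observed=True)["final_grade"].mean())
```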

Students with assignment submissions in the top 25% (4th Quartile) for being out of sequence (meaning this group of enrollments had the highest proportion of “out of order” assignment submissions) consistently had the lowest final grades on average. 

Conversely, students in the bottom 25% (1st Quartile) for being out of sequence (this group had the least proportion of “out of order” assignment submissions) consistently had the highest grades on average. This suggests that final grades drop as students become increasingly out of sequence. 

Similar results were found when examining the magnitude variable. Students in the bottom 25% of magnitude (1st Quartile; i.e., students with the smallest magnitude values) had higher final grades on average than students in the top 25% for the magnitude variable (4th Quartile; i.e., students with the greatest magnitude values).

Variable | 1st (Bottom 25%) | 2nd (50%) | 3rd (75%) | 4th (Top 25%)
Proportion of Assignments Completed Out of Order | 86.8 | 81.6 | 78.9 | 74.1
Average Magnitude | 88.5 | 81.2 | 76.3 | 75.3
Note: Cell values are mean final grades within each quartile.
Table 5. Final Grade Values Based on Quartiles

Discussion

The current study demonstrated the importance of assignment sequencing as it relates to course performance. While submitting assignments out of order was extremely common in STEM courses, it did not necessarily benefit students. Students who stayed in sequence had final grades that were 9.5 points higher on average than students who went out of sequence.   

Correlational analyses showed that the proportion of completed assignments submitted out of order and the magnitude of assignments submitted out of order had a negative relationship with final course grades. Follow-up analyses that looked at the differences in final grades when students were grouped into quartiles based on the proportion of completed assignments submitted out of order and magnitude revealed that grades continually dropped as students submitted more assignments out of order. 

The largest discrepancy in grades was between students in the first (bottom 25%) and second (50%) quartiles of the proportion of completed assignments submitted out of order and magnitude. Students in the first quartile (bottom 25%) for the proportion of completed assignments submitted out of order had an average grade of 86.8. As students moved into the second quartile (50%), their grades dropped by 5.2 points. Because students in the first quartile (bottom 25%) submitted between 0 and 20 assignments out of sequence, this suggests that students may start to exhibit drops in their grades as they surpass that number. Similarly, students in the first quartile (bottom 25%) for magnitude had average final grades of 88.5, which dropped by 7.3 points as they moved into the second quartile (50%). Because students in the first quartile (bottom 25%) had a magnitude ranging from 0 to 0.62, this suggests that even going one assignment out of sequence may be detrimental to students’ grades. 

While causation cannot be inferred based on the current methodology, it may be that submitting assignments out of sequence is part of a broader pattern of student characteristics and/or behaviors that affect academic performance. For instance, self-regulatory skills and metacognitive abilities are associated with online course performance and may also be related to students’ academic achievement and engagement with certain assignments (Xu et al., 2023; Zion et al., 2015).

That is to say, if students are not thinking deeply about their learning progress and making adjustments accordingly, they may be more likely to complete assignments out of order and receive lower grades. While research on assignment sequencing with adult learners has demonstrated a different pattern of results (Lim, 2016), the current research suggests that submitting assignments out of sequence may not be helpful for K-12 students. Adult learners may have more fully developed self-regulated learning skills, allowing them to direct their own learning more freely. K-12 students may still be developing these skills and thus may rely more heavily on instructor guidance to fully conceptualize and draw connections between course content.

While it is unreasonable to expect students to adhere to pacing guides 100% of the time, transparency about course design, the scaffolding of content and material, and the purpose of assignments may help increase adherence. Instructors may also stress to students that following the pacing guide and completing assignments in sequential order may help increase their chances of achieving their desired grade. 

References

DeBruler, K. (2021). Research on K-12 online best practices. Michigan Virtual. https://michiganvirtual.org/blog/research-on-k-12-online-best-practices/

Freidhoff, J. R. (2019). Michigan’s K-12 virtual learning effectiveness report 2017-18. Michigan Virtual University. https://mvlri.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2017-18/

Freidhoff, J. R. (2023). Michigan’s K-12 virtual learning effectiveness report 2021-22. Michigan Virtual. https://michiganvirtual.org/research/publications/michigans-k-12-virtual-learning-effectiveness-report-2021-22/

Hartwig, M. K., & Malain, E. D. (2022). Do students space their course study? Those who do earn higher grades. Learning and Instruction, 77, 101538. https://doi.org/10.1016/j.learninstruc.2021.101538

Lim, J. (2016). The relationship between successful completion and sequential movement in self-paced distance courses. International Review of Research in Open and Distributed Learning, 17(1), 159–179. https://doi.org/10.19173/irrodl.v17i1.2167

Martin, F., & Whitmer, J. C. (2016). Applying learning analytics to investigate timed release in online learning. Technology, Knowledge and Learning, 21, 59-74. https://doi.org/10.1007/s10758-015-9261-9

Michigan Virtual Learning Research Institute. (2019). Pacing guide for success in online mathematics courses. https://michiganvirtual.org/blog/pacing-guide-for-success-in-online-mathematics-courses/

Soffer, T., & Cohen, A. (2019). Students’ engagement characteristics predict success and completion of online courses. Journal of Computer Assisted Learning, 35(3), 378-389. https://doi.org/10.1111/jcal.12340

Wu, D., Li, H., Zhu, S., Yang, H. H., Bai, J., Zhao, J., & Yang, K. (2023). Primary students’ online homework completion and learning achievement. Interactive Learning Environments, 1-15. https://doi.org/10.1080/10494820.2023.2201343

Xu, Z., Zhao, Y., Zhang, B., Liew, J., & Kogut, A. (2023). A meta-analysis of the efficacy of self-regulated learning interventions on academic achievement in online and blended environments in K-12 and higher education. Behaviour & Information Technology, 42(16), 2911-2931. https://doi.org/10.1080/0144929X.2022.2151935

Zion, M., Adler, I., & Mevarech, Z. (2015). The effect of individual and social metacognitive support on students’ metacognitive performances in an online discussion. Journal of Educational Computing Research, 52(1), 50-87. https://doi.org/10.1177/0735633114568855

Zweig, J. (2023). The first week in an online course: Differences across schools. Michigan Virtual. https://michiganvirtual.org/research/publications/first-weeks-in-an-online-course

Appendix A

List of STEM Courses Included in Dataset

  • Algebra 1A
  • Algebra 1B
  • Algebra 2A
  • Algebra 2B
  • Anatomy and Physiology A
  • Anatomy and Physiology B
  • Astronomy
  • Bioethics
  • Biology A
  • Biology B
  • Calculus A
  • Calculus B
  • Chemistry A
  • Chemistry B
  • Earth Science A
  • Earth Science B
  • Environmental Science A
  • Environmental Science B
  • Forensic Science
  • Geometry A
  • Geometry B
  • Mathematics in the Workplace
  • Mathematics of Baseball
  • Mathematics of Personal Finance
  • Medical Terminology
  • Oceanography A
  • Oceanography B
  • Physical Science A
  • Physical Science B
  • Physics A
  • Physics B
  • PreCalculus A: Algebra Review & Trigonometry
  • PreCalculus B: Functions & Graphical Analysis
  • Probability and Statistics
  • Veterinary Science: The Care of Animals
Appendix B
Variable definitions:

Distinct Assignments: The number of unique assignments per course.

Missing Assignment: An assignment lacking a submission date and receiving 0 points.

User Driven: A benchmarking variable. It looks at students’ completed assignments and compares each assignment to the previous assignment submitted. If the current assignment is one greater than the previous assignment, it is considered in sequence; otherwise, it is out of sequence. Out of sequence is indicated by a 1, and in sequence by a 0. Missing assignments are ignored. (Illustrated in the sketch below.)

Total Out-of-Order: The sum of all the 1s a student has for User Driven; that is, the total number of assignments a student submitted out of order.

Completed Assignments: The number of distinct assignments minus the number of missing assignments.

Number of Current & Previous Online Courses: History of current and past completed courses.

Proportion of Assignments Completed Out of Order: The total number of assignments a student submitted out of order, divided by the number of assignments the student completed, times 100.

Average Magnitude: The average difference in sort order between consecutive assignments for each student within specific course sections.

Final Grade: The final numeric score the student received in the course.
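
Because the User Driven and magnitude definitions are algorithmic, a short sketch may help. The report does not give the exact magnitude formula; the version below measures each step's deviation from the expected one-step advance (so perfect adherence scores 0), which is consistent with the 0-to-0.62 first-quartile range reported earlier but remains an assumption, as does the function itself.

```python
# Hypothetical sketch of the Appendix B pacing metrics. Input: one student's
# completed assignments, listed in submission order by their position (sort
# order) in the pacing guide. Missing assignments are assumed to be excluded
# already, per the User Driven definition.
from statistics import mean

def pacing_metrics(submitted_positions: list[int]) -> dict:
    flags = []  # User Driven: 1 = out of sequence, 0 = in sequence
    jumps = []  # assumed magnitude: deviation from the expected one-step advance
    for prev, curr in zip(submitted_positions, submitted_positions[1:]):
        flags.append(0 if curr == prev + 1 else 1)
        jumps.append(abs(curr - prev - 1))
    completed = len(submitted_positions)  # assumed nonempty for this sketch
    total_out_of_order = sum(flags)
    return {
        "total_out_of_order": total_out_of_order,
        "proportion_out_of_order": 100 * total_out_of_order / completed,
        "average_magnitude": mean(jumps) if jumps else 0.0,
    }

# A student who submitted assignments 1, 2, 5, 3, 4 in that order:
print(pacing_metrics([1, 2, 5, 3, 4]))
# {'total_out_of_order': 2, 'proportion_out_of_order': 40.0, 'average_magnitude': 1.25}
```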