By:
Jamie N. Mikeska, Ph.D., Senior Research Scientist, Center for K-12 Teaching, Learning, and Assessment, ETS
Heather Howell, Ph.D., Research Scientist, Center for K-12 Teaching, Learning, and Assessment, ETS

Recent shifts to hybrid and online instruction have added to pre-existing challenges faced by high-need schools and districts in meeting teachers’ needs. We clearly need to support STEM teachers’ engagement in ambitious teaching practices, but how?
One promising approach is using online simulated teaching experiences such as those shown in the opening photo to provide teachers with safe spaces in which to practice and learn from experience without sacrificing instructional time with their students (Howell & Mikeska, 2021). Ambitious teaching practices, such as facilitating argumentation-focused discussions, are widely acknowledged as critical for student learning in both mathematics (National Governors Association Center for Best Practices, 2010) and science (National Research Council, 2013). Simulations can extend what is possible in teacher preparation, providing targeted opportunities for teachers to develop practices that support student learning.
Argumentation typically involves students in knowledge generation that mimics the processes used in the disciplines to establish new knowledge. In mathematics and science, we see this most often in peer review associated with the publication of new findings, theorems, or processes. In K-12 classrooms, knowledge generation takes place via the social process of supporting, exchanging, comparing, and critiquing ideas. In both content areas, engaging students in argument construction and critique is a critical part of that process and instrumental in supporting conceptual development (National Governors Association Center for Best Practices, 2010; National Council of Teachers of Mathematics, 2014; National Research Council, 2012; Reigh & Osborne, 2020). In our research, which was funded through the National Science Foundation’s Discovery Research PreK – 12 program (grant 1621344), we attend to both aspects of argumentation – argument construction and argument critique – as teachers learn how to facilitate discussions through the use of this innovative technological tool – the simulated classroom.
Using Practice as a Site for Teacher Learning
We draw upon an extensive research base arguing that practice is a site for professional learning (Ball & Forzani, 2009; Boyd et al., 2009). Opportunities for teachers to learn from engagement in the work of teaching, or approximations of practice (Grossman, Compton et al., 2009), are essential in practice-based teacher education and professional development. Traditionally, approximations involve peer teaching, rehearsals, or role-playing. Recent technological advances, however, enable teachers to interact with student avatars who look, sound, and respond as students might. Research also suggests that approximations coupled with targeted support help teachers reflect, analyze, and refine their developing practice within communities of practice as they build towards improvement over time (Benedict-Chambers, 2016; Lampert et al., 2013).
Research Study
In our research, we examine how to develop and use a set of performance tasks in an online simulated classroom environment to support preservice elementary teachers (PSETs) in learning how to facilitate argumentation-focused discussions. Our overall research question is, “How can performance tasks delivered within a simulated classroom environment be used to develop PSETs’ ability to facilitate discussions in science and mathematics?” We implemented performance tasks in four elementary mathematics and science methods courses at three universities in the United States, examining the PSETs’ improvement from the beginning to the end of the semester. For each performance task, a teacher educator prepared the PSETs in advance of individual discussion sessions and then led debrief activities after the discussions to help the PSETs reflect and improve over time. We also provided the PSETs and the teacher educators with feedback linked to the specific strengths and areas of growth evident in the PSETs’ video-recorded discussions targeting five dimensions of discussion practice (Mikeska et al., 2019):
- Dimension 1: Attending to student ideas
- Dimension 2: Developing a coherent and connected discussion
- Dimension 3: Encouraging peer interactions
- Dimension 4: Developing students’ conceptual understanding
- Dimension 5: Engaging students in argumentation
The cycle of preparing, facilitating, receiving feedback, and reflecting on the simulated discussion was iterated three times during each methods course. The goal was to prepare PSETs to facilitate these types of discussions with actual students in real classrooms, although this last part was not a focus of this research study.
The use of written feedback is described in the project’s video from the 2020 NSF STEM for All Video Showcase.
Methods
In an earlier phase of the project, we piloted our full set of tasks in each content domain (three formative tasks and one pre/post measure in each of elementary science and mathematics) with small groups of PSETs or first-year teachers in order to optimize the written materials, with a primary focus on usability. During the pilots, participants were asked to plan for and engage in a single simulated discussion, and we then administered a survey and interviewed them to find out how they made sense of and planned for the discussion.
For each implementation, we collected data across two semesters – what we called the baseline and formative semesters. In the baseline semesters, we did not implement the simulation cycles or ask the teacher educators to adjust instruction. Instead, we simply used pre/post measures to capture the PSETs’ understanding of and ability to facilitate argumentation-focused discussions. During the formative semesters, the teacher educators integrated the three cycles of preparation, simulated instruction, feedback, and debrief/reflection using the performance tasks in the simulated classrooms. Our study included 112 PSETs (43 baseline, 69 formative) across a convenience sample of three teacher educators. Each one was an experienced teacher educator who had previously taught the targeted methods course, and whose goals for the course included helping PSETs learn how to facilitate discussions in mathematics or science.
To measure PSET competence in leading argumentation-focused discussions, trained raters scored each video-recorded discussion on the five dimensions of high-quality argumentation-focused discussions using a three-level scoring rubric — beginning novice (level 1), developing novice (level 2), or well-prepared novice (level 3). We then compared the PSETs’ scores on each dimension from the beginning (pre) to end (post) of the semester for evidence of improvement across the baseline and formative semester groups.
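The pre-to-post comparison described above can be sketched in a few lines. This is a hypothetical illustration only: the scores, the number of PSETs, and the gain computation are our assumptions for demonstration, not the study’s actual data or statistical procedure.

```python
# Hypothetical sketch of comparing pre/post rubric scores on one dimension.
# Scores follow the study's three-level rubric: 1 = beginning novice,
# 2 = developing novice, 3 = well-prepared novice. The data are invented.
from statistics import mean

def improvement(pre_scores, post_scores):
    """Mean gain per teacher on a 1-3 rubric dimension from pre to post."""
    assert len(pre_scores) == len(post_scores)
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return mean(gains)

# Example: made-up Dimension 5 scores for five PSETs
pre = [1, 1, 2, 1, 2]
post = [2, 2, 2, 2, 3]
print(improvement(pre, post))  # → 0.8
```

In the actual study, a gain like this would be examined per dimension and compared between the baseline and formative semester groups to test whether the simulation cycles made a significant difference.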
To investigate how the simulated classroom was used by the PSETs, we performed a secondary analysis on a subset of the data from our initial pilots of the science performance tasks. We analyzed transcripts of the PSETs’ discussions in the simulated classroom to examine how they engaged the student avatars in two components of scientific argumentation: argument construction and argument critique.
Findings
First, the PSETs in the formative semesters showed statistically significant improvement in their ability to facilitate argumentation-focused discussions from the beginning to the end of the semester across all five measured dimensions (Figure 1; p < 0.001 for each dimension). Most importantly, improvement in the formative semesters was greater than that in the baseline semesters, and this difference was statistically significant for all five dimensions (p < 0.01 for each dimension). We also noted some dimension-level differences in the PSETs’ performance. On average, PSETs scored highest at the beginning of the semester on dimension 1 (attending to student ideas), initially performing just above the developing novice level and finishing closer to the well-prepared novice level. In contrast, starting scores were lowest for dimension 5 (engaging students in argumentation): on average, PSETs began closer to the beginning novice level and grew only to the developing novice level by the end of the semester. However, improvement was observed across all dimensions, suggesting that the simulated classroom served as an effective practice-based space for the PSETs.
Figure 1. Pre and post means from the formative semester scoring across five dimensions of facilitating high-quality argumentation-focused discussions
Second, most of the 43 PSETs who participated in our science task pilots provided opportunities for students to engage in both argument construction and critique during these simulated discussions, with moderately more emphasis on construction (Figure 2). While this retrospective analysis was completed only on a subset of data related to the science tasks, we believe that the findings are likely to generalize across both content areas. In these science discussions, argument construction tended to rely on evidence from the scientific investigation in the performance task (81.4% of the 43 discussions) rather than prompting students to share prior knowledge (32.6%) or personal experiences (14.0%). Argument critique most often manifested in prompts for students to explain why they agreed or disagreed with one another (51.2% of the 43 discussions).
Figure 2. PSETs’ focus on argument construction and critique during the science task pilots
Taken together, these findings illustrate that the simulated classroom can serve as a productive practice-based tool for PSETs to learn how to engage in the targeted teaching practice, as we discuss in more depth elsewhere (Mikeska & Howell, 2020). The PSETs showed significant growth, on average, from the beginning to end of the semester in their ability to facilitate argumentation-focused discussions across all five dimensions. In addition, the PSETs showed variability in the extent to which they engaged the virtual students in argument construction and critique and the teaching moves they used to do so.
Implications
Based on findings from this research, we offer the following recommendations:
- Teachers may find the five dimensions described above helpful as a framework for decomposing argumentation-focused discussions and focusing their practice. Because the PSETs in our study struggled most with encouraging direct peer interactions and engaging students in argumentation (particularly critique), teachers may want to focus their professional learning in these areas.
- Principals and school leaders should consider providing access to practice-based learning opportunities, such as simulated classrooms, for teachers to practice their discussion facilitation skills in mathematics and science.
- Policymakers should support efforts to scale up the use of innovative tools, such as simulated classrooms, to provide these kinds of formal and informal practice-based learning opportunities to a more diverse group of teachers.
- Education researchers should prioritize studies to investigate how to support the transfer of teachers’ practice-based learning within simulated classrooms to classroom practice and should help develop scalable technology-based solutions that can provide targeted feedback to support teacher learning.
- Teacher educators and professional development facilitators can use the performance tasks, scoring rubric, videos, and transcript collection as tools for inquiry, reflection, and analysis and adapt them for varied contexts and purposes within elementary methods courses and professional development in mathematics and science. To access these publicly available project materials, you will need to first register and create a free account at the Qualitative Data Repository and then search for the “GO Discuss” research project. Alternatively, after creating a free account and logging in, you can access these project materials in the GO Discuss repository collection. This collection includes one data project for each of the eight GO Discuss performance tasks, plus a data project for the non-task specific interactor training and another data project for the scoring materials. When you click on an individual data project, you can either download all the files associated with the data project by selecting “Download Data Project” below the data project name or you can download individual files by selecting the arrow next to the individual file name. We recommend reading the “GO Discuss Data Overview” file first; this file is available in each of the ten data projects.
The use of innovative tools, such as simulated classrooms, has been gaining traction in teacher education as a viable solution to providing additional practice opportunities for novices and experienced teachers alike to develop and refine their teaching skills. We are excited about the possibilities such tools open up to create productive practice-based spaces to support teacher learning, to generate video artifacts for reflection, to provide a basis for valuable feedback to teachers, and to broaden the set of experiences teacher educators, district leaders, and professional development facilitators have at their disposal in their formal and informal work with teachers.
Acknowledgements
This material is based upon work supported by the National Science Foundation under Grant Number 1621344. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
To learn more about this multi-year project and hear from project participants, you can watch the following set of short, three-minute videos that were included in NSF’s STEM for All Video Showcase in 2017, 2018, 2019, and 2020.