Transforming science education through research-driven innovation



AERA 2024


You won’t want to miss the sessions presented by BSCS Science Learning at this year’s AERA Annual Meeting in Philadelphia, April 11-14, 2024! Details are listed below, or click here for more information (enter BSCS in the search bar).

Under Pressure? Shifting and Sharing Agency Across Digital and Physical Materials in Collaborative Design

Authors

Madalyn Wilson-Fetrow, University of New Mexico; Sherry Hsi, BSCS Science Learning; Vanessa Svihla, University of New Mexico

Abstract

Because learners most commonly contend with problems that have a single right answer, learning to frame problems can be challenging. We investigated how learners’ agency—shared with digital and physical materials—evolved as they framed problems, gained coding capacity, and encountered new materials like pressure sensors, and how these shifts shaped their design ideas. We collected data during a week-long camp for children (ages 10-15, N=12) focused on networks, design, and flow coding. We conducted interaction analysis of video and audio recordings, supplemented with artifacts and ethnographic interviews. Learners who more clearly framed a design problem evaluated materials and their affordances for fit, whereas those who struggled to frame engaged with materials in a more exploratory fashion.

Date, Time, and Location

Saturday, April 13, 7:45 AM – 9:15 AM, Pennsylvania Convention Center, Floor: Second Floor, Exhibit Hall A

Towards Culturally Sustaining Approaches to Science Classroom Assessment

Authors

Erin Furtak, University of Colorado – Boulder; Clarissa Deverel-Rico, BSCS Science Learning; Douglas Adam Watkins, Denver Public Schools

Abstract

Purpose

The new vision for science education reform is unequivocal in its call for more equitable science learning experiences for all learners, particularly those historically held at the margins in school (NRC, 2012). Classrooms should be designed so learners engage in science and engineering practices and apply crosscutting concepts to learn disciplinary core ideas, all to figure out real-world phenomena that relate to their ideas and experiences (NASEM, 2018; 2022). This vision builds on responsive teaching, in which students’ ideas are seen as productive toward developing disciplinary understanding and informing next instructional steps (Authors, 2016).

Since the release of the NGSS (NRC, 2013), high quality curricular materials have been developed that align with such responsive practices. Assessment, however, as part of a system that includes curriculum and instruction, has much room to improve toward supporting rather than hindering these advances (Shepard et al., 2018).

In this paper, we describe our approach to building assessments based upon students’ interests and identities, including the creation of a tool to inform science teacher adaptation of existing tasks to better respond to and sustain students’ lives outside of the classroom.

Perspectives/Theoretical Framework

Assessment is situated not just in science classrooms but within the larger ecologies intersecting with students’ lives. We draw on culturally responsive (Ladson-Billings, 1995) and culturally sustaining approaches (Alim & Paris, 2017; Paris, 2012), which emphasize that responding to and sustaining the knowledge and practices of youth are essential goals for educational reforms. These perspectives inform how we seek to reform science classroom assessment by looking to how “the sociocultural identities of Black, Indigenous, and People of Color (BIPOC) students [can be] deliberately integrated… in every planning/development phase of the assessment” (Randall et al., 2021, p. 596).

However, current assessment practice can privilege canonical concepts and explanations at the expense of learners’ ideas and experiences (Bang et al., 2013; Penuel & Shepard, 2016; Randall et al., 2021). This is in part because many existing assessments are largely mismatched with the vision of the Framework and, as a result, limit what is assessed (e.g., Au, 2007; Braaten et al., 2017).

Mode of Inquiry and Materials

We describe the development of a heuristic to guide assessment design, use, and implementation that builds on the work of Randall and colleagues (Randall, 2021; Randall et al., 2021). The heuristic (Table 1) is intended to support educators, developers, and science education leaders in integrating more culturally sustaining approaches to assessment. It invites those in these roles to reflect on their positionality, existing assessment practices, the purposes of assessment, and how students can play a more central role in determining how they experience assessment.

Warrants for Arguments and Significance

In the full paper, we present arguments for each category in the heuristic as well as ways that the heuristic informed the design of professional learning experiences in our ongoing Research-Practice Partnership with a large urban school district. The paper is of critical significance as it interrogates assumptions about the status quo in science classroom assessment.

Date, Time, and Location

Saturday, April 13, 9:35 AM – 11:05 AM, Pennsylvania Convention Center, Floor: Level 100, Room 112A

Measuring the Impact of Multidimensional Science Instructional Materials

Authors

Cari Herrmann-Abell, BSCS Science Learning; Jeffrey Snowden, BSCS Science Learning; Molly Stuhlsatz, BSCS Science Learning; Brian Donovan, BSCS Science Learning; Cynthia Passmore, University of California – Davis; Patricia Olson, BSCS Science Learning; Christopher Wilson, BSCS Science Learning

Abstract

NGSS-aligned assessments that measure multidimensional science learning tend to be time and resource intensive. This presents a challenge when developing instruments to measure the impact of large-scale interventions, and it raises trade-offs among different assessment formats. Rasch and two-level random intercept models were employed to investigate whether multidimensional tasks and multiple-choice (MC) items could detect an effect of instructional materials focused on developing students’ model-based reasoning. A significant effect was detected with the multidimensional tasks. While the MC items did not show an effect, they were helpful in reducing testing burden and estimating more accurate person measures on the assessment. Our findings suggest utilizing both multidimensional tasks and MC items to measure impacts feasibly and effectively.

Date, Time, and Location

Saturday, April 13, 1:15 PM – 2:45 PM, Pennsylvania Convention Center, Floor: Second Floor, Exhibit Hall B