Many science administrators know this tension well. Your teachers are doing the work: they've built phenomena-driven units, they're asking students to think across all three dimensions, and classroom engagement is up. Then state science test results come back, and the proficiency rates don't reflect what you've been seeing all year.
This isn't a curriculum failure. It's a transfer problem.
The "We Didn't Learn About Sea Otters!" Problem
NGSS practice tests and state assessments — like the CA CAST, NY, NJ, and CT science exams — are designed around novel phenomena: observable events students have never encountered in class. That's intentional. The test isn't asking what students memorized; it's asking whether they can use DCIs, SEPs, and CCCs to make sense of something new.
When a student encounters an unfamiliar anchoring phenomenon on test day and says “but we didn’t learn about sea otters!” — that’s the gap. Not a knowledge gap. A transfer gap. The student knows the content. What they lack is practice applying it outside the context where they learned it.
Most districts are closer than they think to solving this. The science phenomena examples already living in your curriculum are the raw material. The missing piece is structured practice — transferring that thinking to something new.
What "Sensemaking" Actually Looks Like on a State Science Test
When researchers and assessment designers talk about three-dimensional sensemaking, they mean something specific. A student looks at a new phenomenon or data set and uses what they know — not to recall a fact, but to construct an explanation.
On an NGSS-aligned assessment, a high-scoring response doesn’t simply identify a pattern. It connects that pattern to a mechanism (DCI), describes how the student figured it out (SEP), and often names a relationship that applies across science domains (CCC). Students who’ve only practiced sensemaking with familiar science phenomena examples will stall on the first step when the context changes.
The good news: this skill is trainable, and the most effective training tool is one most districts already have — the classroom discussion.
How Structured Discussion on Novel Phenomena Builds Test Readiness
Facilitated class discussions around novel phenomena do something that worksheets and unit assessments can't: they make the thinking visible in real time.
When a teacher projects an unfamiliar phenomenon and asks students to make sense of it together, several things happen. Students hear how their peers approach an unknown. They practice articulating the "why" before they know the answer. And they learn to reach for DCIs and CCCs as tools, not just as vocabulary terms.
This is exactly the cognitive move the NGSS practice test demands: the discussion is the rehearsal. A simple protocol — observe, wonder, connect — applied consistently to novel phenomena builds the transfer habit over time. No new curriculum is needed. What it takes is intentional, regular exposure to the unfamiliar.
Building a Science Phenomena List That Actually Prepares Students for State Tests
Not all science phenomena examples are equally useful for test prep. A strong anchoring phenomenon can drive a full unit, but for transfer practice, your phenomena should meet a few specific criteria:
- Short and observable. Use a photo, a graph, or a 30-second video clip. Students need to encounter it cold, just as they will on test day.
- Cross-disciplinary. Choose a phenomenon that could plausibly be explained using content from multiple DCIs. This reinforces the CCC thinking that state assessments weight heavily.
- Novel to your class. Genuine unfamiliarity makes for better practice. A science phenomena list built across grade teams and swapped between classrooms maximizes this effect.
When state test data tells a different story than classroom data, it's frustrating — but it can also be revealing. Coverage isn't always the issue. Effort isn't always the issue. Transfer is, and transfer is a problem you can actually solve.
Transfer strengthens when students regularly engage in the kind of sensemaking that 3D science learning requires. With the right data, educators can respond with clarity and purpose — and the science phenomena examples already in your curriculum are the best place to start.

