KnowAtom's Blog - Insights to STEM Curriculum & NGSS

What Student Science Test Data Represents

Written by Francis Vigeant | Mar 29, 2016

 

Data can tell us a lot about how students are performing and in what areas they still need work. To uncover this information, however, we need to be able to assess the data based on what it actually represents. In this post, we’ll cover some of the clues data can give us and how you can use those clues to improve your students’ scores.


Now let’s take a look at what the data in standardized testing really represent today.

Current data give us clues about four different areas: test design, standards, student factors and instructional factors.

Unfortunately, as a teacher you don’t have much control over the first two. They are shaped by political processes that take time to build and don’t respond readily to teacher input.

The third area, student factors, offers a little more opportunity. You may be able to bring community resources or wraparound school services to bear on these student factors.

The area where you will have the most control and impact, however, is instructional factors. Let’s take a look at each sphere individually.

Test Design

Test design comes in multiple forms: multiple-choice, open response and hands-on, to name a few of the most common examples. Multiple-choice tends to align best with drill-and-kill instruction, whereas open response caters more toward inquiry. Hands-on assessments do exist (the 4th grade New England Common Assessment Program (NECAP) is a good example), and students in such environments often work collaboratively for at least part of the assessment.

Hands-on assessment will expose issues with student comfort with the 4Cs: collaboration, for example, including their ability to use materials appropriately and whether they understand the expectations of an inquiry environment. Open response will expose issues for students who haven’t mastered evidence-based writing, what it means to support an argument with data, or how to think creatively, analytically or evaluatively.

In the case of multiple-choice testing, if a student has a limitation with time or motor skills, the test may expose some issues there.

Standards

Other clues revealed in the data come from a standards perspective, where the factors at play are content, inquiry and systems. The traditional model of instruction, and the state standards prior to the Next Generation Science Standards, really focus on rote content.

Some states, however, do focus on inquiry. Whatever the standards focus on, the standardized tests have to assess it, which is why when you have a change of standards you must have a change of assessment as well. The two are supposed to mirror each other.

Let’s be careful not to confuse standards with curriculum.

If we compare standards to a health department regulation stating that you must clean your coffee pot twice a year, that doesn’t mean you should only clean your coffee pot twice a year. A standard is something students must be able to meet, but that doesn’t mean you should teach only so that they can meet it. Teaching that way will produce at best proficient students, not advanced ones.

Standards will test for content that you should be teaching in an authentic way, which means you’re not isolating standards, covering them and then moving on, but rather combining and scaffolding those standards in diverse and relevant contexts. In this new next generation model, inquiry is now a bigger part of the equation. If you have inquiry standards in your state, you've got a leg up, but even if you don’t, it’s possible to get there.

The important thing to keep in mind is that standardized tests under NGSS are not going to focus solely on content, because that is only one dimension of the performance expectations. The tests are going to focus on students demonstrating content knowledge while using science and engineering practices and connecting areas of content via the crosscutting concepts.

Inquiry models, in the case of science, go from a question to a conclusion, with intermediate steps of forming a hypothesis, designing and planning an experiment, carrying it out and gathering data. The same goes for engineering: students go from a problem to looking at the available resources, developing a prototype, testing it, reflecting on the results and using the data to make a recommendation supported by evidence-based claims.

This is a movement in the new standards that is going to be reflected in the new testing data, which is most likely not reflected in your current data.

Systems really come into play with higher order thinking and with the interplay between content, practices and processes. Practices are skills specific to the discipline, while processes are how scientists and engineers go about answering questions and solving problems. Just as a writer needs to know more than vocabulary, word choice and sentence structure, students need to understand how to carry out science and engineering practices beyond conceptual knowledge of the discipline.

Student Factors

Again, here is where you have a little more control and opportunity when it comes to impacting performance readiness. If students aren’t getting food, if they aren’t sleeping or if something is going on health-wise, that affects performance readiness and students’ ability to engage in testing. Addressing some of these factors can enhance student performance readiness, because students are no longer hampered by unmet basic needs.

The next student factor is decision fatigue, a scientifically documented phenomenon in humans and even in animals. We have the capacity to make only so many decisions in a given day or time period. At some point, having to analyze and decide over and over again in a short period fatigues us to the point that we make poor decisions despite our overall ability to make better ones.

Thankfully, President Obama would like the reduction of standardized testing to be part of his legacy and has stated that standardized testing shouldn't take more than 2 percent of learning time. This will be beneficial to students, because districts that engage in month-long models of continuous testing actually promote decision fatigue in students, which has the potential to bring scores down overall. Then there are the social-emotional factors.

For instance, students who have parents going through a divorce, who have moved, who have experienced a loss in the family or even in the community or have experienced other traumatic events do not test as well. These social-emotional elements can negatively impact data, because students simply aren’t in a place to perform their best. Again, there is only so much you can do to mitigate this, which is why you should place the bulk of your attention on instructional factors.

Instructional Factors

KnowAtom uses a model that integrates curriculum aligned with NGSS, professional development, STEM learning practices and the right materials to support all of the above.

Though we have a bit more control over student factors than over test design and standards, we still don’t have much. However, the fourth area – instructional factors – is a different story. We have almost complete control. Instructional factors fall into three different areas: quality of curriculum, quality of instruction and quality of instructional environment.

Quality curriculum isn't just, “Do these four standards in the first three months of school, these next four standards in the next three months and the rest of them in the last trimester!” Nor is it “Introduce these 10 vocabulary words and flip through these PowerPoint slides and just do what you can with it!” That is not curriculum, and it is also not effective STEM instruction.

Quality curriculum integrates science, technology, engineering and math throughout grade levels and from September through June, fitting the model of effective STEM instruction we saw in our earlier definition. It starts early, as students enter the public education system at pre-K and up, and is intentionally implemented across the school year. It nurtures progression throughout the year and from grade to grade, and engages student interest with an appropriate scope and sequence that creates momentum.

Quality isn’t just in the accuracy of the curriculum, though, but also in the support for teachers: they need materials that help students engage in these experiences, mapped to the curriculum, to encourage understanding by design.

Making authentic, supportive materials available to teachers enhances the quality of instruction and enables an instructional environment from which students benefit, learning to engage as scientists and engineers via meaningful skills they can use in later life. Making those materials available to teachers across the board will help remedy systemic gaps that appear in the data. We need to support teachers with background, clear guidance about what goes into a lesson, specific materials and objectives, practice and discussion.

We’re not talking about a script, but rather a solid backbone of STEM education.

Quality curriculum, in this way, supports quality instruction: the next generation model of instruction. We need explicit instruction of the standards and content in our classrooms, done in a way that engages students in developing and using those skills and requires them to engage in higher order thinking. When we have it, the data will reflect it in higher test scores, earned by students using those higher order thinking skills and working as scientists and engineers.

Then there’s the quality instructional environment, where science and engineering are infused with the non-fiction reading, Socratic dialogue and other applications of Common Core math and ELA standards. Students must apply their skills in the context of the standards, then use those science and engineering practice skills over and over again, all year long, in different contexts.

Science and engineering practices are a tool students use to extend their understanding of the disciplinary core ideas, and that's where the battle is won or lost.

Schools in districts that really focus on pulling science and engineering practices, disciplinary core ideas and crosscutting concepts together are the ones that perform at the top. Schools that try to focus on bringing up their data without recognizing or integrating the three dimensions aren’t developing students’ higher order thinking skills, and their data will often reflect that, remaining lower.

Think about the factors you can control and how state testing data gives you clues about which factors are out of alignment. Unfortunately, because all of these factors are intertwined, when one is out of alignment it has a cascading effect on all the others.