Two prominent faces of STEM education
Among the various ways to frame STEM education, two in particular stand out. One is more academically oriented, the other more geared to workforce interests.
Strong forces support both views, but they also conflict in many ways. How to harmonize the two faces of STEM education is an urgent question, yet one that gets little attention.
The academic side of the STEM coin is integrative, drawing connections among diverse fields of knowledge to teach students how the disciplines the acronym subsumes relate to one another. Students learn in different modes: qualitative and quantitative, synthesizing and analytical, verbal and numerical. And they develop varied skills: computation, communication and collaboration, problem-solving, persuasion, arguing from data, and others.
The workforce angle values STEM education more as an economic asset. It works as a form of training that prepares students to enter and thrive in a workforce shaped and driven by technological innovation and the technical skills of its members. Much the same kind of learning takes place, but the emphasis falls on skills like coding, operating advanced industrial technologies, CAD, CNC programming, and other areas tied to current workforce needs.
Looking through the filter of assessment
As false as any dichotomy, this distinction will nevertheless organize this post and the next. The point of departure for comparing academic and workforce STEM will be assessment: how to measure students’ learning and the benefits each approach delivers to broader, shared interests.
“Performance expectations”
As we have observed in the past, the Next Generation Science Standards (NGSS) point STEM education practices in the direction of performance-based assessment. The standards ask students to demonstrate their learning by doing something with their knowledge, not just reproducing or representing it on a test.
This “doing something” takes the form of a “performance expectation,” which the standards define by grade level and content area. For example, 2nd-graders studying the properties of matter should be able to “analyze data obtained from testing different materials to determine which materials have the properties that are best suited for an intended purpose.” NSTA offers a compact, comprehensive primer on how to understand performance expectations.
Important because …
As we have also observed, changes in assessment practices drive upstream changes in teaching methods and curricular materials. NGSS principles have great potential to reform STEM education for this reason. A linchpin of this change, however, lies in the availability of usable assessment tools up to the task of measuring how well students are meeting the performance expectations embedded in NGSS. As with the development of curriculum materials, these assessment tools have lagged behind the pace of state-level adoption of NGSS learning frameworks.
A fun test in STEM(!)
Green shoots are appearing, though, in the assessment development landscape. In 2014, the National Assessment of Educational Progress (NAEP) Technology and Engineering Literacy (TEL) assessment deployed innovative, interactive “scenario-based” tasks to measure 8th-graders’ performance on engineering and technology challenges. Besides showing how to measure what students can do, not just what they know, the test proved engaging, a learning experience in and of itself. NAEP test organizers reported, in fact, that students called the test “fun,” almost certainly a first in the history of middle school science test-taking. (A new round of NAEP TEL assessment took place earlier this year, with results expected several months from now.)
Following in the NAEP TEL vein, two other projects are taking large-scale stabs at developing scenario-based assessment tools for use in NGSS-shaped curricula. The American Association for the Advancement of Science is leading one, focused on chemistry and physics. And the other, Next Generation Science Assessment, is a multi-university collaboration working on life sciences and physics. Each is in essence a demonstration project, working on models on which to base NGSS-style assessments across all fields.
Revealing results
Scenario-based assessment, designed to measure performance-based learning, offers tantalizing prospects for changing the STEM education game. The reason is that it promises to reveal and validate STEM successes in different ways and among different kinds of students.
For example, girls outscored boys by a small but statistically significant margin on the 2014 NAEP TEL assessment. More broadly, recent NAEP results in math show that minorities continue to close the achievement gap with their white peers. Over the last 20 years, for example, eighth-grade white students’ average math scores have increased by 4.3 percent, African-Americans’ by 8.3 percent, and Hispanics’ by 7.2 percent. Explanations are multi-faceted, but the period does coincide with the spread of mathematics instruction oriented towards NGSS-style reasoning and inquiry and away from rote memorization and calculation.
These test results lend support to the argument that inquiry- and scenario-based learning, along with its accompanying assessment protocols, will engage and reward a more diverse set of students in STEM fields.
Making it real
Scenarios can ground otherwise alienating technical content in relevant, socially meaningful contexts, an approach thought to appeal to students from currently under-represented groups. Furthermore, scenario-based learning lends itself to collaboration and open-ended inquiry, other features of classroom experience that can enlarge and diversify the cohort of STEM-interested students throughout the entire K-12 age range.
As the saying goes, we treasure what we can measure. If this is so, cracking the NGSS assessment nut could end up showing us how to treasure within STEM education a greater diversity of student abilities, and thus of students themselves. New ways of assessing students could open doors for new kinds of students to find success in STEM fields and then follow this success into STEM work opportunities. This prospect is especially exciting for technology- and engineering-related fields, where proportions of women and African-Americans have remained stubbornly low and static for almost 20 years.
Greater workforce diversity?
The carry-over impacts on the workforce – greater diversity of people, more varied skill sets and perspectives brought to bear on STEM-related enterprises – would benefit both new entrants to the workforce and the users and consumers of their professional endeavors.
It’s too early to make a final call about the impact of these changes in STEM education on the demographic makeup of STEM fields or areas of employment. But the prospects are exciting and promising.
Next time, we’ll look at how to assess STEM education as a workforce development initiative and consider whether and how we can enjoy the benefits of both approaches.
Eric Iversen is VP for Learning and Communications at Start Engineering. He has written and spoken widely on engineering education in the K-12 arena. You can write to him about this topic, especially when he gets stuff wrong, at eiversen@start-engineering.com.
You can also follow along on Twitter @StartEnginNow.
Brand new for 2018! Our new Cybersecurity Career Guide shows middle and high schoolers what cybersecurity is all about and how they can find the career in the field that’s right for them. A great pair with the recently updated version of our Start Engineering Career Guide.
We’ve also got appealing, fun engineering posters for K-2 and 3-5.
Our books cover the entire PreK-12 range. Get the one that’s right for you at our online shop.
Photos: LEGO minifigures, courtesy of Maia Weinstock