There are just two days until we are back in session, but three until I see students … (kids don’t have Digital Shop on the first day of school due to new-year start-up logistics) … so, we’re continuing work this morning / day / weekend on the all-new Design Experience Zero [DE.0] for grades 6-7-8. (Grade 5s will get a variant of last year’s DE.0.)
Just spent some time reviewing the NAEP 2014 Technology & Engineering Literacy (TEL) assessment, but not just the results. I was also interested in learning about the structure and design of the testing instrument. What did it test? How did it test it? I wondered: Are we doing the right things in Digital Shop? And are we doing them right?
I wasn’t expecting validation for our pedagogical strategy – but got it nonetheless.
Consider this brief video, literally published a month ago:
Among their findings:
- Overall, girls performed better than boys.
- Students who did activities outside of school focused on design and systems, like a robotics club, or building or fixing things on their own, scored higher.
- In-school learning focused on technology and society was associated with higher scores.
We definitely see #1 in my program, but we’re not sure why. It would be nice to think we’ve somehow “cracked the code” regarding girls’ interest in STEAM, but there’s no way we can make that kind of statement. (Besides, from what I’ve read and been told, it’s not unusual for girls’ interest in STEAM to peak in middle school and then drop off sharply in high school. It will be interesting to see how our graduates progress through their high school careers.)
#2 is a great testament to our decision to focus on design and systems. Interesting how experiences students had outside of school resulted in better academic performance on the assessment. Well, guess what. Here in Northfield, we’re giving kids that opportunity both inside school (not just in my program, by the way) AND outside of school. So, we’ve got that going for us. Which is nice…
#3 is proof to me that human-centered design, and the dreaming up of society-benefiting devices like the prototype these gentlemen are showing at left, stimulates kids’ brains in ways far more powerful than a typical “recipe”-based STEM learning experience. By tapping into (and requiring application of) their creativity and imagination, the learning is just so much more meaningful. Don’t believe me? Talk to some of my students.
Structure of the Instrument
This was not a multiple-choice test. Students were “asked to perform a variety of problem-solving tasks based on interactive scenarios reflecting realistic solutions.” Check out the sample tasks yourself. I wish they were collaborative, but perhaps that’s asking too much, given the difficulty of attributing results to a single individual. You can learn more about the testing methodology here.
More About Kids’ School Experience
Admittedly, this has been a weak link in our program; our prototypes have been so low-resolution that they couldn’t be expected to function as intended. This is something we want to focus on this year.
We DEFINITELY emphasize constraints when designing solutions.
Why focus on human-centered design? Why not design things that change the way people live, and make the world a better place? Things that matter to the community? Why, indeed.
Conclusions, Reactions and Interpretations (Mine)
I designed (and am designing) experiences for my students that require them to use systems thinking; ponder societal impacts; and consider real-world constraints. Our program is hands-on, developing (especially this year) fabrication and tool skills. We can (and hope, this year, to) create more sophisticated, functional prototypes using tools like Arduino microcontrollers. (That pesky 40-minute period is still a killer, though, even considering we have five sequential days to work together.) All of this leads me to conclude that while we’re not perfect, we are, as I’d hoped, at least directionally correct.
More to come…