We were happy to see two new ExhibitFiles case studies posted by participants in the PI Summit, held July 25-26 in Washington, D.C. Liza Pryor of the Science Museum of Minnesota wrote about Science Buzz, and Elizabeth Fleming of the Museum of Life and Science in Durham, North Carolina, wrote about Flip It, Fold It, Figure It Out. Like a number of other exhibitions described in earlier case studies, both were funded by the National Science Foundation (NSF), a U.S. federal agency.
As anyone in the informal science education field with NSF funding now knows, documenting the impact of the work is an increasingly high priority. NSF issued a Framework for Assessing the Impact of Informal Science Education Projects (PDF) earlier this year to guide grantees, and future proposals will need to address impacts laid out in this publication.
But defining intended impacts can be a challenge in the rich and multifaceted world of informal, lifelong learning, and assessing whether we’ve achieved those impacts can be even tougher. In a later reflection added as a comment on her case study, Liza notes one challenge she faces in assessing the impact of Science Buzz: “We’re working on our summative evaluation,” she says, “but we don’t have anything to compare our data TO. We’ve got the data from the Pew internet study, but it’s not too helpful. I’m particularly interested in studies of online communities. What’s a decent participation rate? Any way, without resorting to discourse analysis, to figure out what people are learning?”
Maybe this online community can give Liza some help. In fact, we’ll be posting soon about how we’re thinking about this in relation to ExhibitFiles itself.