A Pedagogical Richter Scale for Earth-Shattering Experiences

Transformation Rubric for Engaged Learning: A Tool and Method for Measuring Life-Changing Experiences

This article, authored by Emily Springfield and Anne Gwozdek of the University of Michigan School of Dentistry and Andrew P. Smiler of Evaluation and Education Services, LLC, appears in the latest edition of IJeP.

In brief, they argue that although higher ed is proficient at assessing competencies, it lacks the tools to assess “affective aspects of learning, such as changes in perspective and identity.” Their rubric is both a tool and a method for measuring “a program’s impact beyond competency attainment.”

The authors begin by describing the online E-Learning dental hygiene program that motivated the rubric: the program needed a tool to measure “personally transformative” learning, something its traditional competency assessments could not capture (64).

The next section, “Need for a Rubric,” is an inspiring description of the program, grounded in students’ powerful testimony that “something special” is happening in it. The curriculum asks students to compose self-reflections frequently, and the final project is a meta-reflection on all the reflections written throughout the program. Based on the students’ testimonies, that meta-reflective work seems to be where the magic happens:

Students say of the program “it changed my life” or “I see the world in an entirely new way now.” Even mature students—those in the E-Learning Program are coming back to college after an average of 7 years in professional practice—with personalities not generally given to exaggeration, report that “This is the best thing I’ve ever done,” and “I didn’t really understand at first but after the last round of reflections, I really started to get it why we are doing all these extra things.” (64)

This article is a powerful resource for gathering data on high-impact learning and teaching practices. The most useful and interesting content is the researchers’ description of their coding process for developing the rubric. Although developing and applying the rubric is labor intensive, its value is immediately apparent: the researchers explain their reasoning in detail and argue convincingly both for their assessment themes (Confidence, Pride, Skills, Perspective, and Identity) and for the types of deliverables that might best be assessed: eportfolios, self-reflective essays, exit interviews, open-ended survey questions, online threaded discussion questions, etc.

Here’s an example of the questions used to gather the data, from Table 1, “Focus Group Questions Used in the E-Learning Program.” The focus group data used to develop the coding rubric came from the following questions:
•    Did you have any a-ha moments?

•    What do you see as the role of reflection in your profession moving ahead?

•    Do you notice differences between yourself and the people you work with vis-à-vis reflection?

•    Can you identify something you do differently as a result of being reflective?

… (68)

The article includes much more detail on the steps and process of implementation.

We should consider using this tool as we undertake this curricular change in FYC. We can use it to measure where students are now in the current system and how eportfolio practice affects students’ perceptions of their learning. Implementation will take more than simply copying the survey questions, but I think designing a plan to gather data over the next three years would be a wonderfully worthwhile mechanism for assessing the value of the eportfolio in FYC.


One thought on “A Pedagogical Richter Scale for Earth-Shattering Experiences”

  1. They do pose an interesting rubric for something we generally rely on instinct to guide us through, and I do trust my instinct, but I understand their point that having a way to measure the effectiveness of reflective writing “will help ensure the continuance of these programs.” I also like their categories, though I don’t see us having the personnel or time to do all the code-norming and assessing they describe in the article. Perhaps a scaled-down version, as a preliminary exercise, could help us determine the effectiveness of the new approach and convince others of the same.

