Wednesday, July 21, 2021

Skills-based grading, simplified (an update)

Skills-based Grading and Skills Portfolio Update

I wanted to give an update on my skills-based grading adventure. After a crazy year of mostly in-person but definitely not normal learning, I came away with some good takeaways and made some important adjustments to my skills-based grading and skills portfolio assessment systems. Skills-based grading does not have to be done via portfolios, but the two go nicely together. For a better understanding of my motivations and methods, including skills-based grading and digital portfolios for assessment, see some of my previous blog posts themed around skills. Maybe scroll down, start with the earlier entries, and work up if you really want to kill some time.

I have found the skills-based grading system to be a meaningful and robust alternative to traditional and standards-based grading systems. Skills-based grading takes more work to establish in the beginning, but the payoff in the end, a full-course look at growth in the skills that actually matter rather than a measurement of temporary content retention, makes the front-end loading well worth it. A big part of the heavy lifting in the beginning is establishing portfolios as the platform for assessment. Without portfolios you could hit the ground running sooner with skills-based grading, but the payoff in the end would not be as great.

Simplified

Even with my previous list of 15 skills, organized into five categories and much shorter than most lists of standards in standards-based grading, I found things complicated enough that some students got lost in the ocean of tasks and details and sometimes lost the big picture. It also made regular updates and assessment of portfolios more difficult; they were just really long to write and read. Some of it felt like hoop-jumping.

After some discussions with students and colleagues, and some evaluation of redundancies, we pared the list down to five basic skills, which were already the names of the categories. Here is the new list.
  • Investigation -- I can research, design and execute an experiment to test a question using the results to drive further inquiry.
  • Solution design -- I can model nature with detailed diagrams, mathematics and simulations.
  • Collaboration -- I can contribute positively to cooperative learning.
  • Metacognition -- I can use feedback and personal reflection to evaluate my progress and plan for further skills growth.
  • Communication -- I can convey my understanding and publish my results.
Having 15 different rubrics was also overwhelming, even if they were fairly simple checklists. Narrowing and focusing the skills created an opportunity to develop simple but powerful metrics for demonstrating mastery, in the form of 4-item checklists with some over-achieving options for each. This also makes for a quick and easy translation from checked items to a grade: each checked item moves your grade for that skill up a full letter. Four checked items = A, three checked items = B, and so on. Earn a fifth check by nailing an exemplary option and get that A+. I'd love to not have to do the grade thing, but it's a game we all still have to play for now.
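The checklist-to-letter translation above is simple enough to sketch in a few lines. This is just an illustration of the mapping described in the post; the function name and the exact letter scale below F-through-A+ are my assumptions.

```python
def checklist_to_grade(checked: int) -> str:
    """Map the number of checked rubric items (0-5) to a letter grade.

    Each checked item moves the grade up a full letter; the fifth,
    exemplary check earns the A+. (Illustrative sketch, not the
    author's actual gradebook code.)
    """
    scale = {5: "A+", 4: "A", 3: "B", 2: "C", 1: "D", 0: "F"}
    # Clamp out-of-range counts so stray inputs don't raise KeyError.
    return scale[min(max(checked, 0), 5)]
```

So a student with four checks on Solution Design would land at `checklist_to_grade(4)`, an A, and nailing an exemplary option bumps them to A+.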



There is room in this system for working more quantitative measurements into the checklist items. For example, the third checklist item of Solution Design could require an 85% or higher average on homework and/or traditional assessment scores. This kind of adjustment lets more traditional scoring fit into the skills system, although I do not really use it for most classes. It sort of defeats the purpose and just turns the skills back into grade categories.
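For what it's worth, a quantitative checklist item like the one above reduces to a one-line threshold check. Only the 85% cutoff comes from the example in the post; the function name and the idea of passing in a list of fractional scores are my assumptions.

```python
def meets_quantitative_item(scores: list[float], cutoff: float = 0.85) -> bool:
    """Return True if the average of the scores clears the cutoff.

    Illustrative sketch of backing one checklist item with a number,
    e.g. an 85% homework/assessment average; an empty score list
    does not earn the check.
    """
    return bool(scores) and sum(scores) / len(scores) >= cutoff
```

A student averaging 87% on homework would earn that check; a 75% average would not.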

There is also room for further adding details and working in content mastery requirements, but that would live in the things students do and the feedback you give them, which would all be considered as you evaluate their work for these skills.

Portfolio reflections moved to Metacognition journals

As far as portfolios go, I made each skill its own tab in their Google Sites (it's no longer "New" Google Sites). Students fill their portfolio pages with artifacts of their work, including feedback, and do a little writing to give the artifacts context. But instead of doing the deep reflections in those tabs, I have them keep four journals, one for each of the other skills -- Investigation, Solution Design, Collaboration, and Communication -- and these journals live where it makes more sense for them to live: in the Metacognition tab. In fact, they are the only things that live on the Metacognition tab of their portfolios.

Now, when students are asked to update a portfolio section, it takes them much less time and the work is much more focused. Multiple full portfolio updates in a quarter are reasonable. When they update a section, I ask them to also jump over to their Metacognition page and make a journal update for that skill, following my guidelines for reflective writing for growth.

Simple means more opportunity

Simpler rubrics, and fewer of them, mean students can dive deeper into understanding what the rubrics really mean without getting lost in details. They are easier for students to use as self-assessment tools and for assessing each other, and they are easier to use as feedback tools.

As mentioned, the shorter list of skills and the simplified portfolios mean the feedback cycle moves more quickly, and there is more opportunity to monitor growth closely with regular feedback, conversations, and development plans.

There are opportunities to work more detail into assignments, tasks, and assessments. For example, I have a fairly complex rubric for evaluating APA reports, which students need to score high on to earn that exemplary check. But zooming out and re-focusing assessment and reflection on the big picture and the important takeaways from the class, the growth of lifelong technical-learner skills, lets us concentrate on what truly matters.

If you have any questions feel free to ask. Although I might just redirect you to my chemistry colleagues. I'm the idea guy behind this, but they are arguably better at actually implementing it.

Scott @BrunnerPhysics