(this post was started Nov. 28, 2014…. it has been tough to shape, focus, verbalize, & hit Publish)
The Part 1 post can be found HERE.
Also, this post is connected to two previous posts, a rich and thoughtful discussion with Jim Tiffin, a follow-up email from Jim after said discussion, and a retweet I made of a lively Twitter convo. The two posts are HERE & HERE
Assessment & Design Thinking. Grading & Design Thinking. Doing & Design Thinking. Simply Design Thinking. I am curious which of these areas you are most interested in learning about, receiving artifacts for, and seeing demonstrated. For me, I’m always about the doing. I tend to learn so much from the doing. Assessment & Design Thinking is an area I have admitted in the above posts as something I want to document in a tangible form, to utilize for myself but, more importantly, for others to use as they “infuse” design thinking into their instruction and their own learning.
I honestly admit to not being as well versed or skilled in the physical construction of paper-trail assessments. The interchangeable lingo and nuances confuse the hell out of me. The recent verbiage debate rattling in my head is around the terms Learning Progressions, Proficiency Scales, and Rubrics. As a classroom teacher who recorded numbers way back when, the word assessment simply equaled plain ole quiz/test. Performance, formative, summative, standards-based, and other assessment terminologies were not utilized or batted around in my previous schools of employment. It was pretty much straight-up traditional grading. All about the Letter Grade!
My experience and mindset of assigning numbers to everything that moved immediately stopped when I arrived at Mount Vernon (if it were not for MV, I am afraid I would still be playing the numbers game). What took its place was a natural assessment of demonstrations of learning by my students that marked their growth over time (not sure if that makes sense, but it does to me). My students take on challenges, tasks, choices, and opportunities to show their growth of learning by virtue of the fact that I’m not required to assign a # grade to their learning, their doing, their demonstrations, their growing. I like to think of student learning as the Flare (not the focus). Fast forward to today: this is still the case, yet I feel the pressure to keep up with the SBA, Learning Progressions, Rubrics, Proficiency Scales, or even the “for learning to be meaningful it must be measured” Joneses.
So, I guess my question…
Where does this pressure to develop tangible DEEPdt-centered assessments come from? Is it ego driven, i.e., if I don’t develop something that fits in a “box,” someone else will and it will be too late? Or is it driven by my need to legitimize the design thinking methodology in the K12 EDU arena? If I were really honest, the pendulum swing of assigning data points to everything in the learning world of EDU has started to make me question my leanings, because these data points are getting heavy, muddled, & in the way.
On Friday of my Thanksgiving break, there was a Twitter convo between two notable and well-recognized educators in their respective fields. There were many tweets I wanted to RT and even comment on, but it was not my conversation to interject myself into. Yet, for some reason, I had to RT the tweet below.
— Chris Lehmann (@chrislehmann) November 21, 2014
I think there are many parts to this tweet that could be unpacked and interpreted:
1. How easily tweeters discount or “reduce” others’ efforts, almost to the point of using visceral dialogue in their arguments.
2. Positions in EDU get discredited/put down based on another person’s title or the space they occupy.
3. Education is Personal…
4. Daily work/grind/Every-day Schooling vs. Paratrooping/Cloud Jumping/Traveling-from-place-to-place occupations are stark and full of contrast.
Yet what struck me most about the tweet was the phrase “believe in more structure…”
And this is where Jim Tiffin’s words begin to chirp (loudly) in my ear. In my attempt to create some physical, tangible assessment piece of paper to go alongside DEEPdt, Jim called me out for putting design thinking into a box. And, taking the tweet from Chris Lehmann above, Jim sent me an email to further his thoughts (he gave me the all clear to share). Here are just a few snippets from his email. (The words below from Jim are several blog posts in one that I could never have expressed so succinctly, thus I am very grateful Jim did so for me. 🙂)
“But that’s not why I’m writing to you. It was actually about something that Chris referenced in his blog post – SLA’s common rubrics. (the grid doesn’t show up on my screen)
This is a great example of an amoeba-like structure, as opposed to a rigid box-like structure. It is something that can be unpacked, and since it is focused on a wide audience, it can be customized as needed… note that the page says: The descriptions in the empty boxes are filled in according to the subject and project nature.
I want to use that rubric to expand more on our last conversation of the day yesterday… the one about taking learning progressions (or whatever the term is; I’m still not 100% on the terminology) and creating them for design thinking K-12. (I was thinking about it last night and started to craft this message, but my phone was getting low on battery in the parking lot.)
Perhaps a better metaphor I should have used was “cookie-cutter” instead of “box”.
My nagging concern is that your beautifully-written and well-intentioned work will pollute the larger purpose of DT. People use the DT process to allow them to find solutions, and find problems, that aren’t one-size fits all situations. It helps people learn to think with flare and focus. People learn to become elastic thinkers that can wrap their minds around the most complex, or simplistic, issues. Students are prepared to go where their learning needs to go.
But putting that richness into a cookie-cutter, I worry that it symbolically condenses DT into – “well just do this, then this, and this and you’ve got it!”
That’s what happened with UbD. The problems you’ve told me you had with UbD are exactly what Wiggins has voiced as his concerns (You won’t read the whole thing anyway :-), but Point 1 and Point 11 are the relevant ones to my argument)
Rigidity and box-filling of a template ruin the potential of UbD for teachers via improper implementation.”
– Does student plan and structure the project thoughtfully and purposefully?
– Does student demonstrate the understanding of ideas through inquiry, research, analysis, or experience?
– Does student use a variety of skills and strategies to apply knowledge to the problem or project?
– Does student take the necessary steps to fully realize the project goals?
– Does student effectively communicate the central ideas of the project?
Here is a sample of Learning Progressions I created for the #DEEPdt Pioneer Grit Challenge. Thanks to guidance and instruction from Jill Gough for helping me learn how to make I Can statements… If you notice, these learning progressions were specifically written with SS/LA/Sci/Lit learning outcomes & direct student experience with this challenge in mind. There are not any stated guideposts that a teacher could attach to the DEEPdt process, yet I see them. My practice of DEEPdt has given me the foundation to create these LPs, yet what about others who are just learning the craft? As Jim alluded, creating specific DEEPdt assessments for public use will limit teachers’ learning to create their own and instead hand them a “shovel-ready” assessment. I guess my next blog post needs to go into this space of discussion using Jim’s words from above….