An Assessment Teaser: #EAinnovates Visible Evidence of Learning


An aspect of design thinking I gloss over because of time (my excuse) and uncertainty (again, I will claim) is the assessment of Visible Evidence of thinking, processing, communication, idea generation, connections, and chunks of data. Too often this visible evidence is quickly removed, cleaned up, or put away for a rainy day, but more often the "mess" is simply thrown away. Below is a slideshow of Visible Evidence captured from a recent four-day Innovation in Teaching Workshop at Episcopal Academy, which I co-facilitated with @jimtiffinjr and @mpowers3. #EAinnovates

[cycloneslider id="eainnovates-visible-thinking-snaps"]


A question I am often asked is, how do you "grade" design thinking? Sometimes I dread this question because I don't have a tangible, accessible answer, and I'm afraid my answer will turn off a teacher new to the DT approach to learning. (I am ashamed to say I have found myself answering this question with the standard "You can use a rubric," yet I don't have a rubric to offer.) In a project, or in the overused term PBL, there are clear, concrete, and fairly predictable outcomes/products that can easily be slapped with a number grade, and/or a standard project rubric can be applied.

I understand the difficulty of grading the abstractness and subjectivity of the visible evidence, growth, and demonstrations of learning during the DEEPdt process. To me, design thinking is not a thing; it is a process, an experience of immense learning. Sometimes products are produced, but not always, and when they are, the outcomes are unpredictable. With that said, I do not "grade" design thinking; I assess, and I assess constantly. Where I falter in my practice and pedagogy is "officially" documenting the varied aspects of assessing the process. I capture constantly, yet I do not take the time to unpack it and "officially" assess it (meaning reflecting, evaluating, writing it down, and submitting it in some "official" capacity), especially in regard to all the Visible Evidence produced by the designers. So during and after the challenge, the designers and I are left with huge amounts of Visible Evidence to sift through, unpack, synthesize, digest, discuss, connect, assess, etc. A choice must be made: Get Messier with the Visible Evidence, or Toss...

So why don't I get messy? Again, time is a major factor, but if I'm honest, my assessments are occurring in the moment: giving and receiving feedback, questioning the designers, instigating a little, sitting back and observing, taking snapshots and video. Yet the moments are fast and fleeting, as the space I am in is in constant buzz and forward motion. And if I'm truly honest, I haven't figured out how to make it all official-looking or "pretty," or what to actually write down. I know what skills are being developed, maintained, and mastered; I just haven't written them down in some skills-based official format. A side note: I give my students time to capture and document their learning, write reflection pieces throughout the process, and give/receive feedback verbally and in written form for self, team, and others.

When it comes to Mount Vernon designers, we cross-check and utilize the learning outcomes and skills of their grade level with each DEEPdt challenge, and the number of LOs/skills that connect is astounding yet not surprising. The MV Mindsets are also driven through and applied to all DEEPdt challenges.

I truly feel one (of many) strengths of learning and doing design thinking is the countless moments and opportunities for assessment of self, peer, group, process, and "for teacher." The skills demonstrated, honed, crafted, developed, attempted, and learned (catch my drift) are endless as one goes through a DEEPdt challenge. There are just so many varied assessment opportunities occurring simultaneously to process, let alone document and capture for the teacher, the student, and the d.TEAMs. Or at least this is what I tell myself as I sheepishly overlook this deficit of mine most of the time. And on that note, the point of writing this assessment teaser post...

I hope to write more about the how, why, and what of creating and documenting official "assessments" for DEEPdt over the course of the school year. As with our recently published DEEPdt FlashLab and Playbook, we utilized time (six years), experiences, students, and teachers to iterate the DEEPdt methodology and thus create the FL & PB. In writing this post and being somewhat transparent in my uncertainty about an approach to documenting assessments, I will choose that route as well (minus taking another six years). So until then, thanks for reading.

An area of assessment for DEEPdt I plan to explore and utilize more, through the strengths, experiences, and wisdom of Jill Gough, is her Leading Learners to Level Up (leveled assessment), i.e., I Can Statements:

Leading Learners to LevelUp

Jill Gough's Blog (a goldmine of wisdom, demos, assessment examples, and pedagogical approaches)

Video of Jill Gough & Shelley Paul via Educon: Leading Learners to Level Up #LL2LU - Educon304

Links to random formative assessment examples (yeah, I just googled them to add to this post):

http://www.edutopia.org/groups/assessment/250941

http://wvde.state.wv.us/teach21/ExamplesofFormativeAssessment.html

http://www.levy.k12.fl.us/instruction/Instructional_Tools/60FormativeAssessment.pdf

Dive Deep,
mary cantwell