In my last post, I indicated that I would describe some of our recent efforts and the innovation thinking behind them. So often I hear "think outside of the box" bandied about as a mantra for creative problem solving and innovation by those who like to describe themselves as "outside of the box thinkers." I'm certainly not opposed to the type of ideation suggested by the phrase. Nor am I opposed to seeking out and exploring different or fresh perspectives when imagineering new solutions to old problems. So why, then, does the phrase "outside of the box" grate like nails on a chalkboard every time I hear it? Maybe it's the oversimplification of innovation or change implied by the phrase, or the lack of definition, context, and focus around what constitutes "the box." I suspect it's really the way the phrase glibly discounts or negates what's inside the box, as though the entire solution is always "out there," without due consideration of what's "in here."

Perhaps that's why I find the "Closed World principle" of Systematic Inventive Thinking (SIT) so intuitively appealing. According to this principle, the closed world is an imaginary box or boundary around a current problem, opportunity, product, or service. There are attributes and resources inside the box that we can divide, multiply, or recombine in different ways to solve a problem or generate new value. The imaginary boundary lines around the box are simply a matter of perspective. Zoom in or out, depending on the desired area or scope of your focus.
Traditionally, continuous improvement processes in higher education have relied on feedback loops that incorporate aggregated assessment and course evaluation data into lengthy development cycles, with little to no benefit afforded to the current learner. This was true for both classroom-based instruction and online learning. Yet unlike face-to-face classrooms, online learning systems generate significant underlying data related to learning patterns at the individual, group, and course levels. Much like dryer lint, that data was simply a byproduct of the learning processes and interactions also taking place inside the box–until recently.
Two of the most promising innovations being prototyped and piloted here at UMUC and other forward-thinking institutions are adaptive learning and learning analytics. Both make use of something that was already "inside the box"–the data byproducts originating from the learner's interactions with the content and the delivery system. These innovations unify the previously discrete tasks of content delivery, assessment, and feedback collection with real-time data collection to provide adaptive content, learning activities, and other learning support services customized to the current learner. For the university, real-time, unobtrusive data collection can provide a level of contextualized insight into learner engagement with the LMS, the course, the content, and even classmates that is not easily captured by traditional course evaluation instruments. In his recent post to the Innovation Excellence blog, Pete Foley presented a compelling argument in favor of using more contextually sensitive data gathering methods to obtain user input about products and services, noting that "question based research asks our conscious mind to second guess our 'unconscious' one, and without authentic contextual cues to help it."
You see, it’s not really about the box, but rather how you define it. Context matters.