A few years ago, Jim Marshall and I surveyed workplace learning professionals about elearning. Although we set out to learn about the contours of the elearning terrain, our project revealed much about instructional design practice today.
We pursued this question: when doing elearning, what are you doing? Were learning professionals relying on webinars, on podcasts, on mobile learning? What of scenario-based online programs, social networks and communities, discussion boards, or personalized programs?
We expected to find reliance on podcasts and scenarios, with healthy use of online communities and discussion boards. But when given the opportunity to rate twenty-six approaches from typical of practice to rare, respondents surprised us. Mostly, they used technology to enable the practices associated with good old-fashioned instructional design.
The most frequently occurring use of technology reported by workplace professionals was for testing skills and knowledge. Interestingly, testing even surpassed classroom delivery via computers, though by a narrow margin. Tutorials, scenario-based learning, practice and feedback, and problem-solving strategies were identified as typical.
Practitioners inclined towards the familiar in mid-2009. Meat-and-potatoes ID dominated, with little reported use of ID approaches or technologies that would be seen as new. We didn’t find much mobile or user-generated content, and scarcely any use of discussion boards either, except in higher education.
I harked back to these findings after reading Xyleme CLO Jeffrey Katzman’s provocative blog post about new instructional design. Katzman proclaims “a new world for the instructional designer,” with 3-ring binders kicked to the curb. He encourages instructional designers to relinquish control and cease reliance on “a process of extracting and interpreting information from the SME….” Katzman continues, “The expertise is out in the field…. In some cases the learner is the teacher, and teacher a learner.”
Appealing? Yes. Realistic? Not so much, not today. The 2009 study gives me pause. We are only now integrating good old-fashioned ID.
What is this good old-fashioned ID?
- Theory drives practice. There are reasons for the decisions that are made, and those decisions are based on the literature and best practices regarding learning, communications, technology, and culture. Years back, I did a project for a government agency. The director urged me to consider his behavioral roots, to make sure that participants received lots of opportunities to practice, because that’s how people learn.
- Data direct decisions. Instructional designers make decisions based on data from many sources, including clients, job incumbents, the literature, work products, and error rates. Data focus the instructional designer’s attention, with output from one phase of the effort informing subsequent actions and decisions. When a client says, “Train them about performance appraisals,” instructional designers look to narrow the problem by turning to data, such as existing appraisals, help desk logs, and lawsuits. Where are the problems with the appraisals? Where are they not?
- Causes count. Once the mission is targeted, instructional designers want to know WHY. Why are appraisal forms flawed? Why is line 7 filled out inconsistently? Why are lines 2, 3, 5, and 6 on point? Is it that they don’t know how, that they don’t think it’s worth doing, or that doing it results in a hassle? Why does the group in Belgium do it when the group in Boston doesn’t? Once causes are known, a solution system can be tailored to the situation.
- Instruction is good, but not sufficient. Wise instructional designers ask questions about cause in order to use instructional resources where they can do the most good. Back to the appraisal challenge. Are the flaws in line 7 caused by not knowing how to write it up? Have they forgotten? Do supervisors doubt the value of line 7 or fear that honest and detailed entries could lead to unhappy employees or even lawsuits? If doubts and fears cause lame entries, training alone won’t improve performance. Instruction is only one thing we can do to develop and enhance performance.
- Outcomes rule. While constructivists disagree about just how royal outcomes should be, most instructional designers subscribe to the importance of defining what participants will be able to do as a result of the learning experiences, and of guiding their experiences accordingly.
- Learning comes from action. The evidence is clear. Practice is critical to performance, and even more of it leads to expertise.
- Teams deliver. Cross-disciplinary teams, with content experts, programmers, artists, and clients, join instructional designers to create the program in an orderly fashion. A recent project for a federal agency involved dozens of content experts, two senior instructional designers, a programmer, a graphic artist, and two graduate student interns. They worked together, under the leadership of an instructional designer, in service to a client organization. Deliverables were established on the basis of analysis; outcomes were articulated; and roles and approaches were defined and honored.
Your mother’s ID proclaimed itself to be about outcomes, practice, data, and authoritative sources. The Katzman blog post, and influential voices like Jane Bozarth and Jay Cross, encourage us to consider a new ID characterized by informality, choice, learner centricity, and connections. While I don’t think this new ID is yet prevalent in the enterprise, particularly in organizations dominated by compliance pressures, such as the military and financial services firms, that doesn’t mean it shouldn’t be on the table. And there are glimmers. How would it work?
Benchmarks for the new ID
Is the ID that you and your colleagues practice different today than it was a few years ago? If you and your unit were making strides towards the practice of this new instructional design, what would it look like to participants?
Here are some benchmarks:
- Participants enjoy MORE choices regarding WHAT they learn.
- Participants enjoy MORE choices about HOW they learn, including when and where.
- Participants have access to MORE sources of expertise, with appreciation for what they have to learn from each other.
- Participants enjoy MORE learning, collaboration, support and information IN THE WORKPLACE.
- Participants CONNECT MORE often, using technology for conversations with peers, experts, and supervisors.
- Participants MORE OFTEN REACH for learning, support and information in the workflow, leveraging MOBILE devices.
- Programs are MORE OFTEN selected and focused based on intelligence gathered from internal and external SOCIAL MEDIA.
Something old AND something new
If you ask 100 instructional designers for a definition of ID, you won’t get one. You’ll get many. That was true when I wrote it a decade ago. It is even more true today.
Walter Dick, longtime Florida State professor, now retired, provided this definition: instructional design is applied educational psychology. For Dick, who with Lou Carey was so influential in the 70s and 80s, ID’s roots were in educational psychology. They still are, but we must also appreciate the gifts we are receiving from evidence about how organizations work, the brain, and communities, and of course from the omnipresence of information technology. All are goosing our conceptions of instructional design and opening the ID 2.0 door.
Because this emerging ID is less about authenticated deliverables and more about user choices and experience, it is both attractive and risky. Many call it informal. Does it make sense for your organization and people? Might some of it make sense? American Express’s Frank Nguyen and I built a free tool to guide consideration of the risks and readiness involved in moving towards these forms. Recently, when I demonstrated the tool in an engineering organization, they remarked on how valuable it would be in their conversations with line leaders.
Must we pick good old ID or the newer ID? Is it dichotomous? Practice vs. connections? Outcomes vs. choice? Causes vs. idiosyncrasy? I hope not. I think not. Why not a tailored concoction, recognizing context and strategy and mitigating risks? What do you think?