Case Study: Reviewing a Virtual Instructor-Led Training

I recently wrapped up a project helping a client make his virtual instructor-led training (vILT) the best it could be.  When I first reviewed this course, I was impressed.  Although I didn’t understand all of the content (it was quite technical and involved specific skills for software engineers), it was immediately obvious that everything in the course would be highly relevant to the target audience.  Furthermore, the course engaged learners in practice activities from day one and gave them opportunities to get expert feedback.  In other words, this course was designed to develop skills in an actionable way – not merely to teach learners about the topic.

Even though I was impressed with the structure and content of the course, there was room for improvement.  

An Easy Win

Some of my suggestions were small and easy to implement.  For example, in some cases, the instructions weren’t as clear as they could have been.  Even though the learners were highly accomplished and motivated adults, deciphering unclear instructions was a distraction that added to cognitive load without providing any benefit.  As an outsider, I could easily spot where the instructions fell short and suggest specific remedies.

A Technical Fix

In a classroom, it’s relatively easy to see when someone is getting ready to ask a question or make a comment.  In a video call, however, even when everyone has their camera on, these cues get lost.  This can make it feel awkward for learners to jump into the conversation, ask questions, or otherwise participate.  There are a variety of ways to ameliorate this.  One of my suggestions was to make full use of the chat function.  In this case, that meant taking a moment at the beginning of each class to encourage the learners to use the chat for comments, questions, and observations.  It also meant assigning a teaching assistant to monitor the chat and break in periodically when learners brought up points that the instructor should address.

An Evidence-Based Add-On

Spaced practice promotes learning and retention.  In other words, 30 minutes of practice is more valuable when it is broken into two 15-minute sessions on different days than when it happens in a single 30-minute session.  In this case, the class already took place over the course of a week, so the learners were getting some of that benefit.  After the class ended, however, there was no follow-up.  I recommended creating a set of emails with reminders of key content and suggestions for applying it in the workplace.  These emails were designed to go out one week, three weeks, and six weeks after the class.

A Major Adjustment

Think back to some of the more technical classes you’ve taken – maybe in physics, math, or computer programming.  Have you ever experienced a situation where the initial examples and problems were easy, but the difficulty ramped up too fast and left you in the dust?  This is common.  The solution is to add more intermediate steps to the learning journey and to scale the difficulty of those steps carefully.  This isn’t easy, and it can sometimes leave the most accomplished learners in a group champing at the bit.  In this case, my recommendation was to create a set of optional assignments that bridged the gap between the warm-up examples used to explain concepts and the realistic problems learners would actually encounter on the job.

If you teach a vILT class, you may find it useful to have an instructional designer review it.  Often, even an excellent class can be improved – and you may find that there are things you can do to improve the learner experience without adding to your own workload!