The Rapid Elearning Blog

[Image: a tarred and feathered elearning developer]

It’s amazing how fresh eyes can find things you might have overlooked during production. So, before launching your elearning course, it’s a good idea to have others review it.  You want to discover any hidden issues before the big launch.  

Most of the time you find simple issues like typos or broken links.  However, there are times when you run up against larger technical issues.  In either case, it’s good to expose those issues prior to releasing the course for consumption, where you could be exposed to ridicule, and possibly tarred and feathered.

In this post, we’ll explore some ideas around the review process and getting your course ready to go.  Keep in mind that we’re at the end of the production process.  Ideally, somewhere at the beginning of your project you created a prototype course.  This is where you present the general flow and content of the course, and your client affirms that it’s all good.  It’s also when you want to invite some learners to review it as well.

You’ll always have to make some adjustments, but during the final review, there really shouldn’t be any major surprises.  It’s more about a final quality review check, making sure the course is tight, and that everything is going to be ready for the live implementation and launch.

Client Project Review

Prior to piloting the final course with your learners, the elearning developer, client, and subject matter experts should go through the course.  At this point, you’re almost done, so there shouldn’t be major changes.  What you’re looking for is stuff like this:

  • Are there any typos and grammatical errors?  You should do this before you meet with the client so you don’t appear sloppy.  But when you look at the same content over and over again, it’s easy to miss those things.  I’ve also found that sometimes you’re better off having a couple of “missed” typos to distract the client so they don’t nitpick things or throw a wrench in the process by suggesting additions.
  • Are links and external resources working?  Is the contact information correct?  Are all links working and going where they should?  Review anything that the learner will click on outside of the course content to make sure it works and goes to the right place.
  • Is all of the content there?  There are some things you don’t learn about the course until you’re almost done.  This is especially true with clients who don’t fully understand what’s going on until they see the final product.  Make sure the flow is right and that the content supports the course’s objectives.  I’ve been on projects where we found that too many assumptions were made about the content and we didn’t see the gaps until after the course was ready to go.
  • Is the content accurate?  Sometimes information changes prior to the course launch.  This is especially true of policy and compliance training.  I was on a project once where some regulations changed near the end.  I’ve also been on projects where we were building technical training at the same time as developing the technology.  In that environment, sometimes the content is a moving target.
  • Are implementation plans in place?  What has to happen once you have a complete course ready to go?  Each organization is different but there’s usually some sort of marketing component that goes with a course launch.  You also need to make sure that the IT or LMS folks are onboard.  There’s nothing worse than delivering a really cool elearning course and learning that none of the PCs are equipped with speakers or headsets yet.

If you’re lucky, the client review will be smooth and you’ll make minor adjustments.  Unfortunately, these types of projects can start to get screwy at the end.  To avoid some of this, set clear rules.  The first is that at the outset of the project you get an official sign-off on what will be delivered and by when.

Another suggestion is to not bring in a new person for the final review.  Here’s a common situation.  The client is so happy with the course that she invites her boss to attend the review.  During the review, the boss who has not previously looked at the content starts to recommend changes.  Since he’s the boss, you’re kind of stuck.

Learner Project Review

The review you do with your client is going to be different from the one you do with your learners.  With your client, you review the project goals and agreed-upon deliverables.  On the other hand, when you review the course with your learners, you’re testing the course’s effectiveness.  Here are some things to pay attention to:

  • Is the navigation clear?  Does the learner know how to go from A to B?  While you don’t need to go overboard with instructions, you need to make sure that it’s clear what the learner has to do to advance through the course.
  • Have you provided the right instructions?  If you want the learner to do something that’s a little different than the normal navigation, make sure to provide clear instructions.  This is especially true of interactions and scenarios where they need to make choices or interact with content on the screen (like a drag and drop).
  • Is your course too sexy for its body?  Sometimes we want to go outside the box to create something unique.  While there’s nothing wrong with that, using non-conventional navigation and course structure can be confusing to the learner.  If you have to build a training module on how to use your course, that might be a sign to revisit the user interface.  In any case, be sure to listen to your learners if they complain about the structure. What’s obvious to you might not be to them.
  • Watch the learner go through the course.  Often we solicit feedback by sending a course link and having the learner forward their thoughts.  However, it’s valuable to sit and watch them go through the course.  You can see how many times they click, what they look at, and get a sense of whether anything in the design is confusing.  At a minimum, find at least one person who you can watch go through the course.
  • Does the course meet the learning objectives?  I’m not a fan of waiting to test this on the final run through.  Your best bet is to prototype the course and test its effectiveness before investing the time building it.  However, you want to make sure that the final product produces results.  Does the learner meet the learning objectives?  Does the assessment provide the information you need? 

A challenge with learner reviews is that they can be ego crushers.  You put in a lot of time to craft the course, perhaps trying a few new things.  And in just a few minutes, all of your joy comes crashing to the ground at the first criticism. 

Because of this, it’s tempting to discount the feedback you get from the reviewers, especially since they’re not “trained instructional designers” and probably don’t always understand what you’re trying to do.  Don’t fall for it.  Be humble and really consider their feedback.  It’ll help you build better courses.

Even if all you have is one person to test your course with, that’s fine.  My advice is to find someone who has no interest in elearning and might even be a bit technically challenged.  Definitely stay away from people who build courses or know something about UI or usability design.  They tend to complicate things with their professional opinions.

These are some basic tips for your final project review.  I see the client review as a way to do one final quality control check and celebrate your success, and the learner review as a way to test that everything works as planned for those who have to take the course.

Like I said earlier, you don’t want to wait until the end of the project to find out if your course works or not.  A good practice is to quickly mock up the course in PowerPoint and then test out your ideas, navigation, and flow of content.  If there are any major issues, they’ll surface there.  That will save you a lot of time down the road.

What are some of your experiences during the final review process?  What types of issues have you run into and what would you have done differently?  Please share your thoughts by clicking on the comments link.


Tidbits

I’ll be in Orlando next week.  You can continue to submit your elearning tips until I get back.  Then I’ll do a drawing for the copy of Patti Shank’s Essential Articulate Studio ‘09.  I’ll get her to autograph it while I’m in Orlando.

If you’re at the Learning Solutions Conference & Expo swing by the Articulate booth and say “Hi.”

Be sure to check out these sessions:



41 responses to “10 Things to Consider Before Your E-Learning Course Goes Live”

Very interesting article! I agree with most of the things you describe! My conclusion would be: be proactive and follow a process. It will spare you some bad surprises at the end of a project, such as “the flash plug-in is not compatible” or “the voice-over has a wrong accent”.

Great stuff Tom. I am with you on the ego-crushers comment. But I always have to remember that the end-user is who the course is for in the first place. So the “non-techie” viewpoint is often the most valid in terms of usability. I have had a couple of times where I have done something and the people who understand Flash, Articulate or anything are like “yeah, great” and the regular user said, “huh, I don’t get it”. That’s always an eye opener. Back to the drawing board on that one. Thanks again.

Great tips. In my experience, it is really important to provide very clear timelines for the review and to let reviewers know that you WANT them to provide very honest feedback. I have had some reviewers and SMEs who were hesitant to provide important feedback simply because they knew a lot of work was invested and thought their comments might be discouraging. I always tell them that I want them to scrutinize the course and that their honest feedback will be appreciated and will not hurt my feelings.

If possible, I also try to beta test the course with people in a computer lab. Quietly watching people take the course reveals a lot. Are they having trouble navigating it? Are they skipping sections, scratching their heads, or failing the assessment/review questions? Did they just throw the monitor out the window?

Again, thanks for the great tips.

March 16th, 2010

One of the downsides of eLearning over ILT is that we usually don’t get much contact with the end audience.

One of the potential high points of any project is to conduct user testing and actually see if the theories you had on engaging interactivity play out as expected. When they do and you get to see “the light come on,” it’s a great feeling. When they don’t, it’s always better to find out about it before you roll out the training and have it miss the mark.

One of the high points in my career was a usability session when the test audience were actually fighting over who could “play” with my training next because it was so much fun. I would have hoped for that effect, but had I never conducted the user testing I would have never seen it in person.

Hi, Tom — very good post. An important point to consider before you start pounding on the keyboard — know (or at least have an idea of) what your customer needs/wants. The needs analysis is the basis for success or failure. Listen, listen, listen.

Keep up the good work!!!!!

Tom,

We are chuckling here! I’ve never been tarred and feathered, but your graphic certainly does describe the e-learning designer’s worst fear!

I believe one of the biggest mistakes made is when the target population is not included in the review process. Although it sounds so rudimentary, I cannot tell you how many of my clients open pilot testing to administrators or other educational professionals, but don’t make it possible for the actual learner group to provide feedback. I understand they want feedback on the content from the SME perspective (although pilot testing is a little late for that), but if you do not have a sound sampling, the review data is just not reliable.

We do put our prototype modules under an intense review process (teaming designers and developers). This helps us to ensure a much better end product.

See you in Orlando!

Sandy

March 16th, 2010

Thanks Tom. I got into user testing my e-learning developments by reading web usability books by the likes of Steve Krug and Jakob Nielsen.

I quickly learned never to let any of my e-learning modules go out without at least a few pairs of eyeballs going over them while their owners think aloud (and I observe). User testing reveals so many things that, as a designer/developer, you may overlook; it’s definitely worth the little time it takes up.

Two quotes that sum it up for me:
Jakob Nielsen, a web usability guru, says: “Don’t listen to users, observe them!”

Henry Ford, founder of the Ford Motor Company said – “If I had asked my customers what they wanted, they would have said a faster horse.”

March 16th, 2010

Thanks, Tom! We are launching an entire learning portal for our customers in less than a month, and about to embark on some QA BETA testing with Rapid Intake. Thanks for all the great tips and confidence boosters :)

Excellent post Tom.
We try implementing most of these during our development process and have seen great improvements in the final output that we deliver. Gathering learner feedback, however, is often a big hurdle. Sometimes our clients don’t have the time or just don’t appreciate the value.
Thanks for sharing this.

March 16th, 2010

I have three comments:

  1. If you write an audio script, read it aloud before you or someone else records. You’re apt to find phrases that are awkward to say, or sound confusing to the learner. You’re also likely to adjust screen text to make it easier for the learner to read.
  2. Regarding instructions — do your homework and learn the computer savviness of your audience. Some will need both navigation instructions at the beginning of the course and additional instructions throughout the course, which should be a different font color to stand out. Other audiences will need just the basics and maybe a few prompts.
  3. I like to work with my clients to set up the player template and publishing options during course design so there are no surprises later.

I hadn’t heard it referred to as the “fuzzy thumb” technique – I call it the Amish Quilt – but I have often used the trick of putting in something that’s obviously less than perfect to give the reviewers something to call me on. It’s so frustrating to develop a really good program, based on sound design principles, and get only this feedback, “There’s always supposed to be .33 inches white space around our logo.” I find the logo is a good place to use your fuzzy thumb!

@HAS: you can’t go wrong with distracting them by improper use of the logo. :)

March 16th, 2010

In a blog he posted late last year, David Anderson mentioned a format he uses in working with clients. Here is the template he recommends: http://www.elearnovations.com/design/SlideDesign-And.pdf.

I have found this very helpful in mapping out a course and encouraging buy-in from the customer. This also enables the user to confirm up front the content and images. On the topic of images—I find that it is sometimes easier to have the client find the images they want used in their course.

One quick story about how IBM tested personal computers back in the early 80s: They would put test subjects in a room with a 2-way mirror. In the room was a binder of instructions, an unfamiliar box-like thing—in other words, an early IBM personal computer, and one of those old floppy disks. In their instructions they asked the user to “pick up the floppy disk and remove the protective covering.” The “protective covering” was actually the sleeve in which the disk was placed. Unfortunately…or perhaps fortunately…the sleeve had fallen off the disk. In an effort to “remove the protective covering,” several test subjects actually removed the hard sides of the plastic disk, leaving them with a very floppy magnetic film. Since the ensuing instructions asked participants to “insert the floppy disk into the drive,” those with the floppier floppy disks were not too successful!

This anecdote represents an important lesson that everyone has to start somewhere and that user testing is very important.

March 17th, 2010

Since I work in a small team, we are each responsible for reviewing our own courses. It’s challenging to do a thorough review in a single pass while staying 100% aware of all of your course components. So, I have found it quite helpful to review my courses multiple times, focusing on a few different elements each time.

For instance, during my first review, I’ll make sure my script matches my audio verbatim. Then, I’ll review my course and make sure the visuals follow the script, the content is accurate, navigation is clear, etc.


I love the Fuzzy Thumb piece! I usually do something similar if I’m waiting to hear back from my SME or sponsor. If I’ve given a reasonable amount of time for a response, and still haven’t heard back, I’ll finish the rest of the course, but stick in a big neon green arrow with a text box that says something like “Insert Bob’s comments here”, and send the full course to them for review. It serves two purposes: 1) to remind them they haven’t replied, and 2) to show them how the information I need will be used in context. It makes my job easier when I finally do get the response I’m waiting for.

I always get about a dozen people to test a new course for me before launch. I entice them with a small gift card. I ask them to pretend I am not there and then watch them. I discuss their experience with them when they are done. It is invaluable: I catch a million things big and small, and it really boosts my confidence in launching the new course.

@Judith – thanks for letting us know that storyboard worked out for you!

I liked your IBM anecdote and you’re so right about user testing. I think a lot of designers hold rough drafts too long before sharing with their SMEs or clients for feedback. Share early, share often:-)

March 17th, 2010

Perfect timing! I’m publishing a course for testing this week.

I have found it very helpful to have a professional UI/UX (User Interface/User Experience) software tester run through the course and provide feedback. I rarely receive good quality feedback from my SMEs and Learners during the testing cycle. A professional software tester with experience testing the usability and user interface has proven to be invaluable.

Good testers are trained to spot typos, broken links, broken functionality, poor navigation, etc… And they are trained to break things by trying to do things in the course that the elearning developer might not think of, but that a Learner might try if they are unfamiliar with a course.

We don’t have a tester in-house, so I contract out the testing as needed to a software test engineer. It has made a huge difference in the quality of the courses.

Great tips, thanks Tom. I agree it’s important to have the details agreed upfront about what will be delivered and when. And I love the tip about leaving a few typos in; I’m going to try this on my next project!

March 18th, 2010

Perfect timing, as JJTrainer says! We are about to do pilot testing with a group of users for a course, and I was discussing the criteria for the user testing with our SMEs! It couldn’t have come at a better time. Thanks, Tom.

Great points Tom. I’d like to add some more thoughts about your point – “Watch the learner go through the course?”

We go to great lengths to capture learner feedback in a systematic and scientific manner. We call it “learnability testing”. This is the equivalent of usability testing for a product/app. Over the years we have realized the value of learnability testing. It not only provides us valuable feedback on the relevant project but also helps us increase our understanding of learners’ needs and motivations.

Now when we propose solutions to clients, it is easy because we have primary research to back our solutions. This approach is captured in detail with examples on Learnability Matters Blog.

1. Learnability Testing
2. Do You Test Your Online Courses

Your article here is quite enriching. Your advice is that one should not discount the opinions of key stakeholders on the way to the success of the project. That is nice. Other stakeholders are professionals in the e-learning community. However, we should know how to factor in their comments and views rather than seeing them as mere distractions, even when time is in short supply. Thank you for this piece, Tom.


March 24th, 2010

I agree totally about having a technically challenged person try the course and looking over her shoulder before launching it to the public. It gave me important pointers on what I need to do, and showed that fancy is not always better :) Thanks, Tom, for another great blog!

As with the whole site, these are words written in stone. I would add that after all is said and done, if the course is designed properly and is graphically sound but uses an unprofessional voice to present it, the outcome will leave a poor impression.
Today it is so easy to get inexpensive yet professional narration for training and tutorials, and the turnaround can be within 24 hours!
For those who are serious yet skeptical as to the difference, I personally record a short demo that can be used for evaluation and to gather opinions.
For any questions you can reach me via http://www.eliotcoe.com



April 13th, 2010

This was an extremely interesting and helpful article. You’ve put in some really good stuff. Keep up the good work.



May 16th, 2010

I found this information very interesting and helpful. I am just beginning my master’s degree in Instructional Design, and I am learning from my mistakes as I go. After reading this post, I realize that I should have someone check out my project, whether it is a video, blog, PowerPoint, etc., before I submit it to make sure there are not any glitches or compatibility issues.
Looking at the bigger picture of creating a course, I like the idea of having someone critique it in steps as the course is being developed. If problems or issues are found early on, it will probably be less of a headache to fix them.

Great tips Tom. I can add one more — Is it what they are looking for?






Great tips here. I’m also searching for a solution on how to best provide course content to reviewers. If a quality review process requires content to be exported (e.g. to Word) so that comments and changes can be tracked, what is the simplest solution? I’ve so far found this to be tedious with courses that include various Engage and Quizmaker files. While each of those can be exported, I haven’t identified a way to create one comprehensive file where content appears in the order it would appear to the learner in a live WBT.
Is there a best practice identified that enables SMEs and editors to review content using Word so that comments and edits can be integrated with the text?
