Friday, December 4, 2009

White House crashers: a training issue?

Last night on CNN, I heard a Senator talking about the "White House crashers." He said that this training issue would be addressed with the Secret Service. 

I confess to not knowing the ins and outs of this situation (thank goodness I'm not on the Secret Service training team), but I feel that this type of comment illustrates what's wrong with the perception of the training world. It seems that, in this scenario, the security guards knew that they were supposed to check the guest list and identification at the door. Why didn't they do it? If they knew how to check the guest list and identification, then this is not a training issue! Some other performance factor could have interfered: were the expectations to check every single guest unclear? Were the guards not motivated to follow proper procedures? Were the procedures too arduous, or the line of guests too overwhelming? Was the process itself flawed?

I don't profess to know all the answers, but I'm pretty sure that this was not simply a training issue. 


Monday, November 16, 2009

Thanks for attending our session at DevLearn!

We had a great turnout and a lot of fun. 

Remember that you can try out our tools and see the results of the carousel survey at the session's website: http://sites.google.com/site/ispigoog/devlearn-home

Tuesday, September 22, 2009

What's the next generation of e-learning?

According to Marc Rosenberg, a presenter at the April 2009 ISPI conference, the future of e-learning is a knowledge management system that combines online learning materials, communities of practice, direct access to experts, information repositories, and performance support.

Key points I took away from his presentation:
  • work and learning are integrated; materials must be accessible at all times, in all locations
  • mobility: people must be able to learn wherever they are (iPods at the gym or on the subway, for example)
  • we must think beyond just training to learning; most learning is informal (through colleagues, trial and error, Google searches)
  • our role as performance consultants and instructional designers must shift from education and training to providing an infrastructure to enable information sharing
  • the information must be accessible at the time of need
  • the information should be generated by the learners
  • "Don't kill the classroom!" There is a place for the classroom--some skills are most effectively taught in real life.
  • Components of an effective Knowledge Management system:
    • online training
    • information repositories, which require structure, accurate content, and ease of use (examples of effective ones: WebMD, World Bank, Vehix.com)
    • communities of practice: groups of employees with similar roles who are connected horizontally across the company (instead of just vertically) and who share ideas and best practices
    • access to experts and expertise, and a way of "matching the right expert to the right need at the right time"
  • the best knowledge management system in the world is a library. You can walk into nearly any library in the world and find a specific book because libraries share common organizational structures. The implication is that information within a company should be just as easy to access, so disparate organizations should agree on standard ways of presenting their information.
  • performance support that helps employees accomplish a task at the moment of need is important, too
  • Learning is a by-product of performance; learning alone should not be the focus of training initiatives

I look forward to continuing to advocate for these principles; we are spending a lot of effort conducting frustrating knowledge tests right now.

Friday, July 17, 2009

Summary of research about e-learning

The US Department of Education recently analyzed a body of research on the use of online learning. The analysis was geared specifically toward K-12 education but contains interesting implications for adult use of e-learning as well.

The questions they set out to answer:
1. How does the effectiveness of online learning compare with that of face-to-face instruction?
2. Does supplementing face-to-face instruction with online instruction enhance learning?
3. What practices are associated with more effective online learning?
4. What conditions influence the effectiveness of online learning?

Some of the key findings:
  • Students who took all or part of their class online performed better, on average, than those taking the same course through traditional face-to-face instruction.
  • Instruction combining online and face-to-face elements had a larger advantage relative to purely face-to-face instruction than did purely online instruction.
  • Studies in which learners in the online condition spent more time on task than students in the face-to-face condition found a greater benefit for online learning.
  • Most of the variations in the way in which different studies implemented online learning did not affect student learning outcomes significantly.
    • Of those variables, (a) the use of a blended rather than a purely online approach and (b) the expansion of time on task for online learners were the only statistically significant influences on effectiveness.
  • Blended and purely online learning conditions implemented within a single study generally result in similar student learning outcomes.
  • Elements such as video or online quizzes do not appear to influence the amount that students learn in online classes.
  • Online learning can be enhanced by giving learners control of their interactions with media and prompting learner reflection.
  • Providing learning guidance to groups of students appears less successful than providing it to individual learners.
"Despite what appears to be strong support for online learning applications, the studies in this meta-analysis do not demonstrate that online learning is superior as a medium. In many of the studies showing an advantage for online learning, the online and classroom conditions differed in terms of time spent, curriculum and pedagogy. It was the combination of elements in the treatment conditions (which was likely to have included additional learning time and materials as well as additional opportunities for collaboration) that produced the observed learning advantages. At the same time, one should note that online learning is much more conducive to the expansion of learning time than is face-to-face instruction."


Tuesday, July 14, 2009

What to do when you don't agree with your stakeholder's requests

I remember a story Miki Lane told at the three-day ISPI Principles and Practices of Performance Improvement workshop I attended last year. He was a young performance consultant when a company called him up and asked him to lead a team-building training. He said, "Yes, I am happy to help." He conducted a brief analysis and discovered that, in his opinion, this team did not need a team-building training. He recommended several alternative performance interventions. His client did not want to hear his ideas and did not hire Miki as a consultant. A few weeks later, he saw a news clip about ABC company taking their team on a team-building trip in the woods.

This story has remained with me for over a year and seems particularly relevant to my current situation. It illustrates what happens when a stakeholder calls you up already knowing exactly what solution they want and isn't willing to listen to any alternatives.

In my current situation, a key stakeholder has decided that he wants to implement a certification program that includes a difficult SAT-like examination. I am firmly opposed to any multiple-choice tests in the workplace (a topic for another day)! I do not believe they accurately measure performance. Since the vast majority of performance problems result from environmental factors (like a lack of clear expectations and feedback, or inadequate tools and compensation models), why does it make sense to test people?

The test will be used not only to assess individuals' skill gaps but also to identify gaps in training offerings. It seems to me that if we have to create this test, we should first create learning solutions from which people can study. I am concerned about the self-esteem (or worse, employment) issues that may result from people "failing" the test. I am also worried that it will be a waste of time, because we do not have a clear plan for what we'll do with the data or how to communicate scores and gaps to the learners.

And yet, the stakeholder says we must do it, so we will.

Friday, July 10, 2009

Presenting at DevLearn in November!

We will reprise our "Web2.0 and Performance: What's Working for Google Employees" presentation at the eLearning Guild's annual conference this year.

We hope you will join us in November in San Jose at the DevLearn 09 conference!

Wednesday, May 20, 2009

A moment of empathetic instructor panic

Last night I went to guitar class as usual. I'm recovering from a cold, so I didn't have a ton of energy, but I had requested the song that we would learn this week, and it was the last class of the quarter, so I went anyway. I am so glad I did, because I witnessed how my instructor handled an unexpected situation with grace and managed to turn a near-calamity into a successful teachable moment!

The typical class session in this series starts with a review of the previous week's music. (We, the twenty students, all practice like aspiring guitar rockstars (folkstars?) all week, of course, unless life happens to get in the way.) We then learn a new song by first listening to a recorded version of it and receiving a handout of the transcription in tablature, and then having the instructor demonstrate the tricky parts measure by measure. We then get a chance to practice together so she can assess our progress and rectify any mistakes we're making.

Last night was anything but typical.

For the first third of the class, we reviewed last week's music, as usual. It took slightly longer than normal since the piece was arranged as a duet and some people wanted the chance to play both parts, so we played through it a total of four times. The instructor then put in the CD of this week's song, and as we listened, she realized she did not have the printouts of the arrangement. "I must have left them at Kinko's down the street!" she exclaimed. While we continued listening to the recording, she dashed to the copy machine in the building where the class is held. Moments later, she returned and announced that the copier was broken. Of course!

My stomach lurched. I pictured myself as the instructor and wondered what I would do in that situation. Panic! 

Here's how she gracefully handled it:
  • Sent someone to Kinko's to search for her original copies and/or to make new ones
  • Talked about how she arranges songs by ear
  • Played the song for us again, called out chords, and invited us to play along with any strum or finger-picking pattern we wanted. [I loved this part because it got me to think about just the chords rather than the fancy finger-picking pattern I usually focus on. It sounded great: everyone played a multitude of strums, but since we were all on the same chords, it was harmonious. There's nothing quite like the sound of twenty guitars playing well together. I realized that even if I didn't nail every single fill when practicing or performing the song, I could still play the chords and sing along, and it would sound great!]
Just as she started to teach us the fancy fills by demo-practice (less-than-ideal without having the music to look at), the copies arrived! For the remaining ten minutes, we walked through the music as we usually did. 

I left the class feeling inspired not only to practice the song but also to think about how to handle unexpected situations during training sessions.

Anyone have an example of a time your training or class didn't go as planned? Please share in the comments!

Thursday, May 7, 2009

Top 10 Tools for Learning Professionals

Jane Hart at the Centre for Learning and Performance Technologies has asked learning professionals to contribute lists of the top tools we use to enhance learning and performance. Here is my list:
  • Blogger: experts can easily contribute tips, thoughts, and best practices to a large community of learners. Blogs can also be used to create a community of learners following learning events.
  • Google Docs: multiple collaborators can contribute to online documents and spreadsheets. They can be used to brainstorm ideas, edit project plans, share ideas, and generate repositories of information for use before, during, after, and instead of a learning event.
  • Google Moderator: this online tool enables a community to submit questions, ideas, and suggestions about a given topic. Members of the community can then vote on the ones they like best. President Obama has even used this tool a few times to gather citizens' sentiments.
  • YouTube: we have been using Flip cams and SnagIt to have experts share tips, tricks, and best practices on a variety of topics. Members of the community can view these videos easily and rate the ones they like best or find most effective. The YouTube Symphony is a recent example of gathering top performances from around the world!
  • Twitter: I love hearing learning and performance experts share quick tips, links, and information about what they're working on and how they're using new technologies. Our team has also used it for internal communication across geographies and for creating communities of learners following learning events.
  • WebEx: since travel budgets have been slashed, virtual classrooms are being used more and more. I love the features of WebEx Training Center, which include letting participants write on (or annotate) the screen (great for brainstorming!) and break out into smaller virtual sessions with each other (connected via phone and a shared computer screen).
  • SnagIt: I love this screen capture and screencast tool. For static captures, the annotation features are excellent (especially the spotlight and magnify features) for creating job aids for tool walk-throughs. For dynamic screencasts, SnagIt beat out Camtasia in a head-to-head competition for ease of use and quality of output (even though it has limited editing capabilities). Easy enough for a Subject Matter Expert to use! 
  • Captivate: Adobe Captivate is a great tool for creating interactive scenarios. It quickly records what is happening on your screen, including audio narration, and adds captions. You can also easily add interactive elements like rollovers and click boxes!
  • InDesign: I am a big fan of job aids. I think job aids can replace training and learning events 90% of the time; if not, they can certainly supplement what is being taught. Adobe InDesign enables the production of high quality, sharp PDFs.
  • Wikis: I don't have a favorite wiki provider, but wikis are amazing tools for online editing and collaboration. Wikipedia is a great example, but I also have used Google Sites and PBWiki successfully.

Monday, April 27, 2009

Web2.0 and Performance

The wiki website for our Web2.0 & Performance session at ISPI is live: http://sites.google.com/site/ispigoog

We had a wonderful time at the ISPI conference, and I look forward to reflecting on all that I learned. Stay tuned!

Thursday, March 26, 2009

Grading on Participation?!?

Recently, I was asked to review a colleague's assessment plan for an entire curriculum. The plan was intended to provide feedback to learners about the skills they had developed throughout different aspects of the program.

There were three main components of this evaluation: participation, knowledge, and application. 

I completely agree that application should be measured. It is difficult to measure this with a multiple-choice test (as was proposed); I recommend performance checklists or performance observations instead. 

I disagree with pure knowledge testing in a corporate environment (which is the subject of a future post). 

Participation? I always shudder when my partner comes home each quarter from the first night of classes in her Master's program. She shares the syllabus with me, and the grading section usually catches my eye. Many classes include practical applications of the topics, including role plays, videos, analyses, etc. Without fail, however, class participation is listed as 5-10% of the final grade. I have two problems with this:
  1. Participation should be expected, not incentivized.
  2. Everyone participates in their own way. How do you measure this? Is the person who asks and answers a lot of questions participating more than someone who sits quietly, reflecting on the information in class?
An instructional designer friend at a college mentioned that in their blended classes, the professors grade on online contributions (or participation). I suppose that makes sense: when you are trying to assign a letter grade, and a thriving online class depends on people contributing to online discussions, you need some way to account for those contributions. She said that professors, too, are evaluated based on how much participation their students display.

What do you think about grading on participation?

Monday, March 16, 2009

Class Review: Public Health

As an instructional design/training professional, I revel in continuous learning. I feel that it is critical for trainers and instructors to frequently play the role of student, for a myriad of reasons: to remember what it is like to know absolutely nothing about a topic, and to learn from other instructors' best practices (and, unfortunately, what not to do). Since I value learning so highly, I constantly take new classes on a variety of topics: gardening, guitar playing, songwriting, pottery, harmony singing, presentation skills, improvisation, project management, MS Project, human performance technology, and Portuguese are just a few of the topics I've studied through classes or trainings in the last few years. I will post what I have learned about teaching by attending these classes from time to time.

My partner and I recently attended a class offered by our health care provider as a mandatory prerequisite for being admitted to a clinic. 

Here's what I learned:
  • Practice with technology prior to the session. There is no excuse for being flustered by PowerPoint!
  • State your objectives clearly at the beginning of the session. What will we learn? What will we be expected to do at the end of the session?
  • Provide clear, effective handouts. They did give us one, but it was terrible: the complicated diagrams were PowerPoint slides printed in "notes" view, far too tiny to read.
  • If there are action steps participants must complete after the class (in our case, before being seen by a doctor), point these out clearly and give people a checklist.
  • Be careful of gender/sexual orientation discrimination. This class was primarily for couples. Probably ninety percent of the couples attending were heterosexual, but this clinic treats couples of all sexual orientations. Instead of constantly referring to husbands and wives, please consider referring to partners.
  • Participants will vary in how much sensitive information they will freely reveal (e.g., a participant who was maybe 25 shared with the class that she had recently been diagnosed with menopause). This class did not include a mandatory sharing portion; this came out during the Q&A. I was a bit shocked that someone would reveal so much, but I suppose she felt safe in this environment, for which I have to give kudos to the instructor!

Thursday, March 12, 2009

Always make handouts for training/classes

Every single time you deliver a class or a training, you should create a handout. 

Handouts help you convey the most important topics, provide a framework for note-taking, and enable your learners to refer back to something tangible when they go to use the skills or knowledge you are teaching them. They enhance both retention and application of the skills, thereby increasing performance.

Before you deliver the class, after you have determined your objectives and activities, create your handouts. 

Handouts should not simply be a printout of your slides. However, that is a great start, and it is better than nothing! I encourage you to go beyond these printouts to think about creating something that will help your learners organize and synthesize information. If you are presenting a complicated diagram or chart, reproduce it larger for your students. They could add labels as you talk about each part, for example. Think about how your learners will be using the knowledge you're teaching them--make it easy for them to reference this information later.

The handout could be printed or in electronic form. I like to save trees as much as (or more than!) the next person, but I also value having a framework in which to take notes, organize thoughts, and refer back to later. Sometimes I even hang the handout next to my desk (currently I have a chart of Bloom's verbs, a performance analysis flow diagram, and a Human Performance Technology model hanging around my workspace). Looking around, several of my colleagues have Bloom's verb charts taped to their monitors, too. Above my desk are manuals, references, and textbooks, some of them from trainings I have attended. I love being able to page through the manuals to remind myself of an activity we did in class or the 5-point framework that was discussed, for example.

Providing the handout as an electronic reference could also work; that way, people could decide whether or not to print it out to suit their own needs. If you decide to do a solely electronic version, please consider sending it to your participants in advance so that they can decide if they should print it. Several times I have been in classes where I have heard, "Oh yes, we'll be giving you this handout at the end, you don't need to take notes," only to be disappointed by the lack of detail in the handout afterwards.

Now that you have created your handout, do you really need to hold the training or class after all? Can you shift the focus of your class to teach people how to use the handout in their jobs or real life, or a discussion about the content? Can you shorten the length of your class?

I cannot think of a single time when I attended a professional training class without a handout. However, the informal training classes within our company, as well as the community/recreational classes I have attended, frequently lack handouts. It is my professional opinion that every single class and training should include a handout of some sort. [Note: college classes may be different. Higher education is not my area of expertise!]