Monday, April 18, 2011

Reflections on #Lrnchat: What Can We Learn about Learning from Star Wars? (Part 1)



Each week that I am able to participate in the #lrnchat discussion, I post a summary of the discussion to my blog. I do this both for my own personal development and to share it with the Learning and Development profession at large. This summary is based on my own interpretations of the chat; others who participated may have differing opinions or interpretations of the discussion. I welcome anyone who does to add their ideas in the comments.

The topic of this week's #lrnchat session was "What Can We Learn About Learning from Star Wars?". 

One of the ways I enjoy learning is by trying to find connections between two seemingly unrelated things.  I find that the process of building these connections very often opens my mind to possibilities I had not considered before. 

I often do this with learning, asking myself the same question: What can 'XYZ' teach me about learning?  The fun of it is that 'XYZ' can be anything, and the more difficult it is to build the connections, the more enjoyable and valuable the experience can be.  I’ve done this in the past on my blog with the video games Angry Birds and E.T. The Extra Terrestrial.  This week’s #lrnchat provided a similar opportunity, as we explored what the Star Wars movie series can teach us about Learning.

This #lrnchat was especially enjoyable, as there was tremendous energy and excitement around the theme.  There just seemed to be a great number of people having fun during the chat – some even created special avatars for the occasion.


So let’s explore the Star Wars Universe, and see what it can teach us about the world of learning.  In this exploration, I look not only at the stories themselves, but also the making of the films.

Because of the length of this post, I am going to spread these reflections across two (or three) separate posts.  This first post explores the first of the six questions asked during the chat: 

What are some lessons you take from Star Wars?

The more you try to control, the more people rebel.

In a scene from the original Star Wars, Princess Leia defiantly tells Grand Moff Tarkin: "The more you tighten your grip, Tarkin, the more star systems will slip through your fingers."

This speaks almost directly to the learning professional, and to the shifting tides we need to change with.  Historically, learning has been about control, with a trainer showering selected knowledge onto participants.  It was a very controlled environment, which is largely why it was also often an ineffective one. 

You cannot force people to care about learning that is forced upon them – just look at the average organization's compliance training program for a perfect example.  More and more, employees are setting their own learning paths, and those paths may not fit into the cookie-cutter approach designed by the L&D group. 

Learning professionals need to adapt to this new model.  We need to find ways to support and foster this new environment for learning, and to incorporate our formal approaches into it in a more focused and targeted manner.  The more we try to force learners into the 'old' mold, the more they will resist it and 'slip through our fingers'.

If you don’t complete your training, Darth Vader is going to kick your @ss and chop off your hand.

While it was ultimately Luke's decision to leave Dagobah, the fact that he faced Darth Vader without completing his training resulted in very poor on-the-job performance, leading to very real 'Level 3' evaluation data in the form of the stump found where his right hand used to be.

Kidding aside, too often learning professionals are operating within the constraints of a specific time allocation related to a learning program.  We allow ourselves to be pushed into situations where we try to ‘fit’ more content into shorter timeframes. 

Learning professionals need to be stronger in these situations and be very forthcoming about what can be accomplished – and what will be sacrificed – if the expectations for a program exceed what its constraints allow.  Otherwise, when someone loses a hand while putting new skills to use, it will be the training that failed them.

Luke: "I don't believe it.";  Yoda: "That is why you fail."

In one of the key scenes from The Empire Strikes Back, Yoda uses the force to pull Luke’s X-Wing out of the swamp, and the above exchange of dialog takes place.  It’s a simple and powerful concept that should be very present in the design and delivery of our learning programs. 

It is not enough to deliver learning.  We need to connect learners to the content on an emotional level – the quintessential WIIFM ("What's in it for me?") needs to be present.  In addition, we need to verify that our learners believe the performance outcomes expected of a learning program are realistic and achievable.  If they do not believe this, they will never be able to perform effectively.

I’m OK with Yoda pulling the X-Wing out of the swamp to provide the example of what is possible.  I just think he should have dumped it back, looked at Luke, said “Your turn it is.”, and continued the training.

The smallest design flaw can blow up the whole Death Star

On a project the size of the Death Star, it's hard to believe that not one of the thousands of parties involved picked up on the design flaw of an exhaust port with a straight path directly to the volatile core of the facility.  Yet that's exactly what happened, with catastrophic results (unless you are a rebel).  In learning design, it's entirely possible that one simple mistake could derail all of the desired outcomes of a program. 

Sometimes this issue surfaces in the form of an error in content.  A simple miscommunication between the Subject Matter Expert and the Instructional Designer results in an inaccuracy in the program content.  Learners that recognize this tend to fixate on it, and you really can’t blame them for that.  After all, if this one fact is incorrect, how many other mistakes are there?  Should any of it be taken seriously?

E-learning design can also suffer from the small error that completely derails learning.  Sometimes it's not the design so much as the coding.  If you've ever clicked a button and gotten an unexpected response that should have been picked up in testing, you know what I mean.  Often we shave testing time in the pursuit of 'speed-to-market'.  That's probably what happened with the Death Star.

And just look at how well that turned out….



Subject Matter Experts should not produce nor direct the projects.

That's a direct quote of a tweet from Aaron Silvers, and one that I wholeheartedly agree with.  It's not a commentary on the stories as much as on the making of the films.  George Lucas is the definitive Subject Matter Expert for Star Wars.  That doesn't mean he is the most qualified to produce, direct, or write the stories.  This was one of the biggest complaints about the newer 'prequel trilogy' – that George Lucas should have let others with more expertise take the reins in these areas.

This applies to learning programs as well.  A SME is quite often not the best person to make the decisions related to a learning program.  Their knowledge of the content is critical, but it does not necessarily extend to designing a program that conveys that content effectively.  As learning professionals we need to make sure our expertise is known; otherwise we become order takers, and our programs become the learning equivalent of "The Phantom Menace".

No matter how many digital effects your e-learning has, it falls apart without a strong narrative.

The Phantom Menace is a technically amazing piece of film.  The amount of digital imagery in the film was staggering, and often awe-inspiring.  Yet despite being bombarded with digital eye-candy for over two hours, many movie-goers walked away with some form of ‘meh’ reaction to the film.  Why?

The answer quite simply is that so much time was spent on the digital effects that somewhere along the way the story got lost.  People didn’t care for the story, and ultimately walked away without the same type of connection they felt from the original films.

How often do we do this in learning?  Some new high-tech or trendy tool comes along and we plug it into our learning programs.  We build the learning around the features of the tool instead of the other way around.  Not every learning program needs a Jeopardy quiz, an avatar guide, or whatever new fireworks the latest upgrade offers – just like the prequel trilogy did not need Jar Jar Binks.  Don't try to hide an inferior learning experience behind flashy effects.  It won't work.

In parts two (and maybe three) of this Reflections on #lrnchat post, I'll explore the remaining questions posed during the chat:

Q2) How do you fix your Podracer in the middle of your race? (overcome challenges while still keeping up with your projects)?
Q3) What “Jedi Mind Tricks” do we employ in our trade-craft?
Q4) How would you describe your/your org’s Dark Side? How do you avoid becoming Darth Vader?
Q5) Before leaving Dagobah, Yoda pleads with Luke to complete his training. What could inform Yoda that Luke wasn't ready?
Q6) Who is your Yoda? What are the qualities that fill that role?

Until then, May the Force be with you.
 

Saturday, April 16, 2011

Reflections on #Lrnchat: Preparing for Change

Image use courtesy of lrnchat and Kevin Thorn (@LearnNuggets)

Each week that I am able to participate in the #lrnchat discussion, I post a summary of the discussion to my blog. I do this both for my own personal development and to share it with the Learning and Development profession at large. This summary is based on my own interpretations of the chat; others who participated may have differing opinions or interpretations of the discussion. I welcome anyone who does to add their ideas in the comments.

The topic of this week's #lrnchat session was "Preparing for Change". 

I always find that looking at the questions used to loosely guide the chat is a nice way to see its overall theme. Here are the discussion questions that were presented to the group:

Q1) What changes do you anticipate in your organization over the next few years as new (social) technologies impact it?
Q2) Do you see your own role in the organization changing as a consequence, if so how?
Q3) What are YOU doing to prepare for changes in your own role?
Q4) What are you doing to prepare the people in your organization for change?

Social Media technologies have drastically altered the landscape of human connectedness. 

At first glance, that might seem like an overly dramatic or overly wordy sentence.  In truth, though, I think it is precisely accurate.  Social media technologies have provided us with a number of valuable resources that enable sharing, contributing, and greater flexibility in many of the actions we take part in every day.  Just about every social media technology, no matter how it is being used, is adding and/or strengthening the connections between human beings.

These technologies have affected just about every aspect of life, from the most complex and far-reaching situations, like global collaboration, to seemingly less significant events, like receiving an Evite invitation to a family reunion.  It's no longer a question of 'if' social media will have an impact on your life; it's a question of how quickly it will happen.

This week’s #lrnchat discussion explored the increasing insertion of social media tools into the existing workflows of our daily lives, and the implications it has on our workplaces and our roles in Learning and Development.

The chat began with an exploration of what changes we anticipate in our organizations over the next few years as social technologies impact them. 

I think the question answers itself in many ways.  The question asks about changes from the impacts of social technologies; it does not even entertain the possibility that organizations may be able to avoid the changes coming from social media.

Of course, not all organizations will follow this path willingly.  There are organizations that blaze a new path where there currently is none, and there are organizations that wait until the path has evolved into a fully paved road with street lights and a crossing guard before they feel safe to start their journey.  I think we are at a point wherein there is a dirt path that has been worn by the early adopters to this shift, and the path has provided guidance and direction to those that need it to start their journey.  The speed of the shift will only accelerate at this point, and any organization that thinks they can hold it back or slow it down is engaging in a foolhardy endeavour.

So what are some of the specific learning and development changes organizations can expect from this shift? 

Overall, I think organizations can expect fewer barriers to communication, as social media tools slowly poke holes in departmental silos, allowing information to flow more easily between areas.  This increased sharing will make it much easier for learning and development professionals to see the 'big picture' of organizational performance needs, and respond accordingly.

In addition, growing acceptance of and trust in these tools will begin to allow social media to permeate the virtual firewalls.  This will have a huge impact on organizational learning, as there is exponentially more learning available outside the firewall than could ever be collected from the inside. 

As access to resources outside the firewall increases, it accelerates another important shift in employee learning: the shift towards bottom-up, or self-directed, learning.  When learners begin to realize they have access to tools that give them greater control of their own learning, they will take advantage of that opportunity in ever-increasing numbers. 

The impact on the learning professional?  The illusion of top-down control of learning will be lifted.  Top-down learning will still exist; it will just be smarter, more focused, and a complement to bottom-up learning, which will become the primary driver of organizational learning.

The discussion then shifted to how these changes will affect the role of the Learning Professional.  What I found most interesting about this piece of the discussion was the reaction I received to a specific tweet I posted:

Training Rooms will become the largest closet in the office.

There were a number of people in the chat who voiced their disagreement with this statement.  The statement was meant humorously - somewhat tongue-in-cheek, with an undercurrent of truth.  The fact is, technology has advanced to the point that training rooms, and the in-person training events they enable, are becoming less critical.

I am not saying in-person training is going away.  I would actually be at the front of the line to disagree with someone that makes such an absolute statement.  We just need to remind ourselves where in-person training fits into our toolbox.

Here's my simple personal rule for when to use in-person training: in-person training is used in situations where the experience learners need cannot be achieved in any other format.

Think about the type of activities that are traditionally used for in-person training: Role Plays, Systems Training, Discussions, Games, and more.  In the past these types of activities could only take place in an in-person environment.

That simply isn't the case any more.  Technology has advanced to a level in which many aspects of social learning that historically could only take place in-person can now take place just as effectively online.  To add further weight to this shift, many learners are accustomed to using these types of tools in their personal lives, making them a comfortable and often desirable platform for learning. 

Finally, I should also point out that my comment about training rooms becoming the largest closet in the office was also loosely based on a recent real-life example.  I had an in-person workshop scheduled, and went to prepare our training room the day before the event.  I was surprised and displeased to find three giant laser printers, still in their boxes, stacked on the tables of the training room.

I requested that they be removed, and asked why they were 'stored' in the only room in the building allocated to training.  I explained that using the room for storage could disrupt our planning, as it was a critical resource for learning and performance.

The response I received was simple: "Dave, those printers have been there for almost two weeks; you haven't noticed them before today?" 

I hadn't.  And with that, much of the outrage I was feeling about someone having the audacity to use the training room for storage felt unjustified.

A few years ago those printers wouldn't have been in a training room for more than a few hours before I noticed them and raised the alarm.  Today they were there for almost two weeks before I noticed.  The training room is still an important resource, but as technology continues to advance and our dependence on that room continues to lessen, chances are you too will one day find it being used as 'temporary storage'.

The discussion then moved on to what learning professionals can do to prepare ourselves for the changes in our roles.  There was a fairly consistent theme to the answers: We need to learn about the tools of social media so that we can be prepared to use them.

I think our preparations need to go a little further than just 'learning the tools'.  That is a key part of it, but it's not THE key part of it.

I remember when my father taught me how to drive.  He took me to the local school's parking lot on a weekend, when it was completely empty.  He taught me how a car worked.  We went over every pedal, switch, knob, and dial, and put them into action.  I drove around that parking lot for hours and days on end, practicing the skills associated with operating an automobile.

And yet, even after seemingly countless hours of practice, I still did not know how to drive.

Truly knowing how to drive requires the ability to exist within and respond to the environment of the driving world: the other drivers, pedestrians, road work, highways, and countless other inputs that must be processed and responded to in a manner consistent with what is accepted within the 'driving community'.

Using social media for learning is really no different.  You can understand how the technology of a social media tool functions, but that doesn't mean you know how to use it.  To use these tools effectively, you must put them into practice; to truly know them, you must participate in active social media communities.

I find the most effective way to do this is to find existing communities that interest you personally or professionally, and participate in them.  If your first participation in a social media community is one launched at your organization, you leave yourself at a severe disadvantage. 

The discussion concluded by exploring how we can prepare the people in our organizations for the changes being brought to us by social media.

A common theme of the responses was that we need to lead by example.  That goes back to my previous point about participating in these communities and speaking to their value.  I also think it's important to talk about the benefits of these tools with senior managers so that they understand their value.

Often a discussion about social media centers on risk: We'll lose productivity, it's not secure, and countless other objections that we have been hearing about for years.  It's for this reason that I think two critical things must be represented in these discussions.

First, you do need to talk about risk.  However, the risk you need to talk about isn't the answer to the question "What are the risks of doing this?"; the question you should focus on is "What are the risks if we don't?" 

There are great current examples that can be used to demonstrate this.  Two that come immediately to mind are Blockbuster Video and Borders bookstores.  Both companies kept their heads in the sand as technology fundamentally changed their industries.  They did not respond to the changes until it was too late, and both companies ended up in bankruptcy.  That's what happens when you choose to ignore the changes going on around you.  That's the risk you need to be talking about to prepare your organization for this change.

Another thing to bring into the discussion is real-life examples.  Many stakeholders are unaware of the value of social media in general, so you might need to show it to them.  A technique I've used is searching social media for comments about my organization. 
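If you want to make that kind of monitoring a routine habit rather than a one-off search, it can even be scripted. The sketch below is only an illustration of the idea: `fetch_public_posts()` is a hypothetical placeholder for whatever search feature your platform of choice exposes, and the keywords are made up.

```python
# A rough sketch of the keyword-monitoring idea described above.
# fetch_public_posts() is a hypothetical stand-in for a real search API
# (Twitter search, a blog search engine, etc.); wire it up before using this.

def fetch_public_posts(keyword):
    """Hypothetical helper: return public posts mentioning the keyword,
    each as a dict with 'author' and 'text' keys."""
    return []  # placeholder; replace with a real search call


def find_brand_mentions(keywords):
    """Collect (keyword, author, text) tuples for every matching post."""
    mentions = []
    for keyword in keywords:
        for post in fetch_public_posts(keyword):
            mentions.append((keyword, post["author"], post["text"]))
    return mentions


if __name__ == "__main__":
    # Hypothetical keywords; use your organization's name, brands, and locations.
    for keyword, author, text in find_brand_mentions(["AcmeCorp", "#AcmeCorp"]):
        print(f"[{keyword}] @{author}: {text}")
```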

Recently I found a post on Twitter that was complaining about a service issue at one of our locations.  I took a screen print of the tweet and sent it to a senior manager with a simple message "How would you like us to respond to this?" 

The response I received back instructed me to respond to the customer personally, but it also included a gold nugget of value: "Where did this come from?".  That simple question created an opportunity for me to have a conversation with a senior manager about social media tools.

Social media tools are already impacting the environments organizations do business in, even if they have not yet been put into use within the organization itself.  The impact of these tools increases every day, and the day when organizations can no longer ignore their influence is rapidly approaching.

The question is, will your organization, and you as a learning professional, be ready?

Until next week #lrnchat-ers!

Wednesday, April 13, 2011

Resources Shared at Recent ASTD Presentations



Recently I have had the privilege of speaking at a number of ASTD chapter meetings.  While the topics varied somewhat, they all fell under the larger umbrella of Social Media for Learning Professionals.

This Blog Post collects many of the resources that I discussed and shared during these presentations.

BOOKS
COMMUNITIES
FACEBOOK FAN PAGES
WEBSITES AND WEB PAGES
BLOGS
VIDEOS
RECOMMENDED TWITTER FOLLOWS
Note: The value of a network is determined by each individual. What I consider valuable may be different from what you do.  That said, here are a few people that I highly recommend following on Twitter if you are looking to learn more about social media use in employee learning.

Tuesday, April 5, 2011

Reflections on #lrnchat: Learning Analytics

Image use courtesy of lrnchat and Kevin Thorn (@LearnNuggets)

Each week that I am able to participate in the #lrnchat discussion, I post a summary of the discussion to my blog. I do this both for my own personal development and to share it with the Learning and Development profession at large. This summary is based on my own interpretations of the chat; others who participated may have differing opinions or interpretations of the discussion. I welcome anyone who does to add their ideas in the comments.

 
The topic of this week's #lrnchat session was "Learning Analytics". 

I always find that looking at the questions used to loosely guide the chat is a nice way to see its overall theme. Here are the discussion questions that were presented to the group:

Q1) What learning data does your org collect? Why? What problem is org trying to solve?
Q2) Are there ethical issues in collecting learning data?
Q3) How are you collecting formative and summative data now?
Q4) What about informal and social learning? Can we/should we try to measure that? How?

If you want to gauge the effectiveness or progress of something, you need to collect some sort of data that can be analyzed.  Sometimes the data is very simple, like the reading from a fuel gauge showing you how much gas is left in your tank.  Sometimes the data collected is much more detailed and robust, like the seemingly endless statistics collected by sports organizations. 

Collecting data for analysis as a means of gauging performance is a practice that exists in almost every aspect of life, including the learning and performance departments of organizations.

When you hear about the effectiveness of a learning program, the discussion often focuses on the effects of the program.  You'll hear about changes in performance, the effect those changes are having on the organization, and perhaps a discussion of the program's ROI.

What you don't hear about as often is the data that was collected and analyzed to reach these conclusions.  An analysis is only as credible as the data behind it. 

This week's #lrnchat focused on the data collection methods used in Learning and Performance departments.

The discussion started by sharing the types of data being collected by learning departments, as well as exploring the purposes for collecting the data.

The majority of the responses confirmed some of the shackles we place on ourselves within our organizations.  What we are tracking is mostly widgets: number of course completions, test grades, and the proverbial butts in the seats.  The other data collected by many was reaction data from program participants, commonly referred to as Smile Sheets.

Think about that for a moment.  The two most commonly collected forms of data are respected so little by organizations that there are commonly known phrases that openly mock their value.  That's a problem.

I think part of the problem, historically, is the learning function providing the data that is asked for as opposed to the data that is needed.  If you don't understand the metrics of a field, chances are you're going to ask about the obvious widgets associated with the topic.

In the case of learning, the most common widget is 'butts in the seats'.  I point the finger of blame for this in two directions.

The first area I see causing this issue is compliance training.  In many organizations, this is the first, and possibly only, learning program for which data is requested.  It's also training that is often less about performance improvement and more about being able to report that the required training took place. 

In my experience, most stakeholders ask compliance-related questions that fall under the theme of "Did everyone complete it?" rather than "Is everyone's performance meeting the compliance standards?"  The data being asked for, by both the organization and the regulators that enforce the compliance standards, is 'butts in the seats'.

The other area I see as contributing to the data problem is the learning field itself.  We should not be waiting to be asked for data; we should always be showing the value being created by our efforts and by learning in general.

In the organizations I have been a part of, there is not a great understanding of the link between learning and performance.  There's an accepted connection between the two, but not an understanding of the pathway that leads from one to the other. 

The organization cares about performance, not learning.  Those that do not understand the pathways from learning to performance will likely ask about the numbers.

I rarely share learning 'data'.  Honestly, I don't have much interest in the reports I am able to get from my LMS, and if I don't have any interest in them, why should I expect anyone else to?

I do collect data, but unless pushed for it, I don't share it.  It's not an ownership issue, though that is something that will be considered later in this post.  I just don't think the data itself has value.  The value of learning isn't something you will find on a spreadsheet.  The value of the data I collect is in the story the data enables me to tell to a stakeholder.

Stories feel much more real than sterile, largely unvalued data.  In my experience, most stakeholders value and trust the story more than the data, though such trust does need to be earned.  For stakeholders that are interested in the data behind the story, I have it and can share it too.

The main point here is that I'm not going to wait for someone to tell me what data to provide.  Ideally that would be a discussion during the needs assessment conversation, but in some organizations that's not standard.  Even if I do not walk away with set metrics to report back on, I will walk away from those initial discussions with an understanding of the types of metrics that need to be impacted by the program.  That understanding gives me the framework for the story I need to tell.

I believe that in organizations with a more mature learning culture, much of what I'm describing is formally built into the structure of the workflow, and that's a good thing.  In the absence of that, though, learning professionals need to step up and fill the gap.

From here the discussion moved on to the ethical issues that may exist with the very concept of collecting learning data.  For me, this is a question that needs to be analyzed beyond the initial knee-jerk reaction.

I think for many (and this was represented in the discussion) the immediate reaction is to say that collecting learning data is unethical; it goes against the principles of organic growth.

I think there's truth in the ethical concerns, but I don't think it has to do with the data collection.  Learning data is just that: data.  It's not the data itself that is unethical; it's how we collect it and how it is used that raises ethical concerns.

I think it starts with the plan; specifically, having one related to your data.  Any time data is collected without a set plan for its use, you're opening the door to ethical issues.  For example, many managers use the data for 'Gotcha Management': they use the data to hold people accountable and punish those who have not completed courses.

Don't get me wrong: accountability is a huge piece of the performance improvement puzzle.  In the context of data, though, it's important that the data is used effectively.  Much of the data I collect is supplied directly or indirectly by the participants.  If we use the information they supply us with against them, how forthcoming with information will they be in the future?

Another pet peeve I have related to the ethics of data collection has to do with what we tell learners about the data we collect.  How many organizations make learners complete Level 1 evaluations - even explaining that their feedback is important and will impact future programs - and then do nothing with the data?

Is it ethical to tell someone their opinions matter when in truth the data is not changing anything?  I'm always amazed by the amount of data that is collected and never used.  Make a decision regarding your data: either collect the data you need and use it, or don't collect it at all.  Your time and the time of your learners are too valuable to waste collecting unneeded data.

The discussion then moved towards the methods we are using to collect our formative and summative data.  Of course, in order to answer that we need to have an understanding of what the differences are between formative and summative data. 

While there are a number of levels at which we could explore the difference, at the core the differences between the two are not about the data itself; it's more about when the data is being collected.  Formative data is data that is actively collected during a learning program, whereas summative data is data collected after the program is over.

While I do see a place for formal data collection methods like focus groups, surveys, and assessments, they are not my primary tools for data collection.  I cannot overemphasize the fact that the most important data collection tools we have are simply our eyes and our ears.

If I want to collect data on how participants are learning and performing, my first step is quite simply to stop talking.  Getting people to talk about their learning in reflective ways, hearing that they see the connections between their learning and their work, and listening to them share ideas with each other on how to apply their new or enhanced skills provides me with more powerful data than I would likely ever get from an assessment or a smile sheet.

The other tool I mentioned is our eyes.  Harold Jarche simplified learning to its core when he said that work is learning and learning is work.  That being the case, if I want to see if someone is learning, one of the best ways to do that is to watch the person work.  Not only does this provide learning professionals with some of the most accurate data related to performance, it also provides an excellent opportunity for real-time performance support and coaching.  It fosters continuous improvement.

I also believe it's important that we define 'work' in this context.  Work-based role-plays and simulations of the work environment are not representations of work.  They are excellent tools to reinforce performance, but the only way to truly collect data related to work performance is to observe the work itself, where it happens.  Anything else really pales in comparison.

Need proof of this?  Ask yourself how many times you have heard "That's great, but that's not how it works in the real world" from a participant.  Learners already know this is true; we just need to follow their lead.

The chat concluded with an exploration of informal and social learning, and what, if any, data we should collect about it.  What surprised me about this discussion was how much of it centered on a different debate: whether one type of learning is better than another. 

To me that subtheme underscored one of the main issues related to learning analytics.  The learning is actually irrelevant in most cases.  What really matters is the performance. 

In most cases, we don't have a Social Learning Program or an Informal Learning Program; we have a learning program that incorporates formal, informal, and social learning techniques.  It's the collective influence of the program and all other external factors that ultimately leads to the performance.

Ultimately we design learning with the desired performance in mind.  When the time comes to collect summative data, we should be collecting data related to the performance, not the learning.  We can always backtrack to what factors influenced any performance change, including the effectiveness of each aspect of the learning program.

As I mentioned earlier, when I talk about the results of a learning program with stakeholders, I'm sharing a story, not data.  Therefore, I rarely talk about measurements of the learning program itself.  I'll talk about measurements related to performance, and then I'll build credible connections between the performance and the learning program.

Until next week #lrnchat-ers!