The eLearn Podcast
The eLearn Podcast is your go-to resource for creating successful learner outcomes. Whether you’re an instructor, an eLearner, a parent supporting an eLearner, a content creator or just an eLearning enthusiast, our episodes provide the latest strategies, practices and technologies for supporting, engaging and retaining students... both on and off line! Learn more at https://www.elearnmagazine.com/elp
How To Move L&D Into The 21st Century With Clark Quinn
Hello everyone! My name is Ladek and my guest for this episode is Clark Quinn. Clark is the Executive Director of Quinnovation, an independent learning experience design strategy consultancy that assists organizations to “work smarter” by aligning how we think, work, and learn. He’s also the Chief Learning Strategist for Upside Learning Solutions. But for anyone who has been in the learning space for any amount of time, you’ll also know Clark is a celebrated expert who has been applying evidence-based practices from the cognitive and learning sciences about how people think, work, and learn for organizational performance improvement since the 80s.
In this ‘quinnovative’ conversation, Clark and I discuss
00:00 › Start
9:40 › Clark's Gaps—What are the gaps that Clark sees, time and again, that individuals, schools and companies are not putting into their learning design?
13:51 › Quantified Heart-To-Heart—Why the most important thing, in Clark’s opinion, to improve learning is to start measuring in meaningful ways and why cognitive science and learning science are critical foundation knowledge for all instructional designers
21:45 › Dos And Don'ts—Clark discusses that we shouldn’t be designing a course - or any learning - unless we know what outcome it will be used to achieve in the business or organization… and why ROI should be treated carefully
27:35 › Effectivizing Tools—We end our conversation around a discussion of what tools and processes Clark recommends for designing learning… and how we can make learning most effective in any organization.
Subscribe for the latest news, practice and thought leadership at eLearnMagazine.com
This is the eLearn podcast. If you're passionate about the future of learning, you're in the right place. The expert guests on this show provide insights into the latest strategies, practices,
and technologies for creating killer online learning outcomes. My name's Ladek, and I'm your host from Open LMS. The eLearn podcast is sponsored by eLearn Magazine,
your go-to resource for all things online learning. Click-by-click how-to articles, the latest in EdTech, spotlight on successful outcomes and trends in the marketplace. Subscribe today and never miss a post at eLearnMagazine.com. And Open LMS,
a company leveraging open-source software to deliver effective, customized, and engaging learning experiences for schools, universities, companies, and governments around the world since 2005.
Learn more at OpenLMS.net. Hi there, my name's Ladek, and my guest for this episode is Clark Quinn. Clark is the Executive Director of Quinnovation, an independent learning experience design strategy consultancy that assists organizations to work smarter by aligning how we think,
work, and learn. He's also the Chief Learning Strategist for Upside Learning. But for anyone who's been in the learning space for any amount of time, you'll also know that Clark is a celebrated expert who has been applying evidence-based practices from the cognitive and learning sciences about how people think,
work, and learn for organizational performance since, you know, since like the 1980s. In this innovative conversation, Clark and I discuss what the gaps are that Clark sees time and again that individuals,
schools, and companies are not putting into their learning design. We then talk about why the most important thing, at least in Clark's opinion, to improve learning is to start measuring in meaningful ways and why cognitive science and learning science are critical foundation knowledge for all instructional designers out there.
Next, Clark discusses what we shouldn't be doing when designing a course or any learning, unless we know what outcome it will be used to achieve in the business or organization that we're working for and why return on investment or ROI should be treated very carefully.
We end our conversation around a discussion of what tools and processes Clark recommends for designing learning and how we can make learning most effective in any organization. Remember, we record this podcast live so that we can interact with you,
our listeners, in real time. So if you'd like to join the fun every week on LinkedIn, on Facebook, on YouTube, just come over to elearnmagazine.com and subscribe. Now, I give you Clark Quinn.
Hello, everyone. Welcome to the eLearn Podcast. My name is Ladek, as you've probably heard now, at least four or five times. I work with a company called Open LMS, but this show is not about me.
This show is about my guest here today, Mr. Clark Quinn, from Quinnovation. How are you, Clark? I'm good. Thank you, Ladek. It's excellent to finally meet you. As I do always,
where do we find you sitting in the world? I'm in Walnut Creek, California, which is Bay Area, a little bit inland. Strangely enough, I know exactly where that is.
How's the Bay Area? I mean, we're recording this on September 5th. If I've been reading the news correctly, it has been something of a crazy weather year. Does that mean cool,
hot, wet, windy? What? Yes, all of the above. Crazy weather. We had more rain than usual. Our temperature has gone up and down more.
Instead of a steady heat with a few hot days, we've had some really hot days. We've had some really cold days. It's just wackier. General chaos, much like the L&D universe,
I think, over the last four or five years. Before we dive into our conversation today around how we actually move the L&D world into the 21st century,
because we're already 23 years into that century, tell us about who you are. I mean, so many people have already heard your name, but give us a few seconds on who you are and what you focus on.
My sympathies. So I saw the connection between computers and learning as an undergraduate and designed my own major. It's been my career ever since.
Back then, we didn't have the term e-learning. Whoever coined it in the late 90s, we were doing this stuff a long time before then. My first job out of college was designing and programming educational computer games, and that focus on combining engagement with effective education has remained a recurrent theme in my work,
including games. I went back because I knew we didn't know enough, and got a PhD in what was effectively applied cognitive science. I did the academic route for a while,
joined a couple of government startups, and came back to this country to, you know, help develop an adaptive learning platform a couple of decades before we started really doing this.
And for the past couple of decades, I've been an independent consultant assisting organizations to take what we know about how our brains work, how we think, work and learn, and apply that to designing solutions to make organizations more effective.
Fantastic. I am sure that you have helped thousands and thousands of people. So what do we get right in L&D now? You know, it's 2023.
And what are we doing right, now that we've come out of the pandemic? We're in this now definitively hybrid universe, so online learning, or the virtual environment, is no longer a nice-to-have.
It's just that everybody's assuming it, et cetera. What are we doing well? - We are learning to use the tools in ways that are effective.
I sort of argue that after 9/11, nobody wanted to travel. We started saying, "Oh, let's do e-learning." But the problem is we gave everybody authoring tools and we took anybody who'd been a face-to-face trainer and said,
"You're now an instructional designer." We had a lot of information dump going on. I think over the past couple of decades, since then, we've begun to be aware of learning science.
We're beginning to incorporate that into what we do. As you point out, with COVID in particular, we learned we could do online learning and it wasn't horrible.
It wasn't necessarily better. There was some research done from SRI that said online is actually better than face-to-face. But there was a caveat to that and that was, if you take the time to redesign,
and really it was to me more about the design than the technology, and I think that's still true. So I think we are increasingly aware of what we can do. There's still a bunch of structural constraints that make it difficult to do as well as we can.
And some of that blame, we can lay at our own feet as well. - So what are those common barriers? I guess, before I ask you about the barriers,
what is it that we don't do well? Like, if we're starting to get it, if things are being produced that, hey, we're learning some stuff, learning science is coming into it, cognitive development, et cetera, what are those gaps that you still see time and again every time you sort of work with clients?
- Yeah, it's funny. Depends on if you see the glass half full or half empty. I was in a conversation with our colleague, Will Thalheimer, and he was going, oh, you know, L&D really wants to do better,
but the layers above keep constraining them to just do ordinary stuff and don't resource them appropriately. But I came back and said, yes, but we're also our own worst enemy.
So what we're doing is, despite the fact that we're increasing our use of learning science, there's still a lot of places that aren't. We're still doing a lot of information dump. There's this view that if we give people new information,
they'll change their behavior, despite the robust evidence that that's not true. You can look at behavioral economics or cognitive psychology, and you will see that people do a lot of irrational things,
right? And we now have evidence that what it takes to make persistent behavior change requires a fair amount of practice across appropriate contexts, and I don't see enough of that.
We still train until they get it right instead of until they can't get it wrong. And I would argue the most important thing we're not doing is we're not measuring it in any meaningful way. We measure completion.
We may measure smile sheets and see if they liked it. The only problem with that is the correlation between people's assessment of how effective learning is, and its actual impact is about zero.
It's 0.09, which is zero with a rounding error, and that's just not very good, right? We really need to be looking at it. And then, you know, once we start measuring, we can start looking at other approaches.
There are times when trying to put it in the head is just wrong. If it's too long between the time you learn it and when you have to practice, it'll be gone. Put the information in the world. Job aids, performance support,
we're neglecting that. We're neglecting our potential role in informal and social learning. You know, when you're doing innovation, when you're doing problem solving and research and design,
you don't know the answer when you start. That's learning too. It's not formal learning. There isn't anybody with the answer you could bring in or you would, right? But you need to figure it out. You need good practices of learning,
good practice of experimentation. We can help there too. And that gives us a much more central role to the success of the organization. Increasingly, people are saying, you know,
as things go faster, continual innovation is going to be the only sustainable differentiator. We could be there. We should be there. Hi there. I'm sorry to break into the show right now, but if you're enjoying this show,
if you are challenged, if you're inspired, if you're learning something, if you think that you're going to be able to get something out of this to put into your practice, do me a quick favor. Pause right now and just hit subscribe on your podcast player right now.
It doesn't matter which one. Just hit subscribe because that way it'll make sure that you never miss an episode in the future. Thanks. Now, back to the show. Yeah, 100%. So tease that out a little bit for me because I get fascinated by this particular piece of the conversation personally.
How do we participate as a learning function? And I want to be clear, and I'd appreciate it if you'd maybe make the distinction for me.
Does what we're going to talk about, does this bleed into what we would consider traditional higher education, K through 12, everything? Or are we talking specifically about just adult learning? My short answer is yes.
It should and could bleed into K-12 and higher ed, and right now I'm largely talking organizational learning and adult learning. I don't like the andragogy movement, because what that did was say,
oh, well, creative and meaningful learning is what adults need, and we can do anything to kids. That's not true. They care about meaningful learning too. Our brain architecture doesn't suddenly change when we become an adult.
In fact, what is the measure of when we're an adult? It's pretty fuzzy at best, right? It depends on a whole bunch of things. Some people are thrust into it well before they're ready.
So the point I want to make, you know, how do we do this? How do we get started? You know, I argue that the most important thing for formal learning is to start measuring, as I said. But for informal learning and innovation,
I think the first step we have to do is start doing it internally. We need to start experimenting systematically, smartly. We need to start sharing where we're at and where we're going, and working out loud.
We need to find effective ways to communicate and collaborate. When we get ownership of that internal to L &D, we then have a credible case to say,
we're now ready to take it forward to the rest of the organization. So I think the way we get there is by demonstrating it, practicing it ourselves and making sure we understand it and take advantage of that.
- So we had two people who commented, actually, before we even started today, which I found fascinating. I wanna thank Christina Rodier, if I'm saying that right, and Anna Santos for chiming in and saying hello there on LinkedIn,
but let's just assume Christina and Anna are learning designers. They're in a corporation somewhere. I have no idea what their background is. They're probably both excellent professionals, but take me through the process or what you're thinking would be.
If I'm either a junior or mid-career learning designer, I'm in an organization that's thriving, I am excited, and I wanna be more effective. What's your sort of sage advice?
What are the steps I'd take to do this internally, prove it works on us, and then start to be a champion for it and be effective within the organization? - Well,
in a learning designer role, the first thing I truly believe you need is to understand the cognitive science and learning science. You just need that core human information processing loop: to understand how we perceive,
how what we attend to is what gets into working memory, how we get what we need from working memory into long-term memory, how that works, how we need to get it back out.
You really need that core basis, and then there are layers on top of that, the role of mental models, the role of examples, the nuances around practice,
there's a lot more that goes into that. You have to understand that actually applies not only to formal learning, it applies to design and performance support, it also applies to the approaches you take to foster innovation.
For instance, we know that hearing somebody else's thoughts before you formalize yours kind of constrains your thinking, which is why they said brainstorming didn't work, because that old model of brainstorming was bring everybody into a room,
share the problem, start talking. You need time for everybody to think on their own. You won't understand that if you don't have this basis of cognitive science. Then you need to look at what the evidence says about how people work best.
And that includes having time for reflection, valuing diversity, not just tolerating it, but valuing it, having an open mind to new ideas,
having a process to explore them, having a tolerance for failure, recognizing that with experiments, there's a 50/50 chance it won't work. Know what you're going to test and what you will learn from it either way.
So therefore, when you do have a failure, it's not a loss, it's a learning. Those sorts of principles, then, you apply internally. So you apply them in your design solutions for others,
but you start applying them internally to create a self-improving loop. And then finally, when you understand it, know it, then you have a basis to take it out to the rest of the organization. So that's sort of my process:
for your formal learning, get that core down and then start measuring and have evidence that'll tell you whether what you're doing is good or not. And I'm not talking about measuring how much does it cost to have a bum in a seat for an hour.
I'm talking about measuring, is this moving a needle in the organization? And then internally start practicing all those innovative, continual improvement practices,
take ownership of them and then bring them to the rest of the organization so we can help you be more effective too. - Have you found, oh, sorry, I apologize. Have you found, along that pathway,
there's kind of two thoughts I have there. Understanding the cognitive science: we just had another comment come in here. It doesn't give your name, unfortunately, but they said they love focusing on neuroscience as well,
you know, sort of understanding the cognitive science, understanding the learning pathways, the modalities, etc. Is it just your own personal savvy of knowing when and in what situation to say,
hey, actually, we designed this because, you know, Kirkpatrick says so, or, in other words, just a hey, no, look, we're gonna do it this way? I'm not sure if I'm asking that question well, but you see what I'm saying,
like, what savvy do you need as a learning designer in an organization to know when to be the smart ass, when to be the academic, and when to just sort of lead?
Um, yeah, it's easier to get forgiveness than permission. So many times I argue, you know, do the right thing. You can do better multiple choice questions, not just knowledge tests,
but are actually mini-scenarios. You should just already start doing that. Um, the organization I'm working with, Upside Learning, they are beginning to do that, right? Um, but I think the point of citing people, and when you should share your knowledge, is when people are trying to push back on your expertise and tell you,
oh, I understand learning. No, you don't. Um, that's my area of expertise. So you need to maintain their respect for your expertise just as you respect their expertise.
That's a really important boundary that has to work there. Um, but then you have to understand that despite the robust decades of evidence we have about how important,
you know, uh, spaced practice and repeated practice and varied practice are, the exact amount depends on a number of factors, like how complex it is inherently,
uh, how frequently you perform it in the real world, how important it is if you get it wrong. Nobody's got the exact formula. So you're going to create your best guess based upon those scientific principles.
And then you need to test it to see if it's met your levels, and you should have expectations that say this will be good enough, this won't, and you test until you reach there. That's how we should be doing it.
That's how it works in other industries and other areas of the business. We should be doing that too, not just if we build it, it's good. - Okay, so then take me there,
because I love the idea, and from your perspective, it's like, look, get your house in order, and I say this to my kids all the time:
beg for forgiveness, don't ask for permission. Anyway, they didn't hear me say that. So getting your house in order is one thing,
but now talk to me about the measurement side of things. If I'm in the C-suite, if I'm at the EVP level, I'm interested in business goals, I'm interested in business metrics and whatnot. What is your advice,
or how do you work with others to be savvy about talking to that crowd and making learning important, making learning essential,
and showing how this piece within the organization that everyone is touched by is moving metrics? - Right, it's a complex story.
But the fairly simple version is, we shouldn't be making a course unless we know how we will be sure it's worked.
In other words, we shouldn't just take an order, we need a course, and go to the SME, and they'll give you an information dump, and you dump that information on learners, and we're done. That's not gonna make any change in the organization.
So we need to work with our business partners. When they come to us, we should have a process that says, here's how we break things down. Here's how we ask the questions.
Here's how we make sure this is a situation where training will help. There are lots of performance problems that training won't address. You know, people could do it if their life depended on it,
but they're not, for other reasons. Training isn't going to change that, probably. And if it's the wrong people or they don't have the right resources, if there are unrealistic expectations,
they're not going to be able to do it. So then, you know, there are times when it's better to create a job aid than to create training. And then when we create training, we should make it right. When it absolutely, positively has to be in the head,
we put it in the head effectively, and we demonstrate and show that not only were they able to do it at the end of learning, they're doing it in the workplace and it's changing things. That's the measurement we need.
That's what we should be taking to it. I continue to think that eventually, CFOs are going to get wise and come back to us and say, how do we know that the money we're giving you is actually leading to change in the organization instead of it's just we give you money and everybody believes it leads to good outcomes.
Eventually, I think somebody's going to start calling us on that and we have to be better prepared. I'd rather us do that before that happens. Sure. And, this is a totally innocent question, just because I'm always sensitive to that, where it's like,
hey, look, I gave you a dollar or, you know, a peso or whatever it is, and I want you to show me how that moved some needle, you know, some percentage or whatever. In your experience,
are there other conversations that are more important? Obviously, businesses are, you know, they're economic vehicles, you know, that's a really, really important part of the conversation. But we also know that,
you know, employee morale and well -being and, you know, onboarding new employees is an important process, etc., etc. Are there ways to talk about successes that you've had,
you know, sentiment and those kinds of things that aren't necessarily dollar-for-dollar tracking? Or is that always the case? - I argue that in general,
pretty much everything, there should be metrics that show us what we're doing and we can attach dollar cost to those. I don't believe in pure ROI because you could have a very small initiative that has a much better ROI than this one,
but this one's gonna have a much bigger impact and it's probably a better value because it's more important to the organization. But you do need to say that what we're spending is worth what we're receiving for it.
But even things like onboarding, how much time till they're proficient is a metric. It's not just, oh, they're onboarded. How do you know they're onboarded? How do you know they won't leave? What have you done?
Are you reducing attrition? Are you increasing the quality of people? Are you building the right skill sets amongst the employee base? There are lots of things, and this happened a long time ago.
People were talking about social media. They were saying, oh, well, if people are active in a social environment, that's your metric. And they'd charge by that, and it was like, well, no, if you put it in sales and you have activity,
shouldn't metrics in sales improve? If you put it in operations, what metrics about reducing error rates or increasing throughput or whatever should occur from having people conversing and becoming better at problem solving effectively?
And so I don't totally believe in metrics. I believe there are some things that we possibly invest in because we know they're good, like learning to learn.
But I think eventually we can tie real value to this. And I think we should at least start. So talk to me about learning experience design, or the design of the learning experience, both in terms of the tools we use,
the platforms we use but then getting creative about the variety of experience. You've touched on it a number of times here over the last 20 minutes already, where, hey,
a job aid may be a more appropriate piece than formal learning. And in this case, you may need to send people to an LMS, but in other cases, this should be a mentoring relationship. Talk to me about those complexities and managing that portfolio,
either as an individual or as a team. Is this something where, again, it's another area of potential specialization for an L &D person, or I'm gonna leave it there. What does your experience show you?
- To me, it's not about the platform or the tools, it's about the process. It's about starting up front and analyzing who is the audience, what is the need, what cognitive changes do we need?
So getting back to the cognitive and learning science. Then we say, how do we deliver it? What experience will make that change? Then finally, we can get to the platform and tools and say,
which one will allow us to build that? And there are other factors that go into it. Is coaching easily available, or is it a very hard to manage resource?
Do we need an immersive environment? Too, we have a bad habit, back to your very early question about what we're doing wrong: chasing shiny objects,
and I argue we'll get much more value out of investing and making sure we've got the learning science and the process down before we go to VR, AR, AI, all the acronyms.
We need to understand what cognitive changes we need, and then how do we bring those about? And we then create an experience that includes that. And it might be a simulation practice followed by coaching in the workplace.
In fact, that's probably a good recipe anyways. I love the person who told me that they never released learning without having also addressed how the coaches or supervisors are going to support this.
That was a really interesting, you know, sensible thing to do, because you can send somebody through learning, they go back to the workplace, and it immediately gets extinguished:
"That's not how we do it here." How do you make sure that that coaching is going to happen and happen effectively? So it's a bigger shift.
So yes, you start with designing better learning experiences and measuring it and looking for outcomes. Ultimately, it'll come into creating a culture where there's inbuilt coaching as part of how people act.
Increasingly, there's arguments that management is really becoming coaching. What you're engendering in my head right now, what's popping up is the thing of the elementary school sort of classic model that you and I grew up with where it's like your kid came home with their math textbook.
But then there was also the textbook for the teacher. You know what I mean? And I feel like would you agree there's an analogy here where it's like, look, you're going to build maybe a skill building thing or a soft skill thing or whatever for the team here.
But then leadership also needs to have something as well to say, hey, look, here's what we just taught them. Here's how you support it. Does that make sense? Oh, absolutely. Absolutely. To be quite honest, I don't think I've ever seen that done.
But I do see, increasingly, that people are saying, let's have our managers involved. When you do a big program, you train the staff, you train the managers. And increasingly,
you have to think about any intervention as really an organizational change. You need to treat it as such, and you may need to sell it to people. I loved hearing Peter de Jager talk about change.
He said, there's this myth that people resist change. People don't resist change. They make changes all the time. They get married. They change jobs. They buy a new house. They resist changes that are imposed upon them, that they don't understand,
that don't make sense, right? Right. What differentiates those are changes that they've chosen. He says, make it a choice. Say, we can stay here or we can go to there. If you do that right,
it should be pretty obvious this is better. Then you sustain it through. That happens down at the e-learning level as well as at the changing-the-culture-of-the-organization level.
You tell me if I'm crazy, but I had this brainwave. Again, I'm not making this up. This is 72 hours old for me. I was taking a hike with a friend. On the hike,
at least two or three times, they referred to the fact that they are not a scientist. I'm not sure what we were talking about. It dawned on me. I said, "Actually, this is really an analogy to what you were just talking about.
You and I experiment every day. We have a hypothesis about, "Hey, will my car start when it's cold? I'm going to go try it." Then you work through that hypothesis and then, "Yes, the car started." We do this.
We go through the scientific method. It sounds like you are a proponent of that as well. It's like, "Let's stop assuming that we don't know and let's start assuming that people are more adult,
bigger, human, more mature, all of those things and let's meet them at that level." Would I have that right? I think that's right is that we can be more systematic. One of the things we know is a bunch of our folk beliefs about learning and a variety of other things actually aren't accurate.
We should be fostering that. Way back when the late Jay Cross wrote the book Informal Learning, he talked about how maybe the best investment you can make is helping your people learn better.
We had this initiative, the Meta-Learning Lab. It didn't succeed. I think it was a decade or so too early, but that focus, actually, goes back almost two decades now.
But the notion is that if you got your people to learn more effectively and work more effectively, be more systematic, more scientific in just their everyday things, you get this improvement, and it may be the best investment.
We see this in software engineering with Watts Humphrey's personal software process and team software process, where all they did was review what they did, find their own mistakes, and improve them, and that led to big improvements in their software process.
Why shouldn't we do that in the learning process and any other process we're engaged in? And yes, one of the reasons I think more people should understand cognitive science is almost everybody in some way designs for humans.
And if you understand humans better, and not the myths... You look at a whole bunch of HR. It's interesting you were mentioning earlier about, you know, morale and stuff. Some of that stuff, I'd say that's not L&D's role,
right? There are other people who are supposed to be responsible for that. But I think in all these things, the more you understand people and their foibles, and don't have the misconceptions... We see a bunch of practices in HR that are still essentially based on us being formal logical reasoners,
like yearly reviews. The thought that feedback, you know, some months later is actually gonna lead to any meaningful change isn't helpful. You need much more frequent feedback to improve.
It's a delta, it's not a step thing. And there's a whole bunch of other evidence that this isn't accurate. And if we were more accurate, if we designed organizations to work more closely to the way our brains work,
we'd be far more effective. - So, you know, on the screen right now for those who can see it (if you're listening to the podcast, you can't see it), Clark has put up there that he,
you know, is known for Quinnovation. That's, you know, one of his companies. So I wanna take you down this road for just a little bit, because you said, you know, all those acronyms: we've got virtual reality,
we've got augmented reality. And now of course, the talk of the day is, you know, artificial intelligence. Talk to me, what's been your experience about how that has impacted, or is impacting,
our ability to create excellent learning experiences? Or, you know, is it again another shiny object that you think people are chasing while forgetting the fundamentals? Again,
I'm gonna lead the horse there, as you did, obviously, but talk about what's your opinion around these particular technologies, especially in the virtual realm and the artificial intelligence realm, and what do you see happening in the near term, like the next two years or so?
Um, well, we are chasing the shiny object. I was recently reading something that said, you know, there is a psychology of the novelty of the new, and we chase it, and it's understandable. We are curious,
you know, our curiosity is one of the features of our cognitive architecture, and to the good overall. But I think, yeah,
you may have seen the Gartner hype curve, you know. So there's this over-inflated hype, and then there's this trough of disillusionment, and gradually we find the real value. And as somebody once said, we overemphasize the short-term benefits and underemphasize the long-term consequences. And I think that's possibly true.
So what I regret is us not using these in sensible ways. There has been a long history: way back, Doug Engelbart, in his mother of all demos, showed how technology could augment our cognitive capability. And that's the right way to think about it.
It's not man or machine. It's not person or processor. It's how do they work together best? And if you look at, you know, chess, a place has emerged where, um, human-computer hybrids succeed better than the best humans or the best computer programs.
And the ones that win aren't the best chess player or the best chess program; it's the ones that work together best. And so I think VR is great when we know when to use it critically,
but not throw it at everything. And, you know, I flash back to the days of Second Life and going into an event in Second Life. And it was a PowerPoint presentation in Second Life,
which was a crazy thing to do, because the cognitive overhead and the processor overhead were totally unnecessary. We had webinar software. We didn't need to do a PowerPoint in Second Life.
We should do something more meaningful. The same holds true with VR. We should be using it when we need that immersive social capability. AR when we need to augment. And I think AR is more a performance support tool than a learning tool.
We've kind of gotten away from ARGs, which were really interesting: alternate reality games. I don't know if people remember that acronym, but those actually had power for learning. And we've kind of moved away from them.
AI: the hiccups in generative AI right now, the hallucinations, and Markus Bernhardt talks about this pretty well, really mean you can't trust it alone.
I hate when I see LinkedIn sending me an article generated by AI and asking me to comment on it. No, I'm not going to waste my time. You first have your own writers read it. I'm not going to fix things for you.
- I want to point out to everyone that this is something that LinkedIn has started recently, where they have what they think is an important topic for their community and they actually have an AI-generated article.
And then they say, hey, we think it's a good idea to community-source it, come and correct it and this and that. It's like, well, why don't you get it right the first time? So I don't know. - Because one of them came out saying learning styles,
which is a robust belief, and it's been thoroughly debunked scientifically. Talking about that in an article just shows that these tools are good at finishing sentences in ways that they've seen other people do,
but they don't have any understanding of what's right or wrong or true or false. So the people who are using it as a thinking partner, I think, have it right. See if it generates something you didn't think of to put into your article,
see if it can frame it in an interesting way, but you have to review what it says and take ultimate responsibility for it. So that goes back to the overall theme. Use technology as a partner to our brains,
not as a replacement. Together, the whole is greater than the sum of the parts, but you have to understand what they each contribute to be able to do that well. Clark,
I've already taken up 40 minutes of your day, and it's amazing to me how quickly this time always goes, but I want to give you the opportunity to kind of tie it up in a bow there. You know, if we were to think about bringing, you know,
L&D into the 21st century again, how would you summarize it so that, you know, people can take it away from this conversation? I would suggest that we had a major revolution in cognitive science that went from formal logical reasoning to this more contextually sensitive, emergent cognition pattern.
We need to acknowledge that in what we do in our learning designs, in our organizational designs, in our societal designs, that we also have had technology,
continual technology development. The internet, I think, was a major revolution. I was actually late to it because I had been following technology things.
So there was Usenet before the web. There was WAIS and Gopher and these other tools. So when the web came out, I thought it's just another one.
I was wrong, and I'm willing to be wrong. You and Bill Gates both. It's okay. No worries. So the point being that we need to put these together in systematic ways,
understand technology, understand our own thinking, and then figure out how they work together best. When we do that, we have the opportunity to take L and D and organizations forward more effectively and that's what we should be looking to do.
Fantastic. Clark Quinn, I cannot thank you enough for taking time out of your busy day to speak with me here on the eLearn Podcast. If somebody wants to find you, what's the best way for them to reach out? Quinnovation.com is the best way to reach me.
I'm wearing several hats now. I'm a co-director of the Learning Development Accelerator. I'm Chief Learning Strategist for Upside Learning. I'm also an advisor to Elevator 9, several full-time jobs.
But Quinnovation.com is the first place to find me, and from there you can find my thinking on my blog at Learnlets.com. I tend to think out loud. And thank you a lot for the chance to talk to your audience and talk through these ideas.
I greatly appreciate it. And I believe, you know, we're recording this in September, but you're going to be speaking at DevLearn in October. Is that correct? That's correct. I'm running a pre-conference workshop on better learning design,
the emotional component, which we tend to neglect. Not only do we neglect the learning science, but there we have good advice about it. Engagement matters too, and we don't have as good advice about that.
So I'm running a workshop on that, and running a session on the learning science side of it. Retention and transfer should be our two goals. So yes, I'll be there. I look forward to it; if you're there, stop by, say hi. Thanks so much.
Have a great day. Thank you again for listening to the eLearn Podcast here from Open LMS. I just wanted to ask one more time, if you enjoyed this show, if you learned something, if you were inspired,
if you were challenged, if you feel like, you know, this is something you can take into your practice, please do me a favor. And right now on your podcast player, hit subscribe. That way you're never going to miss a future episode.
Also, come over to elearnmagazine.com and subscribe there as well, because we have tons of great information about how to create killer online learning outcomes. Thanks.