Hello Learning Nerds!

My purpose is to help you along your instructional design journey. Whether you are looking to learn about designing meaningful learning experiences or landing an instructional design position, this site is for you. This site is dedicated to sharing the best instructional design tips, talking to amazing guests, and more.

EP-34: Smruti Sudarshan - Educational Data Mining for Instructional Design

As instructional designers, we should be using data to help inform our decisions when it comes to the learning experiences of our courses and programs. Data can tell us so much about our students through their learning behaviors, learning analytics, and even instructional design principles. How do we interpret this data, though? One compelling approach is to use Educational Data Mining (EDM). Joining us today is Smruti Sudarshan, an eLearning Training Specialist at LinkedIn, and she'll guide us through everything we need to know about EDM.

Connect with Smruti:

New Ebook:

To learn more about myself and the show:

***My YouTube / Podcast Set Up***

*Book Recommendations*

*Trainings and Business*

Transcript

SPEAKERS

Luke Hobson, Smruti Sudarshan

 

Luke Hobson  00:00

Today's episode is brought to you by IDOL courses Academy. If you're looking to transition into the corporate instructional design space, you need the right guidance to do so. Dr. Robin Sargent and her team have done an amazing job of focusing on what's important to be a corporate instructional designer, like storyboarding, ID models and theories, interviewing SMEs, creating in Storyline, and project management tactics. They even cover how to make your resume and portfolio stand out from the crowd, and they have an impressive completion rate, with their students working at organizations like Google, Salesforce, GM, Uber, and Amazon. You can actually hear more from Robin in her episode on this podcast, which is episode number 26. So if you are serious about becoming an instructional designer, or looking to further develop your ID skills, I would highly encourage you to check out IDOL courses Academy. I'll put the link in the show notes today, and tell her that I sent you over there. And now let's get this episode started. Hello, everyone, and welcome on into the nerdiest podcast you're gonna hear today. My name is Dr. Luke Hobson. I'm a senior instructional designer and program manager at MIT. I also have my own blog, podcast, YouTube channel, and courses, all about instructional design. My purpose is to help you make the online learning experience meaningful for you and for your students. And you can find all my information over at drlukehobson.com. Speaking of websites, my wife and I have been diligently working on a new website for the book, which means that I owe you an update. The ebook has been written and it has been edited. I decided to call it What I Wish I Knew Before Becoming an Instructional Designer. I wrote the book based off of questions from people like you, and those became the chapters. If the title sounds familiar, 
well, that's because it was based upon the blog, podcast, and YouTube episodes that were all about this exact same topic: going back in time, giving myself career advice, what would I say in order to really try to help me accelerate my career growth? And that's what this book really ended up becoming. And Dr. Karl Kapp actually called it a roadmap, talking about how to essentially navigate your instructional design career. What do some of the chapters look like? Well, they include: What does an instructional designer do? What are the pros and cons of instructional design? What kind of instructional designer do I want to be when I grow up? Where do I see myself five years from now? How do I teach myself a new skill? How do I build a portfolio? And many other chapters, 18 in total, to really help you with navigating your instructional design career. A humongous thank you to Jen Abbott, who helped me with editing this book as quickly as possible; the timeline was absolutely insane, and she was just so fast and helpful for me. I cannot recommend her enough if you are looking for an editor, of course, if she has the bandwidth to do so, but she was absolutely wonderful to work with. And I just mentioned him, but once again, I have to give a humongous thank you also to Dr. Karl Kapp, who read everything, also at the speed of light, wrote the foreword for the book, and said so many kind and incredible things about it. So I'm just absolutely so pumped to be able to share this with you soon. The manuscript is actually now in the hands of my designer. Since this is going to be an ebook, I want it to look perfect, so I did hire a professional to really help me out with everything. Something that I wasn't expecting is that a lot of you want to preorder this book, which absolutely means the world to me, by the way. So thank you even for just considering doing this.
And I'm currently working on the back end, trying to get this up and running to give you the option to do so. I hope to have this done within a day or so, and as soon as it's ready, I'm going to blast out a link to everyone as far as how to preorder it, where to find the link, and everything else of the sort. Once it's all said and done, I'm hoping the book will be live within a week or two. To give myself a little bit of a buffer: it should be done actually within five days, but knowing how things work with websites and setting up different pages and blah, blah, blah, there's usually some additional stuff in the back end. You don't really realize it until you start testing it and you're like, oh, I need to write the copy for this, or, oh, I need to change this link, or whatever. So I'm hoping within a week or two, this is going to be out there. I'm certainly going to be keeping you updated every step of the way. If you're in the Instructional Design Institute community, you'll definitely receive my update inside of our Facebook group, so join that one if you haven't yet already. And of course, through the mailing list and my different forms of social media, I'll be sure to update you. And some of you mentioned to me how you've been enjoying hearing about the progress of the book this month, just getting these updates. And maybe you're considering writing a book yourself, which, I'm guessing, is probably why you kind of like this whole insider knowledge about how I've been putting a book together in only, actually, it's been six months officially. So the book from start to finish is going to be done in six months. So if you were curious about just how I did this, let me know; maybe I'll make something like a YouTube video or a podcast episode on just how I did this in six months without losing my sanity, while still working, essentially, you know, two full-time jobs and still being a husband and dog dad and everything else as well.
Okay, but that's enough of the update. Let's talk about today's episode. As instructional designers, we should be using data to help inform our decisions when it comes to designing the learning experience for our courses and for our programs. Data can tell us so much about our students, with their learning behaviors, with learning analytics, and even just basic instructional design principles. How do we interpret this data correctly, though? That is certainly the biggest question. One compelling case on how to do this is called educational data mining, commonly referred to in the industry as EDM. To educate us on this approach, we are joined today by Smruti Sudarshan. She's an eLearning Training Specialist at LinkedIn, and she is about to become your go-to person, your tour guide, if you will, all throughout data, and really just how we can learn from it, what the main takeaways are, and everything of the sort. I'm not going to take up any more time. Here is the one and only Smruti Sudarshan. Smruti, welcome to the podcast! Oh, hello, thanks for inviting me on this! Absolutely, I am so glad that you are here, because you are bringing on some expertise that we haven't talked about at all yet on the podcast. Somehow, in the 30-something-plus episodes, the entire concept of data has not been explored to the depth that we're going to be doing today. So I'm really excited to have you come on and share everything you know about that with us. But before I get ahead of myself, for the folks at home, can you just introduce yourself? Tell us a little bit more about your background, who you are, and what it is that you do?

 

Smruti Sudarshan  07:24

Oh, yeah. So I am an eLearning Training Specialist, and I work for LinkedIn. So basically, this role kind of opened up because of the fact that LinkedIn wanted to explore eLearning, and they didn't have any self-directed courses. As you've seen on LinkedIn Learning, it's just like videos, videos, videos. So they really wanted to explore the eLearning benefit. And there's me and one more teammate of mine, so both of us work for LinkedIn as eLearning Training Specialists. So we specialize only in the eLearning aspect and the digital learning aspect.

 

Luke Hobson  07:59

Fantastic. And as somebody who loves LinkedIn, like, that has somehow become my jam, so the fact that you work at LinkedIn is so awesome. When people talk about social media platforms, they're like, you know, which ones do you use? And I have to use all of them, because it's unfortunate that, like, you need to be on everything all the time. But if I had just one preference, it's let me go on LinkedIn. That's my home. That's my bread and butter. I like LinkedIn more than anything else. So the fact that we can explore and kind of talk more under the hood and talk shop about this side of LinkedIn is really cool. So, like, I am super excited to talk more about everything. But speaking of all of this, what we're going to be talking about this evening is educational data mining, or EDM. We're going to obviously get into that this evening, but for some of the folks at home, I know that there are just a couple of different definitions that we really should go over before diving in too much, because if not, I fear that we might lose some people along the way. So I just want to start with the basics. Can you give us a general overview of just data mining itself?

 

Smruti Sudarshan  09:05

Yeah. Okay, so is it okay if I take you through an example from your life?

 

Luke Hobson  09:11

Yes, please do.

 

Smruti Sudarshan  09:13

So what happens usually in a classroom, or in any kind of ILT or virtual reality training, is that sometimes they have their cameras, but usually in a classroom, you're actually on a face-to-face platform. So there, you can see your learners' facial expressions. So, say, when your lecture is getting boring, or when your content is getting boring, what do you do? You either throw a piece of chalk, or, if they're yawning, you just give them a pop quiz, and you say, okay, this is a surprise quiz, because you guys have got to attend to everything right now. So those are the kinds of facial expressions that you can gauge while you're there face to face, or while you're there on ILT out in the field. But what happens when you come online? The only thing that a learner is interacting with is your computer. And how do you measure that? So there is one measurement term in that case, which is called digital footprints. And those are similar to what your learners' expressions are. So for example, if I take Amazon: Amazon has really good tracking footprints on their website, because, if you see, you go and add something to the cart, and the next time when you come on Amazon, you have that same product or similar products popping up on your page, right? Similarly with Netflix as well. So I love horror movies, so whenever I go and click on a horror movie, the next time, the movies it says you would like to watch would be, like, all the horror movies that come up on Netflix. So similar to that, what we're doing here is that we are going to track the digital footprints of a learner, which is either on an LMS, or any kind of classroom, you might say, or any mass education or anything of that sort, which is mostly digital. And that digital footprint is what is put in the database.
We analyze those footprints and get a model, a learner model as such: how exactly learners behave while they're on your course. So for example, companies have these compliance courses, which employees are forced to take, and then they just go on clicking Next, Next, Next until they get to the quiz, right? So what happens during that is that each page that you scroll, or each click that you do, is just for one second. So that's where we analyze: if there are 10 learners on the course, and all 10 learners are just doing, like, a 10-second or a one-second click over there, then we analyze that, okay, this course is not engaging enough, so we'd better go change our strategy, or we'd better go change something else in it. So similar to that is your EDM. So now what we have done is that we've collected the data, we have put it in the database, and we've created a learner model. So how is this learner model created? By mining the data. Mining the data means, like, you know how you go to a quarry and mine coal, or you go and mine ores or any kind of metal? Similar to that, imagine you have numbers in there, and imagine you have, like, tables in there, or some kind of schematic representation. So that is something you've got to analyze, and that is something that you've got to mine. That is known as your data mining. Now, to add to that, educational data mining is where you collect or mine the learner data that you've collected on your LMS, or on your LRS, or wherever you're tracking them digitally, and you analyze that in order to get that behavior model. So that is known as your EDM.
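For listeners who like to see things concretely: here's a minimal sketch, in Python, of the kind of dwell-time analysis Smruti describes. The data, the threshold, and the helper names (`average_dwell_time`, `flag_click_next_behavior`) are entirely made up for illustration; a real LMS or LRS export would look different.

```python
from collections import defaultdict

# Hypothetical click-stream records exported from an LMS:
# (learner_id, page_id, seconds_spent_on_page)
clickstream = [
    ("learner_1", "page_1", 1), ("learner_1", "page_2", 2),
    ("learner_2", "page_1", 1), ("learner_2", "page_2", 1),
    ("learner_3", "page_1", 45), ("learner_3", "page_2", 60),
]

def average_dwell_time(records):
    """Average seconds spent per page view, for each learner."""
    totals = defaultdict(lambda: [0, 0])  # learner -> [total_seconds, page_views]
    for learner, _page, seconds in records:
        totals[learner][0] += seconds
        totals[learner][1] += 1
    return {learner: total / views for learner, (total, views) in totals.items()}

def flag_click_next_behavior(records, threshold_seconds=5):
    """Learners whose average dwell time suggests they are just clicking Next."""
    averages = average_dwell_time(records)
    return sorted(l for l, avg in averages.items() if avg < threshold_seconds)

print(flag_click_next_behavior(clickstream))  # ['learner_1', 'learner_2']
```

If most learners get flagged this way, that's the "course is not engaging enough" signal Smruti mentions, and a cue to change the design strategy.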

 

Luke Hobson  13:00

That was perfect. I was gonna ask you more questions on definitions; you answered nearly all of them. So thank you for that very well done explanation of all of that. Also, random side note: I love how when you buy something on Amazon, like, I just bought a new door lock the other day, and now it's nonstop suggestions and recommended products. Now, please stop! Like, I only needed one, and now all of a sudden it's like, oh, you need all of them? Like, oh, okay. Well, I wish there was a way to give feedback, but that's fine; that's for another day. But that's not what we're talking about; just a random side note over there. Now, the other question I want to ask you about all of this is that this seems like it should be mainstream. I feel like everyone should know about this and how to do it. But I know that for a lot of instructional designers out there, it's not a part of their everyday routine or a part of the process. We say that it is, that we are collecting feedback, we are analyzing, you know, we are doing everything, but then I hear from a lot of people that they're never really able to go and implement this later on, or perhaps for the next run, the next year, the next thing, whatever it is. Why hasn't this become a mainstream part of course design yet?

 

Smruti Sudarshan  14:14

I think it's mostly because of the fact that, as far as I know, in L&D, what we've been doing is designing courses or curriculums pretty intuitively and not letting the data do the speaking. So this is something that I've seen even here; like, you know, most of the companies or most of the educational institutions that I've seen and consulted with, I've seen them doing it more on an intuitive basis. So there was one educational institution that I consulted with, and they had a DTP process, that is, the design thinking process. They were giving out the design thinking process very intuitively, in terms of just having 90-minute sessions. They had, like, a 90-minute or 60-minute session per week, and the students were there online, and they just had to attend, like, one and a half hours of those sessions, and they would be put in breakout rooms and activities, and so on and so forth. But the point is that they never analyzed that they were supposed to have it in a more interactive fashion, like by giving mostly, like, business case scenarios, and those mini challenges, or scenario-based learning where they have, like, branching scenarios, where, you know, you've got to select each one of them, and then at the end of the day, you get some, you know, analysis of that sort. So this was something that they developed very intuitively. And even when the data told them not to do so, they did it, because they felt it was right to do it, and they felt they knew the learners better than the data knew the learners. So I think that is the reason why it's not kind of become mainstream as of now. But in corporate learning, at least, I've seen there's a gradual shift; it's slow, but it's happening. But in educational institutions, the rigidity is still there, in terms of, why are we supposed to use this data? I mean, we are teachers, we know our learners. 
Why are we supposed to use data for this, you know, understanding the process, and then telling them, you need that, you need this, and all that? So I think that rigidity is something that we need to let go of. Yeah,

 

Luke Hobson  16:25

absolutely. Agreed. I know that I have been over here clamoring forever now about why we aren't doing pilot programs more, which is something that has become a part of my normal routine: for every course that I design, every program that I'm launching, there has to be some pilot version, because I want the data. That's what's so important to me. I have already experienced the failure of thinking that I know what learners are going through, only to find out, hey, I was wrong, and I can make something better. So I learned about that a while ago, and I was like, whoops, okay, now I'm just gonna make this a part of my process to make sure that my learners are going through what I'm actually expecting, what I'm really trying to design around. I'm hoping they get this experience out of it, but perhaps really, they're not. So I was served my slice of humble pie years ago, and now it's a part of what I do. But for what you were just talking about, I know that some people really don't like change, and reading data and making this a part of the routine is a big thing, from the individual level but also more from an organizational stance. So what, in your opinion, is an effective approach to try to make some of our key decision makers within our organization adopt the idea of EDM?

 

Smruti Sudarshan  17:34

Hmm, I think that's a really good question. I can answer in terms of corporate learners, because I have a pretty good understanding of corporate learners. So what happens with corporate learners is that they require something much more on the go. So I think in the L&D aspect of it, the instructional designers, or maybe, say, a course developer or a content developer as such, can get this data. So this is something that they need to ingrain, like the data-driven approaches. So there's one example, like a case study, which I can give you. The previous company I was working with, they were handling a lot of, you know, B2C customers. And those customers were kind of getting these professional, certified courses at the end of the day, when they took up, you know, whatever self-learning course that we used to give them. So what happened there was that we had, I think, a 250-page, just-click-Next self-learning course, which had everything to do with, you know, one particular product at that company. And when the data was brought in, and then the net promoter score was brought in, it said that no one is going to get certified, and practically no one was actually getting certified, because that certification cost, like, 250 dollars. And if they just took, you know, the self-learning or whatever they were taking as of now, they would never clear the exam, because they just kept clicking Next; it was just like a textbook approach that they had in there. And plus, we had to invest more in our virtual trainings, because they wouldn't understand what was given in the self-learning that was there. So when we brought in that data, and when we brought in that net promoter score to our content developers, then they understood the fact that, okay, this is dropping because of the fact that we were giving them a lot of Next buttons in there.
So why don't we do one thing: we change our strategy, we give them a visual menu, we give them five different lessons. Each lesson has one learning path, and that path takes you to one course, which is much more interactive. Like, they had a few videos in there, they had thrown in some inline checks, and then they also gamified something in the end, which had a Jeopardy game. And instead of having them go through the entire course once again, the summative assessment was in the form of a Jeopardy game. And for the entire certification course, I think they had an exam workshop or something, and that workshop is where they had this exam blueprint. And they created all of that. So I think this data should really be made available to the content developers, instructional designers, and the learning consultants before anyone else gets hold of it, because they are the ones who strategize everything, and they know what to do with that data much better.
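Since the net promoter score comes up a few times in this case study, here's the standard NPS calculation as a quick sketch: respondents rate 0 to 10, promoters are 9 to 10, detractors are 0 to 6, and NPS is the percentage of promoters minus the percentage of detractors. The survey numbers below are invented purely to illustrate a before/after redesign comparison, not Smruti's actual data.

```python
def net_promoter_score(ratings):
    """NPS from 0-10 survey ratings: % promoters (9-10) minus % detractors (0-6)."""
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical survey results before and after a course redesign
before = [3, 4, 6, 7, 8, 5, 2, 9, 6, 4]   # mostly detractors
after = [9, 10, 8, 9, 7, 10, 9, 6, 10, 9]  # mostly promoters

print(net_promoter_score(before))  # -60
print(net_promoter_score(after))   # 60
```

A swing like this, shown to content developers alongside the click-through data, is the kind of evidence Smruti says convinced her stakeholders to change strategy.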

 

Luke Hobson  20:34

Yeah, absolutely. And, like, for me, it was always like the proof is in the pudding: when I can go and show that, like, look at what we did, look at the feedback, some people were like, ah, got it, we need to do this more. It's like, yes! Why aren't we doing this more? That's awesome. Thank you for sharing all of those examples. So let's say right now I'm an instructional designer. I'm going to assume that before I dive into copious amounts of data, because, as anyone knows, behind the scenes with everything for an LMS or any learning platform, there exists so much data that you can collect, by the run, by the year, by six months, by the individual, you know, whatever it possibly is. So I'm just going to assume that I need to have a strategy for what I'm going to be doing first, before I just throw myself in there and kind of figure it out. So if my goal for step one is going to be, like, learning behaviors, for instance, where do I begin? How do I really start this entire process?

 

Smruti Sudarshan  21:33

So I think the first way to begin this process would be to understand the different kinds of online learners. So there was an age where we would understand, like, you know, what our learners are like; you had, like, seven learner types, I guess, like audio, visual, kinesthetic learners, and all of that. So there were those learners, we would kind of analyze them, and we would provide content and teaching material just for them. But now, it's kind of changed to the online learner aspect, wherein there are just, like, five kinds of online learners as of now. And the most fickle-minded, or you can say, like, the most adventurous among them would be the exploratory learners, who kind of explore each course on your LMS and drop off at one point, and then they would just go and explore another course, and then they would drop off at another point. So I think the starting point should be this: to understand how exactly your target audience is, and what kind of learner they fit into. So for example, if I take, say, an audio learner: audio learners, basically, tend to hear a lot and learn a lot. So in that way, if most of your target audience for that course are supposed to be audio learners, then you give them a lot of podcasts, and then, say, you can give them, you know, an inline check in there, and then later on, just go ahead and give another podcast, and then just give another inline check. But again, this is a very simplified way of saying it. But yeah, so the strategy can change according to what your learner needs. So I think the first step would definitely be to determine who your online learner is, and then go about strategizing. So in that way, you would not make your content specific to you, and you're not designing intuitively, but you're designing keeping the learners in mind. 
So basically, you would be turning into a learner-centric designer, rather than being an intuitive designer.
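One way to turn this "understand who your online learner is" step into something operational is a simple rule-based segmentation over engagement data. The categories, thresholds, and data below are purely illustrative assumptions, not Smruti's actual learner model; the "exploratory" rule is just a rough stand-in for the many-starts, few-completions behavior she describes.

```python
# Hypothetical per-learner engagement summary pulled from an LMS.
def classify_learner(courses_started, courses_completed, avg_session_minutes):
    """Very rough rule-based learner model; thresholds are illustrative only."""
    # "Exploratory": starts many courses but completes under 30% of them
    if courses_started >= 5 and courses_completed / max(courses_started, 1) < 0.3:
        return "exploratory"
    # "Deep-dive": long, focused sessions
    if avg_session_minutes >= 30:
        return "deep-dive"
    return "casual"

print(classify_learner(8, 1, 12))   # exploratory
print(classify_learner(2, 2, 45))   # deep-dive
print(classify_learner(1, 0, 10))   # casual
```

In practice, EDM work would derive segments like these from the data itself (for example, by clustering), rather than hand-picking thresholds, but the idea of mapping behavior to a learner category is the same.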

 

Luke Hobson  23:40

That's really fascinating, all of that, too, because I've learned over the years to also try to think about things more as learner preferences. As far as, like, you don't know what you need, you know; you might think that you really do learn best from auditory, but then maybe I can convince you otherwise, which has been kind of my goal. I'm like, yeah, you don't really learn in one style; like, I bet I can show you a bunch of different ways, and you're going to be able to learn if I design things in the appropriate manner. And speaking of an appropriate manner, you also talked in your blog post about the sequences and the patterns as far as what learners are introduced to. And I've kind of found my own pattern that works well with how I design, like, even just a module, not even a full, you know, it could be a full week or could be a little snippet, where it's always like the introductory reading goes to a video, goes to a practice question, goes to a video. It's like I have it kind of laid out in these different types of blocks. With your research and everything you've gone through, have you found something similar to that, like a certain sequence of how you want things structured in the course itself?

 

Smruti Sudarshan  24:53

The main thing with EDM is that there is no set structure. The structure depends on how your learner wants it. So, okay, let me give you this example again. We had this thing wherein we had this very great, brilliant idea of gamifying the content for field developers. And those field developers were the ones who would take calls online, and they would kind of rectify the customers' issues out in the field. So basically, what would happen was that a troubleshooting ticket would come to you; a troubleshooting ticket is, like, any issue that would come up. A customer would say, you know, I have an issue with my laptop, just go ahead and fix it, for example. So what we would do is that the field developers would call them, and then they would ask, like, what is the issue, and they would troubleshoot the issue on the call itself, or they would share the screen, and they would fix the issue. So we had this really brilliant idea, I don't know where we got this idea from: we thought we would gamify their content, because the content was really good, the content was really nice. And we thought, like, okay, let's just go ahead and gamify it and give it to them. We had all the game mechanics in there: progress bars, scores, and all of that. But the kind of satisfaction that we got, the NPS, the net promoter score that we got from the field developers and the field review, was: we really don't like this course, because it was really strenuous for them to clear each level of the course and then give a solution to the customer. It was not really going with their job, with what they were doing. Plus, the training they needed was, like, you know, they just have this training document, they pick up the training document, and they say, okay, this is the solution, this is the training, okay, now I've learned it, just because I've solved the case.
So it's mostly, like, on-the-job training and on-the-go training that we used to do. So what we did was that when we got the score, and then, you know, when the data said that they're not happy and stuff like that, we went back to the stakeholders, and we said, see, they're not actually happy with the scores, I mean, with the course. You may want to change it to something much simpler, like a microlearning, or, you know, just maybe providing them a job aid, you know, just to help their job, because they know everything; it's not that they don't know anything. And a game is something where you start with, like, an introduction, and then you clear one level, and then there's a mission, you clear this mission, and then finally you move on. So it's just that they just need one job aid, and I don't think much more is needed for them. I mean, maybe you could just eliminate the entire gamified content. But since a lot of investment had gone into it already, like creating the game, and then strategizing the game and all of that, they could not completely get rid of it. But what they did was the target audience kind of changed. So the target audience for the gamified content would be the new hires who would come in, like just-graduated college students who would come in and would just want to learn the job. So that was where they were introduced to the gamified content. But then for the people who already knew the job, who had been in the job, like, for two years, three years, we just provided them with a job aid and, like, a few microlearnings of, like, 15 to 20 minutes, wherein they could just take the call, go through that microlearning, and give a solution on the spot. So with that, what happened was with the number of tickets, tickets as in they keep a count of tickets, a ticket meaning a troubleshooting situation to resolve.
So the number of tickets that were getting resolved was much greater, because they were using this training content. And also the retention rate for the new hires was much higher, since they were playing a game as soon as they came in; they didn't already know what was there, but the gamified content sort of taught them the entire process of what exactly happens during that troubleshooting. So I think, in that way, the data worked out pretty well for us, just to show, you know, how exactly it works for each one of them. So I think, in a way, there is no pattern as such. And even if there is a pattern, the pattern that can be formed is your online learner model. So what I mean by learner model is, how exactly do you see your target audience? So what different learners are there, and how can you cater to them universally? So again, this can be a little bit tricky when you're, you know, on a global scale, and then, you know, you are situated in different countries and all of that, but it does help. Like, we had a huge problem with just global English, wherein a few places, I think in Latin America, and even Africa, and some places in Asia as well, they couldn't understand that English. So customization and localization became very, you know, helpful over there. And we understood this from the heat map, because in those three places, there was no one who was taking that course, since it was an English-only course, but it was a mandatory course they had to take just to do their job. So this is something that we figured out using the data, and we invested in localization.

 

Luke Hobson  30:13

That's really interesting. It reminds me of one of the programs we developed for 3D printing, where we found that the target audience, like you were talking about, could clearly be defined into three different personas. We had the super engineers: 3D printing is their life, it's exactly what they do. Then you have the novices, who are like, I'm kind of curious, I want to dip my toes in here, maybe this is where the future is going, I'm just going to get a jumpstart now. And then we had other people coming from more of a finance perspective: well, my organization is going to start doing 3D printing, I need to understand this more, talk about the cost analysis, is this right for my organization? So naturally, we couldn't present the content to those people the same way; it doesn't work. They're very, very different people, all within the same course. We ended up doing a kind of choose-your-own-adventure, basically. You have different tracks, essentially: oh, you're going down the super nitty-gritty rabbit hole of 3D printing, you're on the expert track; you just want to learn more about costs, we're going to be using these business models and talking about the finances, you're going down this track. And that's how we made it work within our own LMS. That's what your story reminded me of, and that's what ended up happening. So, going back to basically just disproving myself from three minutes ago: the sequences were not the same for everyone, and they need to be customized. So thank you for mentioning that, as I walk back my earlier assumption. That is actually really funny, but super true. By the way, folks, we record this and we just run it live.
If you're wondering, that is true, we do. So, moving on and talking more about everything when it comes to data: we talked about the behaviors, we talked about the sequences, we talked about all these different things. After the data has been collected, I know we're going to have to access reports with learning analytics, supposedly, if our platform has it; if not, well, we'll digress about that a little later on. But suppose it does produce different types of reports that we can really use. How does somebody who might be a novice at this correctly read and interpret that data when thinking about next steps? Because I know that sometimes when you initially read it, you might have one idea about what to do, but in actuality you should be doing something else. So how do you read this correctly, first?

 

Smruti Sudarshan  32:48

So yeah, that's actually a pretty interesting question. There is one design that comes into the picture when you're reading the data, and that's your data-driven learning design. What happens in data-driven learning design is that it's basically a cyclic process. Say, first, you get the reports. Your course has just launched, and you monitor your entire course for, say, one or two months; that's the monitoring phase. Once you monitor your course, you go to the reporting phase. The reporting phase is where most people can get confused: which report to pull, what to pull, how to do it, and stuff like that. That is where this learning design comes into the picture, wherein we have a rubric, kind of plugged into the system. Or, if not a rubric (and this is a very basic version of what I'm saying), there is something known as natural language processing, which reads your entire data online. So it can tell you: is your learner dropping off at this time? What device are they using: desktop, tablet, or mobile? What percentage of the course have they completed? If I'm talking about one set of learners and they've completed just 33% of it, why is it that all learners are completing just 33% of this course? And gamified content does create a leaderboard, right? On the leaderboard, the NLP gives you real-time data. When you take that real-time data, you know exactly at what point each student is and what they're doing.
So say I have student one and student two. Student one clears the level at, say, the 45th second, but student three clears the level at, say, the 65th second. That difference is pretty big: if learner one is clearing it at 45 seconds and another at 65 seconds, why is there a 20-second gap? And there are times when it goes into minutes: if learner one has spent, say, 31 minutes on one level, but learner two has spent just 15 minutes on the same level, why is there such a vast difference in the time they spent? This is all real-time data that you can take. But there's a little trick to how you can do that: in any of your authoring tools, you just have to enable the LMS tracking part of it. And you have something known as xAPI, the Experience API, which was previously called Tin Can API. This helps us track the footprints I was talking about, the learning part of it, and you can pull that data and analyze it in the reports. Again, reports are where our L&D consultant and ID jobs get a little tricky, when you're getting into the numbers and the rows of it. But once you get into the routine of analyzing the data, like the drop-off rate, the time spent, or even a dashboard of how exactly the learners are learning, I think that should pretty much be it. We don't have to go in depth into what it says unless you're very interested in what exactly the data has to tell you. But that's where the reporting comes into the picture. And then comes the design.
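The xAPI tracking Smruti describes works by emitting JSON "statements" of the form actor-verb-object to a Learning Record Store (LRS). As a rough sketch (the learner address, course URL, and activity name below are made-up placeholders, not a real course), building one such statement in Python looks like this:

```python
import json

# A minimal xAPI (Experience API) statement: actor-verb-object.
# All names and URIs below are illustrative placeholders.
def make_statement(learner_email, verb_id, verb_name, activity_id, activity_name):
    return {
        "actor": {
            "objectType": "Agent",
            "mbox": f"mailto:{learner_email}",
        },
        "verb": {
            "id": verb_id,  # e.g. the standard ADL "completed" verb URI
            "display": {"en-US": verb_name},
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,  # a unique IRI identifying the activity
            "definition": {"name": {"en-US": activity_name}},
        },
    }

stmt = make_statement(
    "learner1@example.com",
    "http://adlnet.gov/expapi/verbs/completed",
    "completed",
    "https://example.com/courses/troubleshooting/level-1",
    "Troubleshooting game, Level 1",
)
print(json.dumps(stmt, indent=2))
```

An LRS receives statements like this via an HTTP POST to its `/statements` endpoint; querying them back out, with their timestamps, is what produces the per-second timing data described above.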
So depending on what your reports say is how you're going to design. If you're redesigning your learning objectives, or you want to pivot and say, I need to change my learning strategy now because the learners are not engaged, that's where your design phase comes in. And from design, you go back to monitor: report, design, monitor, report, design. It's a very cyclic process. And it's not that you cannot roll back to an earlier phase; you can definitely roll back. From design, you can go into the reporting phase, check the previous learners' data, and see what exactly you can do better. Basically, what happens is that your training aligns with your business needs. You're aligning your learners, from the beginning, to the business needs, or to the learning objectives that you're going to design, and those learning objectives will definitely be linked to the business objectives. So that is where the entire reporting cycle, the data-driven learning design, comes into the picture.

 

Luke Hobson  37:53

I'm really glad you mentioned time in your example, by the way, because it's exactly what I was thinking about. In the past I interpreted the different data points around time one way, but then, because I actually talked to my learners (I know, shocking, I talked to people, what a weird concept), I asked them, help me understand what you actually mean, because something's not clicking. In my head, this problem set was only going to take people 30 minutes on average, and some of them told me it took five hours. And I'm like, why is it taking you five hours? I did something horribly wrong if it is. I kept staring at the reports (we use Qualtrics in my work to collect this data), looking at them and thinking, something's wrong, what is wrong? Finally, from talking to the learners, I learned that I had not clearly specified how many words people should be writing for this assignment. Because of the way it was laid out, the answer box was a kind of small frame. Naturally, when I go to answer something in a small frame, I think, oh, it's a paragraph, because it's not meant to be a paper; it's in this tiny little frame. Instead, we had people who were writing books, going into Word and copying and pasting pages upon pages, and I was like, oh my gosh, no. So after I learned that, I changed it going forward. That was just my example of not knowing what to do with the data, and it took that extra step to figure it out. So time is clearly a huge factor. Any other topics, though? I know there's quite a bit.
Any other topics from that stance that we should also be aware of?

 

Smruti Sudarshan  39:43

Hmm, I think now, since learning has gone online, devices play a very major role. If, for example, you're delivering content online, it has to be rendered on all three devices: desktop, mobile, and tablet. That has played a major role, because in both education and corporate learning, the one thing I've seen in common is that people use mobile a lot more now, especially once they've moved online. I've seen it with kids as well, from grade 7 to grade 12, the high schoolers: they kind of have an addiction to the tablet. Even for an online class, they take it through a tablet; no one's actually sitting at a desktop viewing your course. So it has to be rendered on all three. Whether people are from the older generations, like the baby boomers, or from Gen Z, it has to be rendered on all three as of now; otherwise it would be a little difficult. For us, about two years before COVID, mobile was one huge channel and desktop was another, because people would come into the office and usually schedule their training during office hours, with nothing to take back home. But they also took their phones with them. For example, if an employee is commuting on the metro from one place to another, and the ride is 45 minutes or an hour long, they would just complete their online course there rather than sitting in the office and completing it. Or over a cup of coffee, they would browse through it. So it has to be rendered on all of them.
The statistics, from what I can say: after COVID, desktop usage has gone down to 48%, tablet I think was at 78% when I last saw it, and mobile has gone up to 96%.

 

Luke Hobson  42:01

Yeah, that makes a lot of sense. I'm trying to think about the last time I used my iPad. Nope, definitely the phone for everything. And I specifically got the largest screen version. When I first got it, everyone made fun of me, like, you have a brick as a phone. But I can see I don't need my laptop anymore; this can literally do everything. I'm really glad you mentioned that, too, because I know that some folks have had to write explanations to let learners know how to use a platform if it's not able to adapt. For instance, I know that a discussion board (lovely discussion boards) sometimes doesn't really work on a mobile device; it has to be desktop. And other times, the discussion board goes on literally forever, down the entire screen, with no cut-off point. So that is an excellent recommendation, because it's so true. And from the user's point of view (now we're going down the UX rabbit hole), if you're not testing this out on all the different devices, you won't know until people actually tell you, and you're like, oh, I didn't know. Same thing for different browsers. I've been an Apple person forever; I can't tell you the last time I touched Internet Explorer, or even Firefox, which I know I can download, but I don't use it. So I have no idea what happens on those until I test products.

 

Smruti Sudarshan  43:30

Yep, that's actually correct, like the Apple and Android split you just mentioned. Apple uses the Safari browser, and Chrome or Explorer and all of that are usually used on Windows. So we do get a lot of mismatch there. Even if you're using the latest version of xAPI, since Apple has a lot of security controls, there's that cookies issue. We didn't know, in one of our cases, that it would create this cookies issue for us. We just implemented xAPI; we were so excited, like, yes, xAPI, let's do it, let's do it. But we never kept in mind that Safari has this cookies issue and you're supposed to enable cookies. That became a challenge for us, because none of the learners took the course: most of them were Apple users, Safari users. We had invested something like $1,000 to $1,500 in it, and none of them used it, and we were left wondering why. Then one student was really brave, and he said, you know what, we cannot use it because we're getting this error here. It says you have to enable cookies. I don't know what cookies are; please explain what cookies are. Then we had to go back to the LMS and say, this is the document for enabling cookies; please ask your parents to enable that for you. So that was an issue.

 

Luke Hobson  45:05

Yeah, that would definitely, or at least slightly, alter the learning experience if you can't do anything. Minor, minor details there; it's fine. So, moving on from learning analytics, a final thing I have, and you touched on a little of this, but I'd love to go further, because now we're really going into instructional design land. In the article, you mentioned how EDM helps address certain questions related to instructional design practices and strategies, really in order to make everything more learner-centric. You've talked a little about how EDM informs design, but can you share any more examples of how instructional design and EDM should be playing nice and coming together?

 

Smruti Sudarshan  45:49

Oh, yeah. So, one thing with EDM and instructional design is that when you say you are an EDM user and an instructional designer, it means you're using this entire data-driven learning strategy to make your learning more quantifiable and more measurable in terms of data. The entire strategy depends on what the data says. For example, say one of the courses has a learning objectives page, and the stakeholder comes to you and says, see, we need a learning objectives page here. But you go back to the stakeholder and say, no one's actually spending time on this learning objectives page; you're wasting a lot of time on a page that's not of any help to the learner, or to whoever is seeing it. What we do is go back with that data. So your teaching or learning strategy is not just backed by what you think intuitively; it's backed by the data that's in place. One case study, or one example I can give you off the top of my head: we had a series of videos for one of the courses, and it was supposed to be the best course in the entire LMS, because we had invested a lot into the video-making part of it. We had not invested much in the reading or writing part, or anything with a clickable element; it was just a series of 10-minute videos. The entire course, I think, had six videos, so it lasted for about 60 minutes. It was one continuous stretch of videos, nothing in between; only at the end would the learner get one quiz, just to check how their understanding was going throughout the entire course.
What happened through that entire one-hour stretch was that after two videos, the learner paused, and then there was a drop-off. Basically, in our logs we would see 20 minutes of viewing first, then a pause of about 30 minutes, and then a drop-off after that. The learner would log in again, say, after two or three days, continue from the 20-minute mark, and then drop off again. Initially, that first 20 minutes was engaging, because the content was engaging; it was just introductory content. And obviously, if you're learning something like AI or data science, or 3D printing, as you said, it's very interesting when someone introduces a concept to you, rather than you implementing it on your own. So since the first 20 minutes was very engaging, a learner would sit through that 20-minute stretch and then drop off. This was a repeating pattern we saw. We analyzed this pattern (we would divide learners into batches) and asked: why is every batch dropping off at exactly the 20-minute mark? And when we actually sat and analyzed it, and, as you said, went back to our learners and asked, why are you dropping off at this point? Is there any issue with the course that you're facing? What had happened was that the first 20 minutes was just theory, and then came the coding part.
And for the coding part, you just cannot watch a video for 10 minutes on coding unless you implement something in a sandbox environment. That is where we figured out: okay, this is wrong, this is completely wrong. You cannot just sit through a video for 20 minutes and then be given coding work. So what we did: we took that 20-minute video at the beginning and divided it, I think it was seven and seven. The first seven minutes were introductory, and then we gave them a sandbox environment in our LMS that said, now that you've learned this, why don't you solve this? We would give them coding questions there, then go on to the next topic, and then give them coding questions again. Basically, from 60 minutes of video, we shortened it to, I think, three seven-minute segments, so around 21, 24 minutes or so; basically, it was just half an hour, and the rest of the time was for their coding. How much time they take for coding is up to the learners. And in case they wanted to reach out to the instructor, or whoever they needed help from, they had an email ID in there. That was the most creative we could get with what was available. We have chatbots and stuff like that now, but at that time this was pretty creative: we just gave them an email ID, and when you clicked on it, an email would pop up. That was the best we could do at the time. And what happened was that not only did the drop-off rates come down, the retention rates even increased, because learners were showing their performance in the sandbox, the coding environment that was in there. So I think this is the kind of thing the data will usually drive you towards.
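The drop-off pattern described above (most learners stopping near the 20-minute mark) is straightforward to surface once you have per-learner watch-time logs. A minimal sketch, assuming a hypothetical log where each row is simply a learner ID and the minute at which their session ended:

```python
from collections import Counter

# Hypothetical watch logs: (learner_id, minute at which the session ended).
logs = [
    ("L1", 20), ("L2", 21), ("L3", 20), ("L4", 19),
    ("L5", 60), ("L6", 20), ("L7", 22), ("L8", 20),
]

# Bucket stop times into 5-minute bins and find the most common drop-off bin.
bins = Counter((stop // 5) * 5 for _, stop in logs)
drop_bin, count = bins.most_common(1)[0]

share = count / len(logs)
print(f"Most learners stop in the {drop_bin}-{drop_bin + 5} minute window "
      f"({share:.0%} of sessions)")
```

On this toy data, the 20-25 minute bin dominates, which is exactly the kind of signal that prompted the team to split the long video and interleave sandbox exercises.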
There were also a few other things we saw with corporate learners, who would sit for five days straight, eight hours of training every day, and just learn, say, AI. Day one would have no coding, nothing; they would just sit and learn. They would be given textbooks: this is your textbook, this is how you've got to learn, go home and implement it. If they implement it, they implement it; if not, they don't. What we also learned through this research (and okay, you may contradict me on this, anyone could contradict me on this, because we had a tough time convincing the stakeholders as well) is that higher effort did not mean better learning for the corporate learners. What happened was that while they were taking exams, assignments, and certifications, their effort was measured by how they were sitting through the course. So someone who is really active in class, asking hey, what is this, what is that, would be expected to perform well. But that same person asking so many questions would not do well at the end of the course. Similarly, with our online courses, when we looked at the effort learners put into quizzes one through eight, the performances were good up until quiz two. But from quiz two onwards, the performances started declining, because we had given them a DIY project sort of thing. And at the eighth quiz, we saw just two or three people who would clear the entire course.
That was the kind of effort they put in at the end of it. From quizzes one through seven, they didn't perform that well; it was just in the eighth part that they performed really well. So that's where we came up with this theory that higher effort doesn't mean better learning, sort of thing.

 

Luke Hobson  53:39

That's really tough to wrap my head around. Because now, of course, as an instructor, I'm like, well, what could I have done? Could I provide more support? What potential additional steps could we take? But at the same time, I understand there's only so much time in the day; the results are expected basically yesterday in literally any of these types of trainings. There are always these different factors, and I'm sure you've already thought about what you can and cannot do within the budget, the scope, the timeline, you name it. There are always some parameters you're stuck within. So yeah, that's actually really interesting. With that, the last question I want to ask you, which actually segues very well, is about when you lack the ability to do all of this. Let's say your ability is lacking as far as collecting the data, or processing it, or whatever it is. Is there another way to obtain and use this information when you're facing so many different barriers?

 

Smruti Sudarshan  54:42

Yeah, I think that's the challenge with this: you seem to require everything, all the time. But there are a few things you can do offline as well. If you do not have a proper infrastructure, or you do not have an LMS in place, but you have normal Office 365, Google Sheets, or Excel, and say your numbers are small, like 10 learners, that should be it. You just need Excel. And if you have Google Forms, that's good, because Google Forms does give you data. Say you ask, on what scale do you like it, or from what time to what time are you sitting through my course, and you give them options; Google Forms does generate data. So in case you do not have an LMS, those two are my go-to tools when I don't have anything online. The other thing I use is Mentimeter. It's an online tool, usually used for engaging students online, but I use it to collect data as well. Say, for design thinking, I need to collect data on how exactly the students are empathizing with a customer: if they're given some kind of empathy interviews, and you're asking, how exactly do you empathize with them? That is where Mentimeter comes into the picture. You can send out those links online and collect the data online itself. So these are a few tools you could use in case you do not have a proper LMS or an LRS in place, until one has been set up. I think Excel is the best: offline, Excel for any data.
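Google Forms exports responses as a CSV (or a Google Sheet), so even the no-LMS setup Smruti describes can be analyzed with a few lines of standard-library Python. A sketch, using a simulated export with hypothetical "Completion (%)" and "Rating (1-5)" columns; in practice you would open the downloaded file instead of the in-memory string:

```python
import csv
import io
from statistics import mean

# Simulated Google Forms CSV export (column names are illustrative).
raw = """Timestamp,Completion (%),Rating (1-5)
2021-05-01,33,2
2021-05-01,100,5
2021-05-02,33,3
2021-05-02,80,4
"""

rows = list(csv.DictReader(io.StringIO(raw)))
completion = [int(r["Completion (%)"]) for r in rows]
rating = [int(r["Rating (1-5)"]) for r in rows]

print(f"avg completion: {mean(completion):.0f}%")
print(f"avg rating:     {mean(rating):.1f}/5")
low = sum(1 for c in completion if c < 50)
print(f"{low} of {len(rows)} learners stopped before the halfway mark")
```

The same three numbers (average completion, average rating, count of early stoppers) are what a small team would otherwise compute by hand in Excel or Google Sheets.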

 

Luke Hobson  56:40

I agree with you 100%. I use Excel for everything: Google Sheets, you name it, those are my jam. That's what I use; I understand completely. There was an entire data team I used to work with at the university, and when they generated a report, they gave it to you in Excel, and then you could pick and choose what you wanted to do, exactly what your strategy for outreach would be, and all those different things of that nature. So I totally get it. Well, thank you so much for coming on the podcast and enlightening us about a number of things with data. I certainly learned a thing or two this evening, so it's been absolutely awesome to have you on. For everyone who wants to follow along with you, where can they go to connect with you and learn more about what you do?

 

Smruti Sudarshan  57:23

I think the best possible thing would be LinkedIn, as of now. So you can connect with me on LinkedIn. Also, my blogs are there on my website; maybe you can leave a comment there, too. I think that should work.

 

Luke Hobson  57:38

Wonderful. I will put all the links down in the show notes, as always. But thank you so much for coming on the podcast once again; really appreciate it. Thank you for coming on the show. Once again, folks, be sure to connect with her on LinkedIn and read her blog posts. They're beautiful looking, by the way; she did a fantastic job with the aesthetics, and everything is very easy to follow and to read. Absolutely check out those things. Both of the links are down below in the show notes, to connect with her on LinkedIn and to read all of her work. If you enjoyed today's episode, share this podcast on LinkedIn, Facebook, and Twitter. If you haven't already, connect with me on LinkedIn or Facebook, and check out my YouTube channel. The channel is almost at 1,000 subscribers. I'm still in complete disbelief and shock seeing that number, so help me hit that milestone and subscribe if you haven't yet. If you're looking for a group of learning nerds to talk about all things instructional design, check out the link to our Facebook group, the Instructional Design Institute Community. As always, folks, your five-star reviews on Apple Podcasts, Spotify, Stitcher, or wherever else you're listening to this podcast right now are so, so appreciated. Thank you for taking the time to write those; I read every single one. But hey folks, that is all I have for you today. Stay nerdy out there, and I'll talk to you next time.
