Adaptive technologies are becoming responsive to their users and sharing what they learn across networked ecosystems, with significant implications for the classroom and beyond
An underlying principle of successful UX design in gaming is accommodating players of all abilities. If a level is too hard, players get frustrated and walk away; if it’s too easy, their minds begin to wander. Ideally there is a sweet spot of engagement, which early video games accounted for by letting players choose their own difficulty level. Fast forward to today, and designers are already crafting video games that use biometric data to adapt content to a player’s emotions in real time, creating a more responsive experience.
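The difficulty "sweet spot" described above can be sketched as a simple control loop. This is a minimal, hypothetical illustration, assuming a normalized stress reading from a biometric sensor; the function name, thresholds, and step size are illustrative, not from any real game engine.

```python
# Illustrative sketch: dynamic difficulty adjustment driven by a
# biometric engagement proxy. All names and thresholds are hypothetical.

def adjust_difficulty(current_difficulty, stress_level,
                      low=0.3, high=0.7, step=0.1):
    """Nudge difficulty toward the 'sweet spot' of engagement.

    stress_level: a normalized 0..1 reading from a biometric sensor
    (e.g., derived from heart rate). Too low suggests boredom; too
    high suggests frustration.
    """
    if stress_level < low:           # player seems bored: raise challenge
        return min(1.0, current_difficulty + step)
    if stress_level > high:          # player seems frustrated: ease off
        return max(0.0, current_difficulty - step)
    return current_difficulty        # in the sweet spot: leave it alone
```

A real system would smooth the sensor signal over time rather than react to each raw reading, but the feedback loop is the same idea.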
The idea your technology will better react to you is not altogether new, but what’s different today is the variety of inputs developers can now consider. Within the Internet of Things (IoT), sensors are coming online and collecting data that can be used for automatically improving efficiencies and tailoring performance over time. As these systems become more advanced, they are also working to share that data with other machines, creating a network of intelligence that is, in theory, continually building upon itself.
We chatted with Dr. Sinem Aslan, principal investigator at Intel, to get her thoughts on how tech that can automatically reprogram itself will play a larger role in our lives going forward. Aslan was particularly interested in the application of these technologies in a classroom setting, and how adaptive learning platforms can help students stay engaged.
Are we going to be seeing more of these technologies which adapt and learn from human behavior from a consumer standpoint?
Under the standardization of the industrial age, technology treated everyone the same, with no surprises. With information-age technologies, however, everything is becoming more and more personalized. Now, technologies are becoming more aware of us – our likes, dislikes and habits – and can offer personalized responses in our interactions with them.
First of all, we need such technologies to optimize the user experience. These adaptive systems learn normal usage patterns and can then detect variations in input. We are using more and more of these smart systems in our daily lives, and now we’re seeing them playing out in schools – which is very promising!
Can technology connect to the knowledge of a larger ecosystem and interject at key moments to optimize efficiencies?
Well, let’s look at the educational system. What we have today is one teacher and many students in a classroom. This is a difficult situation for teachers: you cannot know each and every individual student or their needs, so how can you create a more personalized environment? People usually resort to 1:1 tutoring when they are struggling, because a tutor can adapt the learning experience to what a student already knows, what he or she wants to learn, or the difficulties he or she is facing.
As a research team, we wanted to see how we could implement the same type of 1:1 tutoring interaction in every classroom. We came up with the idea of an adaptive learning project. The goal is to enable machines to understand individual students, monitor their behaviors during the instructional experience, and then detect whether each student is engaged in the learning process.
We treat engagement as a major variable for this research since educational research indicates that engagement is highly correlated with performance. And since we want students to perform better, we said let’s focus on engagement to see if these machines – computers, tablets or whatever devices are being used – can understand students, and help provide a more optimized instructional experience for them. That was our starting point, and then as a first step in this research, we needed to define engagement.
What is engagement? We understand that engagement doesn’t have one single meaning. So, we captured a huge amount of data from various students including their gestures, gaze and body language, and now our team is working to come up with a generic engagement model that will be customized to individual students later on.
There will not be one definition of engagement for all students. The system will understand each individual student’s level of engagement by personalizing the engagement model. Then, you can customize the instructional experience. If engagement level is getting low, then you can have some room for improvisation. You can change the type of content. For instance, if a student is reading something, and the system detects the student is getting bored because his/her engagement level is low, it can then suggest another instructional format, like a video format or an interactive game.
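The format-switching intervention described above can be sketched in a few lines. This is an illustrative assumption, not Intel’s actual system: the catalog of alternatives, the 0-to-1 engagement score, and the threshold are all hypothetical.

```python
# Hypothetical sketch of engagement-driven content switching.
# Format names, scores, and threshold are illustrative assumptions.

ALTERNATIVE_FORMATS = {
    "text": ["video", "interactive_game"],
    "video": ["interactive_game", "text"],
    "interactive_game": ["video", "text"],
}

def suggest_format(current_format, engagement_score, threshold=0.4):
    """Return a suggested alternative format, or None to keep the current one.

    engagement_score: a 0..1 estimate from the personalized engagement
    model (built from signals like gestures, gaze, and body language).
    """
    if engagement_score >= threshold:
        return None  # student is engaged; no intervention needed
    alternatives = ALTERNATIVE_FORMATS.get(current_format, [])
    return alternatives[0] if alternatives else None
```

For example, a student reading text with a low engagement score would be offered a video; an engaged student would see no change.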
This is how it works in a 1:1 mentoring experience. The mentor knows you and will look at your behaviors, your body language, and understand when you are not engaged in a subject matter. Consequently, the mentor would provide personalized intervention strategies to get you back on task.
How could this play out from a technical standpoint?
To create an optimized learning environment, such an adaptive learning system should communicate with the teacher’s device and other students’ devices. Therefore, from a technical standpoint, all devices in the classroom would need to be connected to each other and communicate seamlessly.
Let’s say the student is stuck on one page for five minutes, or it looks like they started sleeping. The system can notify the teacher’s dashboard and say, “It looks like Ali’s engagement is very low, so you need to take some action.” That’s just a simple example. Or if a student is having a hard time solving a specific question, the system can suggest a peer in the class who has solved that question earlier for peer tutoring.
What are some of the main challenges to implementing these types of systems?
One big challenge is creating a differentiated model that can address everyone’s needs. There are so many people living in the world, and they each have their own learning styles, educational perspectives and learning habits.
There are huge cultural differences too. For instance, a smile may indicate something different from one culture to the next. Understanding those differences and creating optimized experiences for all is going to be the most important challenge.
A system may first be designed for a classroom setting, but educational research indicates that learning is not limited to a school’s four walls. Learning is everywhere, so a system must be able to understand engagement across different contexts. It shouldn’t just work for Turkish students; it should also understand how an American student behaves in the classroom and what’s expected of him/her. There shouldn’t be any barriers limiting such technologies from understanding learners, like language barriers or differences in body language across different nations.
In The Real World Web, iQ by Intel and PSFK Labs explore the role internet-enabled technologies will play in connected ecosystems of the future. This series, based on PSFK’s recent Real World Web report, looks at the rise of the internet of things and its impact on consumer lifestyles.