The IBC Hackfest is a hackathon in which over 100 talented developers, designers, and entrepreneurs have 36 hours to build out an innovative new concept that leverages partner technology. This year, Kaltura Front End R&D Group Leader Itay Kinnrot and his team took first place with their hack, Skipaclass.
It all started on a sunny Saturday morning in Amsterdam. I debated with myself over whether I should spend the day exploring more companies at the IBC show or check out the hackathon event. I finally decided to check out who was participating and see if there were any interesting ideas.
Most of the participants, it turned out, had come to the Hackfest well prepared with ideas of their own. After speaking with a number of them, I met and connected with Martijn Snelder (Creative Entrepreneur), Madli Uutma (Data Analyst), and Taavi Kivisik (Data Analyst). We started brainstorming to come up with what we hoped would be a winning idea.
At first, we considered taking on the sports challenge. (You can see the soccer ball in red on the brainstorm map!)
But after a few more rounds of brainstorming, we decided to focus on the education track.
We wanted to find a way to increase the number of students who complete online courses. We thought that if we could gather more data on how the average student reacts to specific parts of a video, we could make the video more effective. If students had more transparency into the “hard” sections of the video, it might lower their frustration. And if staff could see where students were getting stuck, they could create more effective videos in the future.
Google, which was sponsoring the event, gave us access to the Vision API, which enables mood detection from a human face, so we used that as our base. The algorithm is pretty simple: we capture the student’s face every 2 seconds and analyze their mood using the Vision API. Every time we detect that the mood has changed, that change is mapped to the associated point in the video. We check for both positive and negative mood shifts. In addition, if a student jumps back to an earlier position in the video, we assume that section may have been difficult to understand, and that data is noted as well. We collected the data in a Firebase Realtime Database and presented it in real time below the video, tied to the timestamp of each event.
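The core event logic can be sketched in a few lines. This is a rough illustration, not our actual hack code: the Vision API call and Firebase write are abstracted away, and the function names are our own for this example.

```python
def mood_change_events(samples):
    """Given (video_position_seconds, mood) samples taken every ~2 seconds
    (mood would come from the Vision API's face detection in practice),
    return the video positions where the detected mood changed."""
    events = []
    last_mood = None
    for position, mood in samples:
        # Record both positive and negative shifts, tied to the video position.
        if last_mood is not None and mood != last_mood:
            events.append({"position": position, "mood": mood})
        last_mood = mood
    return events

def rewind_events(positions):
    """Flag moments where the student jumped backwards in the video,
    which we treat as a sign that a section was hard to understand."""
    events = []
    for prev, curr in zip(positions, positions[1:]):
        if curr < prev:
            events.append({"from": prev, "to": curr})
    return events

# Example: the student looks confused at 4s, then happy at 8s,
# and at one point rewinds from 4s back to 2s.
samples = [(0, "neutral"), (2, "neutral"), (4, "confused"),
           (6, "confused"), (8, "happy")]
print(mood_change_events(samples))
print(rewind_events([0, 2, 4, 2, 6]))
```

In the real hack, each emitted event would be pushed to Firebase as it happens, so the chart under the video updates live.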
This data lets students see which parts of the video are difficult to understand and may need more concentration, and which are fun and easy to watch. The lecturer can also see how students react to the lecture and improve it for next time.
We believe this analysis will help students (skip an in-person class and) learn a lot more, and stay more engaged, with online courses.