Improving Video Editing With Eye Tracking
For a while now I was fascinated by the idea of commercial eye tracking and being able to use that data in game development.
This year when I was at GDC, I met with the guys from Tobii (who were super cool by the way), and I managed to get my hands on one of their “Eye Trackers”. (This is -NOT- a sponsored post by the way, I just think this thing is awesome)
Like any other developer, I was skeptical of its accuracy. After trying one myself, though, I was fascinated by how well it works, that and how weird it is to have somebody next to you know exactly where you are looking on a monitor.
So once I got the tracker itself, the first thing I did was go to a quiet corner of the convention center, unbox it and plug it into my laptop. I quickly installed the drivers and tried to figure out a way to hide the preview bubble while still recording the eye data. If you use the tracker as intended, you have the option to enable a preview bubble that shows you exactly where you are looking on the screen, and it looks like this:
The problem with that is that if you want to record somebody's eye tracking data, that bubble messes with their attention and influences where they are looking. So I figured out a way to record the eye data with OBS, using one of its “overlays”.
Once I figured that out, I started thinking about how I could use this on my already-released game “Move or Die”. I figured I should focus on what most indie developers forget to focus on, which is MARKETING! More specifically, the Move or Die trailer.
For those who haven’t seen it yet, it looks like this: (I recommend you watch it before you keep reading, and try to pay attention to what your eyes are focusing on)
Because we are such a small team, I had to record and edit that whole trailer myself. So it’s very important for me to understand how to better control the viewer’s attention through good timing, framing and overall better video editing.
So while at GDC, surrounded by other game developers weird enough to be willing to help me with my odd experiment, I figured I should gather as many people as I could and ask them to watch the Move or Die trailer while I recorded their eye tracking data.
I did this with around 35 willing candidates (I promise I didn’t force them), and recorded their eye tracking data individually. The end result was a white dot on a black background, something like this:
After GDC, once I got home to my PC, I compiled all the recorded data into a sort of heat-map and overlaid it on the actual trailer, using the audio as a sync reference.
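The compilation step can be sketched roughly like this: pool everyone's gaze positions for each moment of the video, accumulate them into a 2-D histogram the size of the frame, and blur it so overlapping fixations blend into smooth hot spots. Here is a minimal Python sketch of that idea (the function name, the `(x, y)` sample format and the Gaussian-blur parameters are my own illustrative assumptions, not the exact tooling I used):

```python
import numpy as np

def gaze_heatmap(gaze_points, width, height, sigma=20):
    """Pool (x, y) gaze samples from many viewers into one heat-map.

    gaze_points: iterable of (x, y) pixel coordinates, already pooled
                 across all viewers for the frame(s) of interest.
    Returns a (height, width) float array normalized to [0, 1].
    """
    heat = np.zeros((height, width), dtype=np.float64)
    for x, y in gaze_points:
        if 0 <= x < width and 0 <= y < height:
            heat[int(y), int(x)] += 1.0

    # Blur with a separable Gaussian kernel so individual dots
    # merge into smooth "hot" regions instead of isolated pixels.
    radius = 3 * sigma
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t**2 / (2.0 * sigma**2))
    kernel /= kernel.sum()
    for axis in (0, 1):
        heat = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="same"), axis, heat)

    # Normalize so the hottest spot is 1.0 (easy to map to a color ramp).
    if heat.max() > 0:
        heat /= heat.max()
    return heat
```

A map like this, rendered per frame with a color ramp and composited over the trailer at partial opacity, gives the overlay effect shown below; two viewers looking at nearly the same spot produce a hotter region than one viewer looking somewhere alone.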
In the end, with everything put together, I ended up with this:
And this is some super awesome data to have! In this case, the result was pretty predictable since the editing style of the trailer was a very fast one with most of the action happening in the center of the frame.
This means the viewer didn’t have to move their eyes much to focus on new elements, because most scenes only lasted 2-3 seconds, and that’s not enough time to introduce new elements and allow the viewer to re-focus on a new part of the frame while also understanding what is happening on the screen.
You can read more about this style of editing in this article about Mad Max Fury Road.
It’s also interesting to see how the viewer’s eyes naturally focused on the players’ faces when they were on screen, and how you can see the viewer read a piece of text from left to right once it appears on screen.
All in all, this was a fun experiment, and I definitely intend to use this again for future videos, in a more iterative way, as I develop a better editing style.
I’ll also follow up, hopefully sometime next month, with an in-depth article about the techniques used in editing the Move or Die trailer, along with a short making-of video.
Keep being awesome!
~ Nick (Xelu)