Eye-tracking is cool, but the products we’ve used in the past have been too inaccurate or slow for practical use in games and applications. The Eye Tribe is a company which specialises in the technology, and a demo we received at Mobile World Congress this week proves their expertise.
Our experience started with a game most smartphone owners will be familiar with: Fruit Ninja. It’s a game which requires co-ordination and fast reactions, but I consider myself to have only the latter (as my leaderboard score reflects!).
On a standard touchscreen I’ll see the fruit, but my fingers won’t swipe quickly enough to slice and dice them. This isn’t a problem with The Eye Tribe’s technology: my FPS-honed eyes shot from one fruit to the next to create a virtual cocktail across the screen.
This demonstrates three key things: 1) the speed of the technology, 2) its accuracy, and 3) the immersion it offers.
My first question was whether The Eye Tribe planned to implement motion detection, but as I was led to the next demo station it became obvious this was already the case.
A collaboration between the company and Lego has resulted in an application which builds a model from the iconic bricks on-screen, whilst moving side-to-side rotates the view around the piece. This is available to try in their store in Copenhagen.
Here’s a video of the demonstration we were given:
An SDK is available for you to implement eye-tracking capabilities into your games or applications. For games, it works best as an additional input alongside keyboard and mouse, tracking where the character should be looking or aiming. For applications, an innovative use of eye-tracking is analytics: seeing where a user looks first and most.
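To make the gaze-plus-mouse idea concrete, here’s a minimal sketch (not The Eye Tribe SDK’s actual API; all names and the snap radius are our own assumptions) of how a game might combine the two inputs: the eyes handle big, fast jumps of the crosshair, while the mouse handles fine correction.

```java
// Hypothetical sketch: coarse aiming from gaze, fine correction from the mouse.
public class GazeAim {
    // Gaze is fast but slightly noisy; the mouse is precise but slow to travel.
    // So we jump the crosshair to the gaze point only on large moves, then let
    // mouse deltas refine it. SNAP_RADIUS is an assumed tuning value in pixels.
    private static final double SNAP_RADIUS = 80.0;

    private double aimX, aimY;

    public GazeAim(double startX, double startY) {
        aimX = startX;
        aimY = startY;
    }

    // Call for each gaze sample the tracker delivers (screen coordinates).
    public void onGazeSample(double gx, double gy) {
        // Only snap when the eyes have clearly moved to a new target;
        // small gaze jitter near the crosshair is ignored.
        if (Math.hypot(gx - aimX, gy - aimY) > SNAP_RADIUS) {
            aimX = gx;
            aimY = gy;
        }
    }

    // Call for each mouse movement event (relative deltas).
    public void onMouseDelta(double dx, double dy) {
        aimX += dx;
        aimY += dy;
    }

    public double x() { return aimX; }
    public double y() { return aimY; }
}
```

In a real integration the gaze samples would come from the tracker’s listener callback, and the snap radius would be tuned against the tracker’s measured accuracy.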
When eye-tracking is used to evaluate human attention without controlling anything, this is called ‘passive’ tracking. When it provides actual control, this is called ‘active’ tracking.
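A simple illustration of passive tracking for analytics (our own sketch, not part of any SDK) is to bin incoming gaze samples into a coarse grid, giving a rough heatmap of which screen regions attract the most attention:

```java
// Hypothetical sketch of 'passive' tracking: accumulate gaze samples into a
// coarse grid so you can see which screen regions the user looks at most.
public class GazeHeatmap {
    private final int cols, rows;
    private final double cellW, cellH;
    private final int[][] counts;

    public GazeHeatmap(double screenW, double screenH, int cols, int rows) {
        this.cols = cols;
        this.rows = rows;
        this.cellW = screenW / cols;
        this.cellH = screenH / rows;
        this.counts = new int[rows][cols];
    }

    // Call once per gaze sample delivered by the tracker (screen coordinates).
    public void record(double x, double y) {
        // Clamp to the grid so off-screen samples don't go out of bounds.
        int c = (int) Math.min(cols - 1, Math.max(0, x / cellW));
        int r = (int) Math.min(rows - 1, Math.max(0, y / cellH));
        counts[r][c]++;
    }

    public int countAt(int col, int row) { return counts[row][col]; }
}
```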
On its website, The Eye Tribe lists other active tracking examples:
- A web browser or PDF reader that scrolls automatically as the user reads the bottom part of the page.
- A maps application that pans when the user looks at the edges of the map. The map also zooms in and out where the user is looking.
- A user interface on which icons can be activated by looking at them.
- When multiple windows are opened, the window the user is looking at keeps the focus.
- A first person shooter game where the user aims with the eyes and shoots with the mouse button.
- An adventure game where characters react to the player looking at them. For instance, if the player looks at a given character, this character will start talking to the player.
- An on-screen keyboard designed to enable people with severe motor disabilities to write text, send emails, participate in online chats, etc.
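The look-to-activate examples above share one problem: the eyes glance at everything, so naive activation would trigger elements constantly (the so-called “Midas touch” problem). A common answer is dwell time, sketched below with names and the threshold entirely our own assumptions:

```java
// Hypothetical sketch of dwell-based activation: an element fires only after
// the gaze has rested on it continuously for a set time, so ordinary glances
// don't trigger it.
public class DwellButton {
    private final long dwellMillis;   // assumed threshold, e.g. 500 ms
    private long gazeEnteredAt = -1;  // -1 means the gaze is elsewhere
    private boolean fired = false;    // fire at most once per dwell

    public DwellButton(long dwellMillis) {
        this.dwellMillis = dwellMillis;
    }

    // Feed each gaze sample with its timestamp and whether it hit this element.
    // Returns true exactly once per dwell, when the threshold is crossed.
    public boolean update(long timestampMillis, boolean gazeOnElement) {
        if (!gazeOnElement) {
            gazeEnteredAt = -1;   // gaze left: reset the dwell timer
            fired = false;
            return false;
        }
        if (gazeEnteredAt < 0) {
            gazeEnteredAt = timestampMillis;
        }
        if (!fired && timestampMillis - gazeEnteredAt >= dwellMillis) {
            fired = true;
            return true;
        }
        return false;
    }
}
```

An on-screen keyboard like the one in the last example would give each key its own dwell timer, often with a visual fill animation so the user can abort by looking away.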
Language tutorials for the SDK are available for C#, C++, and Java. You can get more information about how to get started, where to download the SDK, and everything else you might need on The Eye Tribe’s developer website.
Would you implement eye-tracking into your apps and games? Let us know in the comments.