In VR it is normal for users to interact with their virtual environment through handheld controllers. However, these controllers can sometimes interfere with other equipment, such as physiological sensors, and some users may not be able to use their hands at all. For this project, “After Life”, Floor van der Voort (CS student) wanted to use eye tracking as command input. She decided to use machine learning techniques to recognize shapes drawn by eye movement as commands.
The application takes the user on an imaginary afterlife experience in which the “all-seeing eye” asks the user questions before allowing them to pass on. Each question can be answered by drawing one of four shapes with your eyes: a square, a diamond, a triangle pointing upwards, and a triangle pointing downwards.
To make the system recognize these shapes, image-recognition ML is used. The system is trained on a set of eye-tracked tracing images of the shapes. Once a user traces a shape, the system tries to recognize it and returns the most likely shape, which the virtual environment then uses to continue the rest of the story.
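The recognition pipeline can be sketched in simplified form. The snippet below is a minimal, hypothetical illustration, not the project's actual model: it rasterizes a normalized gaze trace into a small binary image and matches it against templates of the four shapes by pixel overlap. A real system would instead feed such rasterized images to a trained image classifier; all names, resolutions, and shape coordinates here are assumptions for demonstration.

```python
# Hypothetical sketch of eye-trace shape recognition (not the project's actual model):
# rasterize a gaze trace into a binary image, then pick the best-matching shape template.
from typing import List, Tuple

GRID = 16  # image resolution (assumed)

def rasterize(points: List[Tuple[float, float]], grid: int = GRID) -> List[List[int]]:
    """Map normalized gaze points (x, y in [0, 1]) onto a grid x grid binary image."""
    img = [[0] * grid for _ in range(grid)]
    for x, y in points:
        col = min(int(x * grid), grid - 1)
        row = min(int(y * grid), grid - 1)
        img[row][col] = 1
    return img

def overlap(a: List[List[int]], b: List[List[int]]) -> int:
    """Count pixels lit in both images."""
    return sum(pa & pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def polygon_trace(vertices: List[Tuple[float, float]], n: int = 60) -> List[Tuple[float, float]]:
    """Synthetic gaze trace following the edges of a closed polygon."""
    pts, k = [], len(vertices)
    for t in range(n):
        f = t / n * k
        i, frac = int(f), f - int(f)
        (x0, y0), (x1, y1) = vertices[i], vertices[(i + 1) % k]
        pts.append((x0 + (x1 - x0) * frac, y0 + (y1 - y0) * frac))
    return pts

# The four command shapes (image coordinates: y grows downwards).
SHAPES = {
    "square":        [(0.1, 0.1), (0.9, 0.1), (0.9, 0.9), (0.1, 0.9)],
    "diamond":       [(0.5, 0.05), (0.95, 0.5), (0.5, 0.95), (0.05, 0.5)],
    "triangle_up":   [(0.5, 0.1), (0.9, 0.9), (0.1, 0.9)],
    "triangle_down": [(0.1, 0.1), (0.9, 0.1), (0.5, 0.9)],
}
TEMPLATES = {name: rasterize(polygon_trace(v)) for name, v in SHAPES.items()}

def classify(points: List[Tuple[float, float]]) -> str:
    """Return the shape whose template best overlaps the rasterized trace."""
    img = rasterize(points)
    return max(TEMPLATES, key=lambda name: overlap(img, TEMPLATES[name]))
```

A gaze trace tracing a square outline, for example, is rasterized and matched most strongly by the square template, so `classify` returns `"square"`; in the actual application the returned label would drive the story's response to the user's answer.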
The Tech Labs created the virtual environment and, together with Floor, set up the ML environment.