The inset design of the screen and the large protective bezel offer a unique opportunity to use the bezel as a secondary touch input for the UI. It provides an unobstructed way to perform swiping and tapping gestures to scroll, select, dismiss, and more.
Using Wekinator, a no-code machine learning classifier, I was able to quickly recognize different gestures, which are in turn used to navigate through an interactive Figma prototype.
An Arduino reads data from four capacitive strips, picking up changes in touch and movement across the surface.
Wekinator is trained to classify various gestures based on unique sensor input patterns.
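As a rough sketch of the bridge between these two steps, a Processing sketch can read each line of sensor values from the Arduino over serial and forward them to Wekinator's default OSC input (port 6448, address /wek/inputs). The port name "COM3" and the comma-separated message format are assumptions for illustration, not details from the actual build:

```java
// Processing sketch: forwards the four capacitive readings from the
// Arduino to Wekinator's default OSC input.
import processing.serial.*;
import oscP5.*;
import netP5.*;

Serial arduino;
OscP5 osc;
NetAddress wekinator;

void setup() {
  // "COM3" is a placeholder; substitute the Arduino's actual port.
  arduino = new Serial(this, "COM3", 9600);
  arduino.bufferUntil('\n');
  osc = new OscP5(this, 9000);                    // local listening port, unused here
  wekinator = new NetAddress("127.0.0.1", 6448);  // Wekinator's default input port
}

void draw() { }

void serialEvent(Serial s) {
  // Assumes the Arduino prints one reading per line as "v1,v2,v3,v4".
  String line = s.readStringUntil('\n');
  if (line == null) return;
  String[] parts = split(trim(line), ',');
  if (parts.length != 4) return;
  OscMessage msg = new OscMessage("/wek/inputs"); // Wekinator's default input address
  for (String p : parts) msg.add(float(p));
  osc.send(msg, wekinator);
}
```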
A Processing script listens for the Wekinator classifications and emulates a unique keyboard press for each gesture.
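A minimal sketch of that listener, assuming Wekinator's default output (port 12000, address /wek/outputs) and an illustrative mapping of four gesture classes to the arrow keys; the real key assignments would match whatever the Figma prototype expects:

```java
// Processing sketch: listens for Wekinator's classifier output and
// emulates a keypress for each gesture class.
import oscP5.*;
import java.awt.Robot;
import java.awt.AWTException;
import java.awt.event.KeyEvent;

OscP5 osc;
Robot robot;

// Illustrative mapping: gesture classes 1-4 to the arrow keys.
int[] gestureKeys = { KeyEvent.VK_UP, KeyEvent.VK_DOWN,
                      KeyEvent.VK_LEFT, KeyEvent.VK_RIGHT };

void setup() {
  osc = new OscP5(this, 12000);   // Wekinator's default output port
  try {
    robot = new Robot();
  } catch (AWTException e) {
    exit();
  }
}

void draw() { }

void oscEvent(OscMessage msg) {
  if (!msg.checkAddrPattern("/wek/outputs")) return;
  // A Wekinator classifier sends the class label as a single float.
  int gesture = int(msg.get(0).floatValue());
  if (gesture >= 1 && gesture <= gestureKeys.length) {
    int k = gestureKeys[gesture - 1];
    robot.keyPress(k);
    robot.keyRelease(k);
  }
}
```

java.awt.Robot works here because Processing runs on the JVM, so system-level key events can be injected without any extra libraries.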
A Figma prototype can be built in the normal fashion, with artboards linked by the keypresses assigned in Processing.