InvisibleOS is an exploration of hand and gesture tracking as the sole means of controlling a simple user interface. The project processes hand-tracking data from MediaPipe’s hand tracking system in JavaScript and p5.js, and pairs each interaction with satisfying haptic-style audio. The user interfaces with the system by walking up to the canvas and raising their hands into the calibration squares. There is no visible computer and the screen is a plain canvas, creating a contrast between futurism and traditional artistic convention.
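As one illustration of how gesture input from MediaPipe can drive an interface like this, here is a minimal, hypothetical sketch of detecting a "pinch" gesture from hand landmarks. MediaPipe Hands returns 21 normalized (x, y, z) landmarks per hand, where index 4 is the thumb tip and index 8 is the index-finger tip; the helper names and the distance threshold below are assumptions for illustration, not taken from the project's code.

```javascript
// Euclidean distance between two normalized landmarks (z ignored for simplicity).
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// landmarks: array of 21 {x, y, z} points in normalized [0, 1] coordinates,
// as produced by MediaPipe Hands. The threshold is an illustrative guess.
function isPinching(landmarks, threshold = 0.05) {
  const thumbTip = landmarks[4]; // MediaPipe landmark 4: thumb tip
  const indexTip = landmarks[8]; // MediaPipe landmark 8: index-finger tip
  return distance(thumbTip, indexTip) < threshold;
}

// Example with mock landmark data (only the two fingertips matter here):
const mock = Array.from({ length: 21 }, () => ({ x: 0.5, y: 0.5, z: 0 }));
mock[4] = { x: 0.5, y: 0.5, z: 0 };
mock[8] = { x: 0.51, y: 0.5, z: 0 };
console.log(isPinching(mock)); // → true (fingertips nearly touching)
```

In a p5.js sketch, a check like this could run inside `draw()` on each frame of tracking results to trigger button presses or unlock steps.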
After the user calibrates, they are taken through a tutorial that explains the basic actions and gestures needed to unlock the system. Once the tutorial is completed, they engage in a randomized unlock sequence featuring gestures, buttons, and puzzles, and the system cycles through a few of these sequences. After the final one, the user is met with a screen explaining that the system has encountered an error and all their data has been deleted. They are given the option to talk with an AI support system, but whether they select “yes” or “no,” the program restarts and they must re-calibrate and try again.
Try it out & view the code here.
InvisibleOS
Blank canvas (12”x24”), short-throw projector, webcam, MediaPipe, JavaScript, p5.js
2025