Annisa Sofea Binti Abdul Rashid
Universiti Sultan Zainal Abidin
OptiCom is a system designed to enable non-verbal communication through eye movement, allowing
users to select message prompts that build into complete sentences. It addresses the challenge faced by
individuals with conditions like Locked-In Syndrome (LIS), aiming to restore their ability to express
thoughts. The project's objectives include studying machine learning technologies for detecting iris
position and blinks, developing a user-friendly communication board interface, and testing the
efficiency of a trained Multilayer Perceptron (MLP) for gaze-to-cursor mapping. Using Python, VS Code, and resources such as
GitHub, Discord, and Stack Overflow, the system will apply an MLP artificial neural network together
with machine learning algorithms for accurate eye tracking. The expected outcome is an easily
accessible, functional prototype, delivered as a web application, that helps verbally disabled
individuals construct coherent sentences and improves communication accessibility.
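To illustrate the gaze-to-cursor mapping stage described above, the following is a minimal sketch, not the project's implementation: it assumes calibration samples of normalized iris-landmark features paired with known on-screen points, and uses scikit-learn's MLPRegressor in place of the project's own network. The feature layout, network size, and error metric are illustrative assumptions only.

```python
# Minimal sketch (illustrative, not the OptiCom implementation): train an MLP
# to map normalized iris-landmark features to on-screen cursor coordinates.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder calibration data: each sample is a feature vector derived from
# eye landmarks (here, assumed normalized iris-centre x/y for both eyes), and
# the target is the screen point the user was asked to look at.
X = rng.uniform(0.0, 1.0, size=(500, 4))      # [left_x, left_y, right_x, right_y]
y = np.column_stack([                          # synthetic screen coordinates (pixels)
    X[:, [0, 2]].mean(axis=1) * 1920,          # cursor x
    X[:, [1, 3]].mean(axis=1) * 1080,          # cursor y
])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Small multilayer perceptron regressor: two hidden layers with ReLU activations.
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), activation="relu",
                   max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)

# Mean absolute error in pixels as a rough measure of mapping efficiency.
err = np.abs(mlp.predict(X_test) - y_test).mean(axis=0)
print(f"Mean absolute error: x={err[0]:.1f}px, y={err[1]:.1f}px")
```

In practice the input features would come from an eye-landmark detector rather than random data, and the measured pixel error against held-out calibration points would serve as the efficiency test of the trained MLP.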