Clinicians of the Future: Communication

Gazing Towards an Inclusive Future


The eyes view and observe the world daily, influencing one’s decisions and choices. They contribute to body language and reveal our feelings and thoughts. Eye movements are a source of communication in everyday interactions, helping us recognise faces and places and create and retrieve memories that connect the external world to our internal one. A person’s eyes assist with everyday activities such as driving, working, and mobilising from place to place.

Imagine wanting to turn on the TV but being unable to move your body. What would you do? How would you communicate with a friend or family member? Now imagine being unable to talk or move any part of your body except your eyes. You can see the TV and you can see a family member. What do you do?

People living with communication difficulties face these challenges every day. This is where the eyes become a person’s voice, giving someone with a severe disability the ability to turn on the TV or ask for help. Assistive technology utilising eye gaze calibration creates a bridge for a person to complete occupations, socialise and communicate within the community.

Eye gaze technology allows a person to control their environment, communicate, mobilise, perform everyday activities, and increase their independence through eye movement alone. It also gives health professionals research opportunities to identify disease and understand human behaviour by tracking eye movements. As further advancements in assistive technology become available, the world is evolving to create a future of inclusive possibilities.

As a 4th year Occupational Therapy student at Deakin University, I decided to explore and research eye gaze technology for an Innovation Project with Eazilee. This project explored the advancements in eye gaze assistive technology and the importance of access for an inclusive future.

Communication and Occupations

Eye gaze technology makes it possible for someone to communicate, type and write emails, play games, watch videos or participate at work. The overall aim of an eye tracker device for laptops is to replace the traditional mouse and keyboard so a person can continue their occupations. Eye gaze technology is available for children and adults, with software and programs to suit a person’s specific needs.

Tobii Dynavox Eye Tracker

The new Tobii Dynavox ‘PCEye’ eye tracker is a slim, portable device that works indoors and outdoors and fits onto Windows laptops and computers. Tobii offers a range of products, from communication systems to environmental controls.

Look to Speak App

The Look to Speak app was developed by Google and is compatible with Android smartphones. This experimental app uses eye tracking to select phrases and speak them aloud for the person. The eyes are calibrated to the app, which is then controlled by looking up, left or right. If your eyes become fatigued, you can select snooze. Tips and tricks can be found in this downloadable guide: Guide: Look to Speak App. This experimental app is not intended to replace other communication eye gaze technology a person with a disability may require. As an occupational therapy student, seeing this technology on smartphones is an exciting opportunity to give a person a voice and include them in discussions while they participate in activities.
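For readers curious how a handful of gaze directions can select from many phrases, here is a minimal illustrative sketch, not Google’s actual code: the candidate phrase list is repeatedly halved, with a left or right glance choosing a half, until one phrase remains. The function and phrase names are hypothetical.

```python
# Illustrative sketch of Look to Speak-style selection (not vendor code):
# repeated left/right gaze choices narrow a phrase list to one phrase.

def select_phrase(phrases, gaze_events):
    """Halve the candidate list on each 'left'/'right' gaze event
    until a single phrase remains; 'up' snoozes (cancels selection)."""
    candidates = list(phrases)
    for gaze in gaze_events:
        if gaze == "up":          # snooze: rest the eyes, abort selection
            return None
        mid = (len(candidates) + 1) // 2
        candidates = candidates[:mid] if gaze == "left" else candidates[mid:]
        if len(candidates) == 1:
            return candidates[0]  # this phrase is spoken aloud
    return None

phrases = ["Yes", "No", "Help please", "Turn on the TV"]
print(select_phrase(phrases, ["right", "right"]))  # -> Turn on the TV
```

Because each glance halves the options, even a long phrase list needs only a few eye movements to reach any phrase, which matters when eye fatigue is a real constraint.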

Look to Speak App

Controlling the Environment

Eye gaze technology can be calibrated through software that links eye tracking to environmental controls in a person’s home. This gives a person the choice and freedom to complete tasks and activities independently if they want to, providing inclusiveness and increased participation in activities around the household. As an occupational therapy student, I find environmental control through eye gaze an exciting tool, one that gives people options to continue participating in activities that are important and meaningful to them.

Environmental Control Devices
Take, for example, the Tobii Dynavox ‘EyeR’, an add-on for Windows laptops and tablets that connects to infrared devices through the virtual remote in the Windows Control 2 software, allowing appliances in the surrounding environment to be controlled through eye gaze. Or the NeuroNode Trilogy, which combines eye gaze with electromyography (EMG) sensor control through the NeuroNode 3.0 software to control the home environment. The EMG wristband and sensors detect small muscle responses, such as moving the index finger, allowing a person to use both eye gaze and muscle control to complete tasks or control their home environment. Through software like this, a person can control lighting, beds, air conditioners, telephones and many more appliances.
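To make the eye-gaze-plus-EMG pairing concrete, here is a hypothetical sketch, loosely modelled on the approach described above rather than any vendor’s software: the eyes point at an on-screen appliance icon, and a small muscle movement detected by the EMG sensor acts as the “click” that confirms the command. The command names and function are invented for illustration.

```python
# Hypothetical sketch (not vendor code): eye gaze points at an appliance,
# an EMG-detected muscle movement confirms, and an infrared command fires.

APPLIANCE_COMMANDS = {
    "light": "IR_LIGHT_TOGGLE",
    "tv": "IR_TV_POWER",
    "bed": "IR_BED_RECLINE",
}

def control_step(gaze_target, emg_activated):
    """Return the infrared command to send, or None.
    Looking at an icon without an EMG confirmation does nothing,
    so stray glances cannot trigger appliances accidentally."""
    if gaze_target in APPLIANCE_COMMANDS and emg_activated:
        return APPLIANCE_COMMANDS[gaze_target]
    return None

print(control_step("tv", emg_activated=True))   # -> IR_TV_POWER
print(control_step("tv", emg_activated=False))  # -> None
```

Requiring two independent signals (where to look, when to confirm) is the design point: it separates intention from gaze, which is why combining eye tracking with a muscle sensor is useful.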

Tobii Dynavox EyeR

Dr Jordan Nguyen and Riley Experiment 1
Meet Riley and Dr Jordan Nguyen, who worked together to develop environmental eye gaze technology. Dr Jordan Nguyen is a biomedical engineer who designed eye gaze technology using electrical signals and artificial intelligence to allow Riley, who has cerebral palsy, to control appliances in his environment. Riley already uses eye gaze technology for communication but was interested in controlling his environment, and Dr Jordan Nguyen was there to give him this opportunity. In the video below, see the end product of Riley’s eye power and his family’s reaction to the outcome. If you are interested in the whole process, I recommend watching the full documentary Superhuman Part 1.

Mobilising from A to B

Inclusiveness and independence are an option for a person who mobilises in a powered wheelchair, inside or outside. My EccPupil is a pair of German-designed, eye gaze controlled glasses that direct a person’s wheelchair. A person wears these glasses and the software translates the person’s eye movements into the direction their wheelchair moves. The My EccPupil technology won the German Design Award 2021. Watch the video below to see the difference these glasses can make to a person’s experience of independence and interaction within the community.

Dr Jordan Nguyen and Riley Experiment 2

Meet Dr Jordan Nguyen and Riley again. After experiencing environmental control with eye gaze technology, Riley set a goal to be able to drive. Dr Jordan Nguyen and his team created a buggy that Riley can control through eye gaze technology and sensor control. This is an exciting documentary of the buggy’s development, watching Riley put his eyes and mind to the test, learning and adapting eye control movements and sensor control techniques in a single day! Watch the video below to see the end product, or watch the full documentary Superhuman Part 2.

Research and Eye Gaze Technology

AdHawk MindLink
Alongside assisting a person to function within their environment, communicate and mobilise, eye gaze technology has also been incorporated into research. For example, AdHawk Microsystems is researching and creating eye tracking technology that unlocks the connection between the eyes, the brain and the external environment. The AdHawk MindLink glasses are a new eye tracking tool used by researchers to collect health data, such as detecting eye conditions and neurological conditions and tracking human behaviour.

Eye Gaze and Early Interventions
Research has found that eye tracking technology can help identify and diagnose autism in children by detecting instinctive gaze patterns. This allows early interventions to be put in place to assist with the child’s communication and social skills and enhance their ability to participate in everyday activities like play or school.


Henry Regains Independence After Losing Vision to Rare Disease

For Henry Arguello of Florida, who works as a home improvement retailer at Home Depot, life after losing his vision required changing his habits and lifestyle. But that does not mean he is unable to perform his tasks at work by himself.

At the age of 32, Henry started losing his vision to Stargardt disease, an inherited form of macular degeneration that affects younger people. The reason Henry is able to perform his tasks independently at work and at home is the revolutionary assistive technology device, the OrCam MyEye.

Before he started using the OrCam MyEye, Henry constantly struggled with reading small print, whether it was text messages on the screen of his mobile phone or labels on everyday items and packages, such as the ingredients of food and cleaning products.


While working at Home Depot, Henry is responsible for receiving inventory, sorting it, and stocking it on shelves. He is also required to ensure that all of the deliveries that come into the store are the right ones, in the right quantities and measurements. As Henry’s low vision progressed, his attempts to read the products’ names and details became extremely difficult.

To improve his lifestyle and the way he copes with low vision in his everyday life, Henry joined the Job Readiness Program for the Blind and Visually Impaired led by the Miami Lighthouse. The instructors, most of whom are blind themselves, helped Henry gain the confidence he needed and taught him new skills that help people with low vision increase their daily independence. They also taught him how to utilize the newest technology to maintain a fulfilling career regardless of his condition.

While attending a Low Vision Course at the Miami Lighthouse, John, the store manager at the Lighthouse shop, told Henry about the OrCam MyEye. John knew Henry worked at Home Depot and believed he could benefit from using the device.
Henry reading a box with the OrCam MyEye

After receiving training with the OrCam MyEye, Henry quickly realized how easy the device is to use and saw the daily benefits it can provide. Thanks to the OrCam MyEye, he is able to easily identify products and stock them on the correct shelf with complete accuracy. Henry can now help customers find their items by using the OrCam MyEye to read the barcode of any product. He can also use the device to read prices to customers, and even have it read a product description out loud to any customer he is serving.

The OrCam MyEye does not require an internet connection, so Henry never has to worry that the device won’t work inside the store or anywhere else where connectivity is limited. Henry is now able to complete daily tasks on his own, faster and better.

OrCam has also improved Henry’s life after losing his vision in more ways than working independently. Henry uses the OrCam MyEye to read his text messages and the articles he is interested in on his mobile phone. While out shopping on his own, he can easily identify products and read their ingredients. He is also able to recognize money bills with his device while paying at the checkout counter.

Henry says of life with the OrCam: “Without the OrCam I would not be able to do my job to the best I can. It helps me succeed every day.”

Story replicated with permission from Orcam

Communication Every Day

Seeing AI – a bridge for everyday independence

In 2017, Microsoft launched its Seeing AI mobile application with the aim of making the world more freely accessible to individuals living with a visual impairment. The free iOS Seeing AI app allows users to complete multiple tasks within one application, providing audio descriptions of what is in front of the camera. Features include:

– Speaking short text as soon as it appears in front of the camera

– Providing audio guidance to capture a printed page

– Giving audio beeps to help locate barcodes, then scanning them to identify products

– Recognising friends and describing people around you, including their emotions

– Identifying currency bills when paying with cash

– Describing colour and generating a tone for the brightness of a room

Although Seeing AI initially launched in English only, it now supports Dutch, French, German, Japanese and Spanish, opening the door for many non-native English speakers to engage with their world in their native language.

Robin Spinks was born with albinism and has experienced partial blindness all his life. At CSUN 2020, Robin described how applications like Seeing AI have impacted his life as a dad.

Using the example of a family trip to the zoo, Robin explained how Seeing AI had acted as a “bridge” between a dad “who can read but not see” and his young son, who could “see but not yet read”. Unable to visually identify animals beyond “a big black blob” in their enclosures, Robin and his son were able to combine his son’s sight and Seeing AI to share a learning experience using the animal information boards in front of the enclosures.

Robin said, “I was able to talk to my son about the black blob being a Visayan Warty Pig, and have conversations about endangered species and sustainability. We were able to enjoy a full day at the zoo, talking about the animals together.” These types of assistive technologies have been a “game changer” for him as a parent.

Robin’s wife Emma, who is also visually impaired, highlighted the limitations of everyday products for those with visual impairments and how new assistive technologies are helping to close this divide. “Something as simple and private as taking a pregnancy test becomes a challenge when, despite huge advances in technology, there are no accessible pregnancy test kits for the visually impaired,” shared Emma. “Imagine how hard it might be for a young person taking a test who doesn’t want to tell anyone the results.” Applications like Seeing AI can now take a photo of the test and tell you the result, adding a layer of independence to a task that sighted people often take for granted.

Emma shares many of her experiences and wisdom as a visually impaired mum of two young sighted boys through her podcast One Blind Mum.


Robin Spinks and son. Image: RNIB