Egocentric Vision: Why a Scientist Leverages Wearable Cameras as a Sensor Modality

Oct. 7, 2024
A neuroscientist applies first-person (egocentric) vision, pairing wearable cameras with deep-learning models, to advance sensory feedback systems that could one day help patients with vision loss.

The final installment of a three-part interview series with Dr. Brokoslaw Laschowski highlights his research in vision technology.

Laschowski, a neuroscientist with expertise in robotics and artificial intelligence, underscored how heavily humans rely on vision. He said that his research lab takes special interest in “teaching computers to see like humans.”

Until a few years ago, robotic prosthetic legs and exoskeletons were designed without any vision. “These robots were essentially walking blind, and that was really the genesis of my program,” explained the research scientist and principal investigator at the Toronto Rehabilitation Institute.

Since then, other research labs have started to investigate wearable cameras as a sensor modality for perceiving the surrounding walking environment. To make sense of the visual input, researchers like Laschowski develop a range of machine learning models, from basic classifiers such as support vector machines and linear discriminant analysis to deep architectures such as convolutional neural networks, recurrent neural networks and LSTMs.
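For illustration, here is a minimal sketch of what one such model might look like in PyTorch; the architecture, class labels and framework are assumptions made for this example, not Laschowski's published models.

```python
# Minimal sketch (PyTorch) of an environment classifier for wearable-camera
# frames. The architecture and class labels are illustrative assumptions,
# not the lab's actual models.
import torch
import torch.nn as nn

# Hypothetical walking-environment classes a controller might care about.
CLASSES = ["level_ground", "incline", "decline", "stairs_up", "stairs_down"]

class SceneClassifier(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)   # (N, 32)
        return self.classifier(x)         # (N, num_classes) logits

# One 224x224 RGB camera frame in, one predicted environment class out
# (the network is untrained here, so the prediction is arbitrary).
frame = torch.rand(1, 3, 224, 224)
logits = SceneClassifier()(frame)
print(CLASSES[logits.argmax(dim=1).item()])
```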

Designing high-performance, highly efficient deep-learning models that “can be deployed, run in real time and accurately detect the environment and are able to understand the visual scene” remains a complex challenge, he said.
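To make that real-time constraint concrete, the sketch below times a lightweight, off-the-shelf backbone against a per-frame budget; MobileNetV3-Small, the input size and the 30 fps budget are assumptions for illustration, not details of his lab's deployment.

```python
# Minimal sketch of checking a real-time budget: time a lightweight,
# off-the-shelf backbone on single frames. MobileNetV3-Small and the 30 fps
# budget are stand-in assumptions, not the lab's deployment setup.
import time
import torch
from torchvision.models import mobilenet_v3_small

model = mobilenet_v3_small().eval()    # untrained weights; speed is what matters here
frame = torch.rand(1, 3, 224, 224)     # one camera frame

with torch.no_grad():
    for _ in range(5):                 # warm-up iterations
        model(frame)
    runs = 50
    start = time.perf_counter()
    for _ in range(runs):
        model(frame)
    ms_per_frame = (time.perf_counter() - start) / runs * 1000

# A 30 fps camera leaves roughly 33 ms per frame for the entire pipeline.
print(f"{ms_per_frame:.1f} ms per frame")
```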

All the technology his lab develops is general purpose. Predictions from the smart glasses his team develops can be used to interface with a robotic prosthesis, an exoskeleton, a powered wheelchair or a smart walker. And instead of driving a mechatronic system, those same predictions could be used to interface directly with the user.
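One way to picture that general-purpose routing is a simple mapping from a scene prediction to a target-specific action; the devices, modes and messages in the sketch below are hypothetical placeholders rather than an actual control interface.

```python
# Hypothetical sketch of routing one scene prediction to different interfaces.
# Device names, modes and messages are placeholders, not an actual control API.
PREDICTION_TO_ACTION = {
    "exoskeleton":        {"stairs_up": "stair_ascent_mode",   "level_ground": "walk_mode"},
    "powered_wheelchair": {"stairs_up": "stop_and_warn",       "level_ground": "cruise"},
    "user_feedback":      {"stairs_up": "audio: stairs ahead", "level_ground": "no alert"},
}

def route_prediction(prediction: str, target: str) -> str:
    """Translate one environment prediction into a target-specific action."""
    return PREDICTION_TO_ACTION[target].get(prediction, "default")

print(route_prediction("stairs_up", "exoskeleton"))    # stair_ascent_mode
print(route_prediction("stairs_up", "user_feedback"))  # audio: stairs ahead
```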

Future Vision for Seeing

Laschowski plans to expand the vision research that he and others are doing in brain-machine interfaces. He imagines an invasive, intracortical, bidirectional interface that could both read information from and write information to the brain. For inspiration, he looks to systems developed by Neuralink, one of Elon Musk’s companies.

“We could do something along the lines of having a chip that’s implanted into the visual cortex of the brain—the back of the brain, the part responsible for vision,” he said. “The idea is that we could wirelessly interface our smart glasses with a brain implant, and then translate the pixels that are sensed with our camera, decode that visual information and then stimulate areas of the brain in the visual cortex to be able to elicit a concept known as phosphene.”

A phosphene is the sensation of seeing light without light actually entering the eye. Laschowski explained that if one can map pixels from the camera to electrical stimulation of the brain through neuromodulation, then one could potentially recreate the sensation of sight artificially.
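A deliberately simplified sketch of that pixel-to-stimulation mapping is shown below: downsample a camera frame to a coarse grid and treat each cell's brightness as a relative stimulation level. The grid size and linear scaling are assumptions for illustration; real neuromodulation is far more involved.

```python
# Very simplified sketch of the pixel-to-stimulation idea: average-pool a
# grayscale camera frame down to a coarse electrode grid and treat each cell's
# brightness as a relative stimulation level. The grid size and linear scaling
# are assumptions; real neuromodulation is far more complex than this.
import numpy as np

def frame_to_stimulation(frame_gray: np.ndarray, grid=(10, 10)) -> np.ndarray:
    """Pool a grayscale frame (H, W) down to a (rows, cols) grid of levels in 0..1."""
    h, w = frame_gray.shape
    rows, cols = grid
    crop = frame_gray[: rows * (h // rows), : cols * (w // cols)].astype(float)
    pooled = crop.reshape(rows, h // rows, cols, w // cols).mean(axis=(1, 3))
    return pooled / 255.0

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in camera frame
levels = frame_to_stimulation(frame)
print(levels.shape)  # (10, 10) grid of per-electrode intensities
```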

“That’s one of my long-term goals, where we’re able to use technology like smart glasses to restore vision for patients with blindness and visual impairments,” he said. 

Watch additional parts of this interview series with Dr. Brokoslaw Laschowski.

About the Author

Rehana Begg | Editor-in-Chief, Machine Design

As Machine Design’s content lead, Rehana Begg is tasked with elevating the voice of the design and multi-disciplinary engineer in the face of digital transformation and engineering innovation. Begg has more than 24 years of editorial experience and has spent the past decade in the trenches of industrial manufacturing, focusing on new technologies, manufacturing innovation and business. Her B2B career has taken her from corporate boardrooms to plant floors and underground mining stopes, covering everything from automation & IIoT, robotics, mechanical design and additive manufacturing to plant operations, maintenance, reliability and continuous improvement. Begg holds an MBA, a Master of Journalism degree, and a BA (Hons.) in Political Science. She is committed to lifelong learning and feeds her passion for innovation in publishing, transparent science and clear communication by attending relevant conferences and seminars/workshops. 

Follow Rehana Begg via the following social media handles:

X: @rehanabegg

LinkedIn: @rehanabegg and @MachineDesign
