Vision is arguably the most important of the five basic senses, giving most people the ability to navigate the world, but an estimated 285 million visually impaired people worldwide need assistance to augment their sight. In some cases, the best solution might be a large vision-boosting headset; in others, a less conspicuous haptic wearable could serve as a strong primary or secondary aid. That’s where Foresight’s new AI-powered navigation aid comes in.
Developed by a team of Harvard students, Foresight places soft robotic actuators inside a device that’s worn like a vest, turning camera input from a smartphone into localized sensations of force across the wearer’s torso. Using a customized version of the YOLO computer vision model, Foresight detects, classifies, and estimates the movement of objects around the user, then applies more or less pressure at various points on the torso depending on how close those objects are. Even without vision, a user could distinguish between a mostly open path ahead, a wall to the left, and a person approaching from the front.
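Foresight’s code isn’t public, but the pipeline described above (per-frame object detection, a rough distance estimate for each object, and a distance-to-pressure mapping across torso zones) can be sketched. Below is a minimal illustration in Python, assuming the open-source ultralytics YOLO package as a stand-in for the team’s custom model; the four-zone layout, the bounding-box-height distance heuristic, and the `send_pressure` actuator stub are all hypothetical.

```python
# Minimal sketch of the described pipeline: per-frame YOLO detection,
# a crude monocular distance estimate, and distance-to-pressure mapping.
# Assumptions: the open-source `ultralytics` package stands in for
# Foresight's custom model, and `send_pressure` is a hypothetical stub
# for the soft-robotic actuator interface.
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # stand-in for Foresight's custom detector
NUM_ZONES = 4               # hypothetical: torso split into 4 columns

def estimate_distance_m(box_height_px: float, frame_height_px: float) -> float:
    """Crude monocular heuristic: taller boxes are closer.
    A real system would use calibrated depth estimation."""
    fraction = box_height_px / frame_height_px
    return max(0.5, 5.0 * (1.0 - fraction))  # clamp to a 0.5-5 m range

def distance_to_pressure(distance_m: float) -> float:
    """Map distance to a 0-1 pressure command: closer means stronger."""
    return max(0.0, min(1.0, 1.0 - distance_m / 5.0))

def send_pressure(zone: int, pressure: float) -> None:
    """Hypothetical actuator stub; the real device inflates soft textiles."""
    print(f"zone {zone}: pressure {pressure:.2f}")

cap = cv2.VideoCapture(0)  # the phone camera in the real product
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    zone_pressure = [0.0] * NUM_ZONES
    for box in model(frame, verbose=False)[0].boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        # Assign the detection to a torso zone by its horizontal center.
        zone = min(int((x1 + x2) / 2 / w * NUM_ZONES), NUM_ZONES - 1)
        dist = estimate_distance_m(y2 - y1, h)
        zone_pressure[zone] = max(zone_pressure[zone],
                                  distance_to_pressure(dist))
    for zone, p in enumerate(zone_pressure):
        send_pressure(zone, p)
cap.release()
```

Taking the maximum pressure per zone mirrors the article’s description of closer objects producing stronger force; Foresight’s actual distance estimation and zone layout haven’t been published.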
As is often the case with vision assistive technologies, the end product isn’t merely about providing helpful functionality, but also about balancing the wearer’s needs for dignity and practicality. The researchers describe Foresight as “discreet, affordable, and intuitive,” using inflatable soft textiles on the body rather than uncomfortably vibrating haptic motors, a design that can be mass-manufactured without “high-end fabrication facilities.” That’s key to making the wearable affordable in lower-income parts of the world.
Unsurprisingly, Foresight relies on a smartphone to provide the core camera and computer vision functionality, with renderings showing an iPhone 11 Pro mounted centrally just below the wearer’s neck. That position lets the phone monitor the environment from the wearer’s perspective, then issue real-time haptic commands to the vest over Bluetooth. The product will most likely support a range of smartphones rather than requiring $1,000-plus models, though there will surely be a baseline of required camera and AI processing capability.
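Foresight hasn’t published its Bluetooth protocol, but the phone-to-vest link described above can be illustrated with a minimal sketch using the open-source bleak BLE library; the device address, GATT characteristic UUID, and one-byte-per-zone frame format below are hypothetical placeholders.

```python
# Illustrative sketch of streaming per-zone pressure commands over BLE
# using the open-source `bleak` library. The device address, the GATT
# characteristic UUID, and the one-byte-per-zone frame format are all
# hypothetical; Foresight's actual protocol is not public.
import asyncio
from bleak import BleakClient

VEST_ADDRESS = "AA:BB:CC:DD:EE:FF"                      # hypothetical
PRESSURE_CHAR = "0000ffe1-0000-1000-8000-00805f9b34fb"  # hypothetical

def encode_frame(zone_pressures: list[float]) -> bytes:
    """Pack 0-1 pressures into one byte per zone (0-255)."""
    return bytes(int(max(0.0, min(1.0, p)) * 255) for p in zone_pressures)

async def stream_pressures(frames) -> None:
    """Write each pressure frame to the vest's (assumed) characteristic."""
    async with BleakClient(VEST_ADDRESS) as client:
        for zone_pressures in frames:
            await client.write_gatt_char(PRESSURE_CHAR,
                                         encode_frame(zone_pressures))
            await asyncio.sleep(0.05)  # ~20 Hz update rate

# Example: ramp pressure up on one torso zone.
asyncio.run(stream_pressures([[p / 10, 0.0, 0.0, 0.0]
                              for p in range(11)]))
```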
Foresight’s team is refining both the software and the sensors to ensure that the wearable’s environmental imaging is useful for wearers, and expects the finished solution to be “another tool in their arsenal” rather than a full replacement for other assistive navigation technologies. There’s no release date or pricing yet, but the team is working with Harvard Innovation Lab’s Venture Incubation Program to commercialize the design.