Blind Facebook employee develops tech that uses AI to verbalise the content of an image or video to help the visually impaired

Facebook engineer Matt King is leading a project developing solutions for visually impaired people on the platform that could eventually be used to identify images and videos

A blind Facebook employee is developing a technology that will use Artificial Intelligence (AI) to verbalise the content of an image or video, enabling the visually impaired to "see" and helping determine appropriate content for users and advertisers.

People stand in front of a logo at Facebook's headquarters in London. Image: Reuters
King is leading a project developing solutions for visually impaired people on the platform that could eventually be used to identify images and videos that violate Facebook's terms of use or that advertisers want to avoid.

"More than two billion photos are shared on Facebook every single day. That's a situation where a machine-based solution adds a lot more value than a human-based solution ever could," CNBC quoted King as saying late on Saturday.

King, who was born with a degenerative eye disease called retinitis pigmentosa, lost his vision by the time he got his degree and began working on accessibility projects at IBM.

He worked on a screen reader to help visually impaired people "see" what is on their screens either through audio cues or a braille device. IBM eventually developed the first screen reader for a graphical interface.

Matt King. Image: AP
He worked with IBM's accessibility team until Facebook hired him in 2015.

At Facebook, he works on features to help people with disabilities use the platform, like adding captions to videos or coming up with ways to navigate the site using only audio cues.

"Anybody who has any kind of disability can benefit from Facebook. They can develop beneficial connections and understand their disability doesn't have to define them, to limit them," King said.

One of his main projects is "automated alt-text," which describes audibly what is in Facebook images.

When automated alt-text was launched in April 2016, it was only available in five languages on the iOS app. Today it is available in over 29 languages on Facebook on the web, iOS and Android.

"The things people post most frequently kind of has a limited vocabulary associated with it," the Facebook engineer said.

"It makes it possible for us to have one of those situations where if you can tackle 20 percent of the solution, it tackles 80 percent of the problem. It's getting that last 20 percent which is a lot of work, but we're getting there," he said.

In December 2017, Facebook pushed an automated alt-text update that used facial recognition to help visually impaired people find out who is in photos.
