If you are blind, a modern phone can be life-changing. Apps like TapTapSee connect the phone's camera to a sighted person so that someone can tell you what's in front of you. But there may be a delay while you're matched with that person, and you might not feel entirely comfortable sharing your life with a stranger. Could the phone recognise things by itself?
The answer is yes and no, and it's what I've been working on at City, University of London for a while. Yes, in that machine learning algorithms have been trained on image datasets, and phones running them really can pick things out of the camera feed. The no comes in practice: the things being picked out don't seem to be what's salient to the visually impaired. So, amongst other things, I'm building an iOS app so the blind can show us what's important to them. A camera app, for the blind.
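To give a flavour of the "yes" part, here's a minimal sketch of what on-device recognition looks like on iOS, using Apple's Vision and Core ML frameworks. The model (MobileNetV2, from Apple's model gallery) and the single-frame setup are stand-ins for illustration, not the app I'm actually building; a real app would run something like this over a live AVCaptureSession feed and speak the result aloud.

```swift
import CoreML
import Vision

// A rough sketch of on-device image classification, not the finished app.
// Assumes MobileNetV2.mlmodel (from Apple's model gallery) has been added to
// the Xcode project, which generates the MobileNetV2 class used below.
func describe(_ frame: CGImage) {
    // Wrap the Core ML classifier so the Vision framework can drive it.
    guard let model = try? VNCoreMLModel(for: MobileNetV2(configuration: MLModelConfiguration()).model) else {
        print("couldn't load the model")
        return
    }

    // The completion handler receives classification labels ranked by confidence.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        // A real app would speak this (e.g. via AVSpeechSynthesizer) rather than print it.
        print("this looks like: \(best.identifier), \(Int(best.confidence * 100))% sure")
    }

    // Run the model on a single frame; the app itself would feed frames from the camera.
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```

The catch, as above, is what the labels are: a stock model will happily tell you it sees "a cup", while what matters to the person holding the phone might be something it was never trained on.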