A new project explores automatic person recognition in visual surroundings using the search giant’s new wearable.
A new project pairs a smartphone with Google Glass to help the wearer identify people by their clothing. ‘InSight’ is complementary to facial recognition: even if a person’s back is turned, it can recognize them from visual cues and human motion patterns.
Developed by Dr. Srihari Nelakuditi from the University of South Carolina and Dr. Romit Roy Choudhury from Duke University, the project has received the prestigious Google Faculty Research Award. It is expected to facilitate human recognition and also provide a useful primitive for human-centric augmented reality.
InSight examines the colors and patterns of clothing, along with a person’s movement, to build a “fingerprint” that identifies individuals. New Scientist reports that this fingerprint is constructed by a smartphone app, which takes a series of photos and creates a file capturing the spatial distribution of colors, textures, and patterns in a person’s clothes. This makes people easier to identify from different angles and at greater distances.
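The article does not publish InSight’s actual algorithm, but the core idea of a clothing “fingerprint” can be sketched in simplified form: quantize the clothing pixels into a coarse color histogram and compare two such histograms with cosine similarity. Everything below — the function names, the 4-bin quantization, and the toy pixel data — is an illustrative assumption, not the researchers’ code.

```python
# Illustrative sketch of a color-based clothing "fingerprint" (hypothetical,
# not InSight's actual method): a coarse RGB histogram plus cosine matching.
from math import sqrt

def color_fingerprint(pixels, bins=4):
    """Quantize (r, g, b) pixels (values 0-255) into a normalized histogram."""
    step = 256 // bins
    hist = [0.0] * (bins ** 3)
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def similarity(fp_a, fp_b):
    """Cosine similarity between two fingerprints (1.0 means identical)."""
    dot = sum(a * b for a, b in zip(fp_a, fp_b))
    na = sqrt(sum(a * a for a in fp_a))
    nb = sqrt(sum(b * b for b in fp_b))
    return dot / (na * nb) if na and nb else 0.0

# Two photos of the same red-and-blue outfit, seen from slightly
# different angles, should produce closely matching fingerprints...
view1 = [(200, 30, 30)] * 60 + [(30, 30, 200)] * 40
view2 = [(210, 25, 35)] * 55 + [(25, 35, 210)] * 45
# ...while a green outfit should not match at all.
other = [(30, 200, 30)] * 100

print(similarity(color_fingerprint(view1), color_fingerprint(view2)))
print(similarity(color_fingerprint(view1), color_fingerprint(other)))
```

A real system would also encode texture and spatial layout (which part of the outfit each color comes from), which is why the histogram alone is only a starting point.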
The “fingerprint” is only temporary, since it changes when the person swaps their outfit. In the team’s tests with 15 volunteers, the system identified people 93% of the time, even when they had their backs turned to the Google Glass wearer.