A robot developed by Cornell computer scientists uses a Kinect sensor and a dataset of videos to analyze human movements.
A robot developed by computer scientists at Cornell University uses a Kinect sensor and a dataset of videos to analyze and predict human movements. It can identify body movements associated with an activity, such as reaching, moving, or eating, which lets it help people perform everyday tasks.
Anticipating which activities a human will do next (and how to do them) can enable an assistive robot to plan ahead for reactive responses in human environments.
The Verge reports that the researchers use complex algorithms to detect movements associated with different activities and to infer people's actions. This lets the robot predict future actions, and it can also learn from its mistakes to improve over time.
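The article does not detail the researchers' models, which operate on rich RGB-D sensor data. As a purely illustrative sketch of the underlying idea, predicting the next activity from observed sequences, here is a hypothetical first-order Markov model over activity labels; the function names and the training sequences are made up for this example:

```python
from collections import Counter, defaultdict

def train_transitions(sequences):
    """Count how often each activity follows another in training sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            counts[current][nxt] += 1
    return counts

def predict_next(counts, current):
    """Return the most frequently observed activity after `current`."""
    if current not in counts:
        return None
    return counts[current].most_common(1)[0][0]

# Hypothetical labeled activity sequences (illustrative only).
sequences = [
    ["reach", "grasp", "pour", "drink"],
    ["reach", "grasp", "eat"],
    ["reach", "grasp", "pour", "place"],
]
model = train_transitions(sequences)
print(predict_next(model, "grasp"))  # prints "pour", the most common successor
```

A real anticipation system also needs to segment continuous sensor data into activities and weigh object affordances, which this toy model ignores.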
The robot achieved 83.1% accuracy in detecting high-level activities, outperforming the other algorithms it was tested against. You can check out the robot anticipating human activities in the video below: