A ‘Truman Show’-like experiment covering 90,000 hours of real-life footage aims to uncover how word formation develops in babies.
How language emerges during infancy has fascinated enquiring minds for well over two millennia. Deb Roy of MIT has offered remarkable new insights into this field, drawing on sophisticated data-analysis software and terabytes of raw data.
Fast Company reports that the raw data in question was provided by Deb’s own family:
From the day he and his wife brought their son home five years ago, the family’s every movement and word was captured and tracked with a series of fisheye lenses in every room in their house. The purpose was to understand how we learn language, in context, through the words we hear.
Using cutting-edge vocal-analysis software, the Roys were able to parse 200 terabytes of data representing 90,000 hours of their life. This parsing software was able to capture the emergence and development of words in their son’s vocabulary.
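The article doesn’t describe the MIT system’s actual algorithms, but the basic idea of tracking when a word first emerges in a child’s vocabulary and how its usage grows can be sketched as a toy frequency count over timestamped transcripts (all names and data below are illustrative assumptions, not from Roy’s software):

```python
from collections import defaultdict

def word_emergence(transcripts):
    """Given (timestamp, utterance) pairs sorted by time, return the
    timestamp at which each word first appeared and its total count."""
    first_seen = {}               # word -> timestamp of first utterance
    counts = defaultdict(int)     # word -> cumulative count
    for ts, utterance in transcripts:
        for word in utterance.lower().split():
            counts[word] += 1
            if word not in first_seen:
                first_seen[word] = ts
    return first_seen, dict(counts)

# Toy example: "water" emerging on day 2 and growing in frequency.
transcripts = [
    (1, "gaga baba"),
    (2, "water baba"),
    (3, "water water gaga"),
]
first_seen, counts = word_emergence(transcripts)
print(first_seen["water"])  # 2
print(counts["water"])      # 3
```

A real system would of course work from automatic speech recognition over raw audio rather than clean transcripts; this sketch only illustrates the emergence-tracking step.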
The software ups the ante even further with 3D visualisations that enable Roy’s team to zoom through the auditory topography of the household, showing where the most frequent words were uttered. This remarkable technological research will continue at MIT, but Roy has plans to bring the platform into more commercial contexts, such as charting how social-media environments interact with live television broadcasts.
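The “where were words uttered” aspect of those visualisations boils down to aggregating utterances by location. A minimal sketch, assuming hypothetical (word, room) event records rather than the project’s real fisheye-camera data:

```python
from collections import Counter

def word_heatmap(events, word):
    """Count, per room, how often a given word was uttered.
    events: iterable of (word, room) pairs (illustrative format)."""
    return Counter(room for w, room in events if w == word)

events = [
    ("water", "kitchen"),
    ("water", "kitchen"),
    ("ball", "living room"),
    ("water", "bathroom"),
]
print(word_heatmap(events, "water"))  # Counter({'kitchen': 2, 'bathroom': 1})
```

The actual system renders these counts as navigable 3D surfaces over a model of the house; the per-room tallies here stand in for that spatial binning.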