Microsoft's Situated Interaction project features automated secretaries and smart elevators.

In an effort to prove that humans and robots can interact on a more meaningful level, Microsoft has showcased its Situated Interaction project, an immersive experience led by distinguished scientist Eric Horvitz and his colleague Dan Bohus. The fully immersive experience features elevators that can predict whether you need a ride and robot secretaries that consult work calendars to allow or deny appointments.

The project relies on intensive integration of multiple computational competencies and methods, including machine vision, natural language processing, machine learning, automated planning, speech recognition, acoustical analysis, and sociolinguistics. It also pushes into a new area of research: building automated systems that understand multiparty interaction.
