Automatic DJs And Invisible Instruments
A handful of standout projects from Music Hack Day NYC demonstrate how magic and wonder are part of the new face of music creation and distribution.
This weekend, General Assembly hosted the first-ever Music Hack Day NYC, bringing together programmers to combine music-service APIs in interesting new ways. Opening remarks aptly noted that the structure of the music industry is undergoing radical change at this very moment, and that the old gatekeepers of how music is created and distributed are being actively replaced by the people in attendance. We’ve captured a few standout examples of how people are accessing and creating music below.
Automatic DJ creates party and venue playlists tailored to the people who walk through the door. A camera at the entrance uses facial recognition to identify each person’s Facebook profile. Combining Facebook likes with users’ linked Hunch data, Automatic DJ pulls matching tracks from the Echo Nest’s expansive database of musical data. These recommendations then feed a dynamic Spotify playlist that can be streamed during the event.
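The pipeline described above can be sketched in a few functions. Everything here is a hypothetical stand-in with mocked data, not the team's actual code or the real Facebook, Hunch, Echo Nest, or Spotify APIs:

```python
# A minimal sketch of the Automatic DJ pipeline. All helper functions and
# data are mocks invented for illustration.

def identify_guest(camera_frame):
    """Stand-in for facial recognition: map a camera frame to a Facebook profile ID."""
    return "fb_user_123"  # mock match

def fetch_music_tastes(profile_id):
    """Stand-in for combining Facebook likes with linked Hunch taste data."""
    return ["indie rock", "electronica"]  # mock genre affinities

def recommend_tracks(tastes):
    """Stand-in for an Echo Nest-style recommendation lookup."""
    catalog = {
        "indie rock": ["Arcade Fire - Ready to Start"],
        "electronica": ["Four Tet - Angel Echoes"],
    }
    return [track for taste in tastes for track in catalog.get(taste, [])]

def update_playlist(playlist, camera_frame):
    """Each guest at the door adds their recommendations to the shared queue."""
    profile = identify_guest(camera_frame)
    playlist.extend(recommend_tracks(fetch_music_tastes(profile)))
    return playlist

party_playlist = update_playlist([], camera_frame=None)
print(party_playlist)
```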
While the Automatic DJ team was only able to demonstrate a brief proof of concept using a single person’s face, the camera-at-entry idea is entirely feasible even now: digital artist Zach Gage executed it well in an exhibition of his game Killing Spree, which incorporated the faces of people entering the exhibition into the game itself.
A similar project was Youzakk, a working service intended for bars and other night spots. Given only a venue’s name, Youzakk automatically generates a playlist based on the tastes of the people who frequent the establishment: it matches users who have checked in on Foursquare to their musical preferences on Hunch and Facebook, and builds a Spotify playlist accordingly.
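The interesting step in Youzakk is aggregating many patrons' tastes into one venue profile. A toy sketch of that aggregation, with every name and data structure invented for illustration (the real service queries the Foursquare and Hunch/Facebook APIs):

```python
from collections import Counter

# Mock: users who have checked in at the venue, per Foursquare.
CHECKINS = {"Example Bar": ["ann", "ben", "cal"]}

# Mock: each user's genre preferences, per Hunch/Facebook.
PREFS = {"ann": ["soul", "funk"], "ben": ["funk"], "cal": ["soul"]}

def venue_genres(venue_name, top_n=2):
    """Rank the genres most common among a venue's Foursquare regulars."""
    counts = Counter(genre
                     for user in CHECKINS.get(venue_name, [])
                     for genre in PREFS.get(user, []))
    return [genre for genre, _ in counts.most_common(top_n)]

print(venue_genres("Example Bar"))  # soul and funk each appear twice
```

A playlist builder would then feed the top genres into a track lookup, as in the Automatic DJ flow.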
The crowd-determined winner of Music Hack Day NYC was Tim Soo’s project, “Invisible…Stuff,” which demonstrated how a Wiimote accelerometer, a few mobile apps, and some clever coding can create an invisible, gesture-controlled violin. Watch a video of the demonstration below:
Winning second place among the crowd, djtxt lets party guests instantly create collaborative playlists via SMS. Guests text in song names, which automatically populate a central playlist.
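The core of djtxt's flow is simple: treat each incoming message body as a song request and queue it. A minimal sketch, with the message format and duplicate handling assumed for illustration (the actual hack handled real SMS delivery):

```python
# Shared playlist that every incoming text appends to.
playlist = []

def handle_sms(body):
    """Treat the SMS text as a song request and queue it once."""
    song = body.strip()
    if song and song not in playlist:  # skip blanks and repeat requests
        playlist.append(song)
    return playlist

handle_sms("Hey Ya - OutKast")
handle_sms("  Hey Ya - OutKast ")      # duplicate request, ignored
handle_sms("Maps - Yeah Yeah Yeahs")
print(playlist)                        # two distinct songs, in request order
```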
Stringer took third place: a virtual string instrument combining the metro-line-inspired mta.me with a Kinect 3D camera. A person draws their own instrument in the air using gestures, and multiple people can play at once thanks to separate hand tracking. Strings can be set to morph as people interact, with length determining pitch, creating a constantly changing melody.
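The length-determines-pitch mapping follows how real strings behave: an ideal string's frequency is inversely proportional to its length. A sketch of that mapping, with the constant chosen arbitrarily rather than taken from the hack itself:

```python
# f = K / L: shorter drawn strings sound higher, like a real instrument.
K = 220.0  # Hz*m: under this assumption, a 1 m "air string" sounds A3

def pitch_hz(length_m):
    """Map a drawn string's length (meters) to a frequency (Hz)."""
    return K / length_m

print(pitch_hz(1.0))  # 220.0
print(pitch_hz(0.5))  # halving the length doubles the pitch: 440.0
```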
Watch a video demonstration below: