Google Has Begun Replacing Apps With Neural Networks
Google is advancing its AI capabilities to give 'ambient awareness' to the Pixel 2 phone
Google is making phones smarter with machine learning, and in the process replacing apps with a more advanced operating system. Now Playing, a new feature for the Pixel 2 phone, is essentially a new and improved Shazam: it listens continuously and can identify any of 70,000 songs without help from the internet.
Instead of requiring users to ask what song is playing in the background, Now Playing shows the answer immediately on the phone's lock screen. Google calls this anticipatory technology "ambient awareness," and it took years of development. Because the audio matching is done on the phone rather than in the cloud, the phone must carry its own reference data: to identify a song, incoming audio has to match an audio fingerprint stored in the phone's database.
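The on-device lookup described above can be sketched as a nearest-neighbor search over compact binary fingerprints. This is an illustrative toy, not Google's actual algorithm: the fingerprints, the Hamming-distance metric, and the `max_distance` threshold are all assumptions made for the example.

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary fingerprints."""
    return bin(a ^ b).count("1")

def match_song(query: int, database: dict, max_distance: int = 8):
    """Return the title whose stored fingerprint is closest to the query,
    or None if nothing is within max_distance bits."""
    best_title, best_dist = None, max_distance + 1
    for title, fp in database.items():
        d = hamming(query, fp)
        if d < best_dist:
            best_title, best_dist = title, d
    return best_title

# Hypothetical 16-bit fingerprints for three songs (real systems use
# much larger codes and many snippets per song).
db = {
    "Song A": 0b1010110011100101,
    "Song B": 0b0111001010011010,
    "Song C": 0b1110000111000011,
}
noisy_query = 0b1010110011100110  # "Song A" with two bits flipped by noise
print(match_song(noisy_query, db))  # → Song A
```

Matching on distance rather than exact equality is what lets a fingerprint computed from noisy, distorted audio still land on the right song.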
The researchers who created Now Playing first built a database of 70,000 sound fingerprints, each a compact snapshot of a song's waveform. To generate them, they used a neural network that compresses a song's audio into a tiny, unique, recognizable fingerprint. This was difficult because each fingerprint had to retain enough information to match even distorted samples.
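The idea of compressing audio into a small, noise-tolerant code can be illustrated with a toy projection-and-binarize step. The real system uses a trained neural network over audio features; the random projection, dimensions, and noise level below are stand-in assumptions that only show why nearby audio yields nearby fingerprints.

```python
import random

random.seed(0)
DIM_IN, DIM_OUT = 32, 16  # 32 audio features -> 16-bit fingerprint
# Fixed random projection standing in for a trained network's weights.
PROJECTION = [[random.gauss(0, 1) for _ in range(DIM_IN)]
              for _ in range(DIM_OUT)]

def fingerprint(features):
    """Project the feature vector and binarize each output into one bit."""
    bits = 0
    for row in PROJECTION:
        activation = sum(w * x for w, x in zip(row, features))
        bits = (bits << 1) | (1 if activation > 0 else 0)
    return bits

clean = [random.gauss(0, 1) for _ in range(DIM_IN)]
noisy = [x + random.gauss(0, 0.05) for x in clean]  # mild distortion

fp_clean, fp_noisy = fingerprint(clean), fingerprint(noisy)
# Mild distortion flips few or no bits, so the codes stay close.
print(bin(fp_clean ^ fp_noisy).count("1"))
```

Because each bit only records the sign of a weighted sum over many samples, small perturbations rarely flip it, which is the property the article describes as being "useful with distorted samples."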
When sound comes into the phone, it is first filtered by the Pixel's DSP, a low-power chip that also listens for activation words like "Okay Google." The DSP listens continuously until it recognizes music. Once it does, it wakes the Pixel's main processor, which runs the Now Playing neural network. Within a few seconds, the algorithm tries to match the incoming audio fingerprint against the songs in the phone's database.
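The two-stage, power-saving design described above can be sketched as a cheap always-on gate in front of an expensive matcher. The function names, the energy threshold, and the placeholder matcher are all illustrative assumptions, not Google's implementation.

```python
MUSIC_ENERGY_THRESHOLD = 0.5  # assumed tuning value, not a real DSP parameter

def dsp_detects_music(samples) -> bool:
    """Cheap always-on stage: stand in for the DSP's music detector
    with a simple average-energy test."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > MUSIC_ENERGY_THRESHOLD

def run_matcher(samples) -> str:
    """Expensive stage: placeholder for the Now Playing neural network
    and fingerprint lookup."""
    return "Song A"  # hypothetical match result

def now_playing(samples):
    """Only wake the expensive stage when the cheap stage fires."""
    if not dsp_detects_music(samples):
        return None  # main processor stays asleep, saving battery
    return run_matcher(samples)

print(now_playing([0.01] * 100))      # quiet room → None
print(now_playing([0.9, -0.8] * 50))  # loud signal → Song A
```

Gating the heavy neural network behind a low-power detector is what makes a constantly listening feature viable on a battery-powered phone.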
Researchers at Google see more opportunities on the horizon to run AI directly on the device. While some features will always rely on the cloud for its much larger databases, others could be handled by an on-device neural network.