Environment as an instrument

In a collaboration between Artiphon and Snapchat, we developed the Scan Band lens together with Studio ANRK. This lens turns your environment into an instrument, which you can then play with your body. We used Lens Studio's new machine learning capabilities to achieve a few different goals. In this blog we'll tell you more about machine learning and how we used it in this filter!

Recognising objects in your environment

First, we used machine learning to recognise objects in your environment, packed into Lens Studio 4.0's new Scan functionality. Once the lens has recognised objects and placed instruments around you, Snap's new SnapML Audio comes into play. Why? Because this new feature enables the use of machine learning to analyse, control and play sounds.
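To make this a bit more concrete, here is a minimal sketch in Lens Studio's JavaScript scripting of how a recognised object could trigger an instrument sound. The onObjectScanned callback and the label-to-track mapping are our own illustration, not the actual Scan API, and the audio track inputs are placeholders:

```javascript
// Hypothetical sketch: map a scanned object label to an instrument sound.
// onObjectScanned stands in for the real Scan result handling, which we
// are not reproducing exactly here.
// @input Asset.AudioTrackAsset drumTrack
// @input Asset.AudioTrackAsset synthTrack

var audio = script.getSceneObject().createComponent("Component.AudioComponent");

// Illustrative mapping from a recognised object to the sound it triggers.
var instrumentForLabel = {
    "chair": script.drumTrack,
    "plant": script.synthTrack
};

// Called with the label of a recognised object (hypothetical hook).
function onObjectScanned(label) {
    var track = instrumentForLabel[label];
    if (track) {
        audio.audioTrack = track;
        audio.play(1); // play the instrument sound once
    }
}
```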

Persistence and SnapML Audio

After you've played your music, we made use of Snapchat's persistence system. This way we could save the selected instruments, the scanned objects and their placement, so the experience can be replayed without having to start from scratch every time.
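As a minimal sketch, this is roughly how such saving and restoring looks with Lens Studio's persistent storage; the key names and the serialised placement string are illustrative, not taken from the actual lens:

```javascript
// Sketch of saving and restoring the band setup with Lens Studio's
// persistent storage. The keys and value format are illustrative.
var store = global.persistentStorageSystem.store;

function saveSetup(instrumentName, placementJson) {
    store.putString("instrument", instrumentName);
    store.putString("placement", placementJson);
}

function loadSetup() {
    if (!store.has("instrument")) {
        return null; // first run: nothing saved yet
    }
    return {
        instrument: store.getString("instrument"),
        placement: store.getString("placement")
    };
}
```

When the lens is reopened, loadSetup() can rebuild the scene so the band is exactly where you left it.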

Making use of SnapML Audio also meant it was possible to play sounds without having to record them through the microphone. In other words: they play directly from your phone. This makes the quality of the played audio a lot better than before, because the sound no longer has to pass from your speaker to your microphone.
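Under the hood, that direct path is simply an audio component playing a bundled track. A small sketch, where the instrumentLoop input is our own placeholder:

```javascript
// Direct playback sketch: the sound is a bundled AudioTrackAsset, so it
// never takes the speaker-to-microphone round trip.
// @input Asset.AudioTrackAsset instrumentLoop

var player = script.getSceneObject().createComponent("Component.AudioComponent");
player.audioTrack = script.instrumentLoop;
player.volume = 1.0; // straight from the asset, at full quality
player.play(-1);     // -1 loops the track until it is stopped
```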

Image: Artiphon machine learning

Want to read more about turning your environment into an instrument? Find the portfolio item here.