Environment as an instrument
In a collaboration between Artiphon and Snapchat, we helped develop the Scan Band lens together with Studio ANRK. The lens turns objects in your environment into instruments and lets you play them with your body. Lens Studio's new machine learning capabilities were used to achieve several different goals.
First, machine learning is used to recognise objects in your environment, via Lens Studio 4.0's new Scan functionality. Once the objects around you have been recognised and placed, Snap's new SnapML Audio comes into play: this feature enables the use of machine learning to analyse, control and play sounds.
Once you have played some music, Snapchat's persistence system saves your selected instruments, scanned objects and their placement, so the experience can be replayed without having to start from scratch every time.
SnapML Audio also made it possible to play sounds directly from your phone rather than recording them through the microphone. The quality of the played audio is therefore much better than before, since the sound no longer has to pass from your speaker to your microphone.
Want to read more about turning your environment into an instrument? You can find the portfolio item here.