AR keeps evolving, fueled by advancements like Apple's Vision Pro and Meta's Quest 3/Quest Pro. At arfected, we know AR is the secret sauce for next-level experiential marketing. From uncovering hidden information layers and virtual signage to gamified content unlocks and augmented stage designs, AR lets you push the boundaries!
We've teamed up with OBE on some mind-blowing AR experiences over the past few years. With our shared expertise in crafting meaningful AR interactions that take attendee experiences to the next level, here’s our take on the top AR trends to watch in experiential marketing in the year ahead and beyond.
Generative AI tools like Midjourney and ChatGPT can speed up creative workflows, particularly in early concept development. For example, arfected uses Midjourney in ideation sessions to get quick, specific inspiration at the start of projects. It’s a helpful tool for putting color palettes together and creating a wide variety of looks and feels, as well as for rapidly evaluating options for materials, surfaces, and compositions. ChatGPT can similarly be used for concept thought starters and generating rough copy ideas. Generating 3D designs is much more complex, but some steps of the 3D pipeline, such as texturing, can also be accelerated with AI tools like the Dream Textures add-on for Blender.
We are also seeing more AI-powered tools added to platform frameworks, such as TikTok Effect House’s Generative Effects, which applies generative adversarial networks (GANs) to effects, or Snapchat Lens Studio’s ability to create filters that incorporate ChatGPT prompting.
Generative AI tools accelerate our workflow and increase productivity, resulting in more accessible custom-built experiences at lower price points. Brands will, therefore, be able to accommodate AR experiences within a wide range of campaign budgets.
We will also see more personalized AR experiences that can be fully customized to the individual.
Since 3D design is an essential, and substantial, component of AR workflows, and 3D modeling is a complex task even without AI, it will take some time before Generative AI can meaningfully assist with the 3D design portions of those workflows. Generative models that can produce complex 3D models with good topology, clean UVs, textured materials, and mesh animations, while still keeping file sizes and poly counts low, are quite a ways off.
Advancements in script generation that allow for more complex functionality will also increasingly help reduce development time.
With the release of new Mixed Reality headsets in recent months, including the Apple Vision Pro and Meta Quest 3, we are continuing to see significant investments into next-generation mixed reality interfaces by major tech companies.
Hardware advancements in AR headsets mean increasing support for more realistic and immersive experiences. For example, the Vision Pro introduces the R1 chip, a new piece of Apple silicon that can process large amounts of complex sensor data in real time for a more accurate understanding and rendering of the real world. Shortening processing time (Apple claims it can process data eight times faster than the blink of an eye) will also help reduce the motion sickness common to XR (extended reality) experiences, which include AR, VR and MR.
However, while headsets will become more ubiquitous as they grow more powerful, affordable and comfortable, they still face challenges in wide adoption by everyday consumers.
Experiential is a strong mixed reality use case that doesn't necessarily require people to invest in their own hardware. We will see more mixed reality experiences in experiential marketing built for headsets and more sophisticated user experiences for both headsets and mobile AR that can fully engage the user with an extended layer of IRL events. Brands in experiential marketing will be able to leverage mixed reality for more complex and nuanced storytelling.
In the last four years, we have already seen social media platforms raise their filter file-size limits thanks to gains in mobile computing power with each new generation of phones. As computing power increases further on both mobile and headsets, we will continue to see more AI support, improvements in render engines, more realistic and higher-resolution depictions of reality, and much more.
We also see big leaps in solving the problem of comfort when wearing headsets for extended periods, as they get lighter and lag times are reduced. This will go a long way toward consumer adoption of headsets. While their functionality is less advanced, Ray-Ban Meta smart glasses, for example, are already a highly wearable product and illustrate a viable version of an AR interface for daily use.
Powerful cross-platform AR tools like ARCore and Unity allow creators to build AR experiences once and then easily publish them to multiple platforms, including Android, iOS, and different headsets, as well as to retrofit previous builds. Apple, for example, has released an SDK for the Vision Pro but does not have its own cross-platform development tool, instead relying on partners like Unity that are gaining more users every year.
With improvements in compatibility across different devices, AR experiences built for experiential marketing campaigns will be able to reach larger audiences with a lower cost of development per user, providing better ROIs and creating stronger business cases for brands.
Platform-agnostic software may provide a way to publish augmented reality across social platforms simultaneously. Social AR is currently limited by each platform having its own tool and features, and brands often have to make tradeoffs on which platform to use.
The introduction of spatial computing with the Apple Vision Pro SDK has created a new paradigm for user interaction. Built for applications beyond gaming (compared to the gaming-first approach of Meta and other headset makers), the Vision Pro has created new methods for interaction that were not previously used by default, most notably the removal of hand controllers as the primary way to interact with a program.
In place of hand controllers, new interaction behaviors like eye tracking, hand tracking, pinching motions, and voice control are opening up many fresh possibilities in how we interact with experience elements in 3D space.
Although these new interaction methods may take some time for users to adjust to, they are more intuitive and similar to our natural human movements and behaviors compared to interactions created for 2D interfaces. Spatial audio also adds an additional layer of multi-sensory engagement with users, furthering the realism of XR experiences.
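To make this concrete, the core of a controller-free pinch interaction can be reduced to a distance check between tracked fingertips. The sketch below is illustrative only: the joint coordinates and the 2 cm threshold are assumptions for the example, not values from any specific SDK, though hand-tracking APIs on these platforms expose similar per-joint positions.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points (meters)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold_m=0.02):
    """Treat thumb and index fingertips closer than ~2 cm as a pinch.

    The 2 cm threshold is an assumed value for illustration.
    """
    return distance(thumb_tip, index_tip) < threshold_m

# Hypothetical fingertip positions from two hand-tracking frames.
open_hand = ((0.00, 0.00, 0.30), (0.08, 0.02, 0.30))   # fingers apart, ~8 cm
pinch = ((0.00, 0.00, 0.30), (0.01, 0.005, 0.30))      # fingers touching, ~1 cm

print(is_pinching(*open_hand))  # -> False
print(is_pinching(*pinch))      # -> True
```

In a real experience this check would run on live tracking data every frame, typically with smoothing and hysteresis so the gesture doesn't flicker between pinched and released states.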
XR experiences will become more immersive and require less effort to engage with, with more fine-tuned control over the level of immersion for each use case. XR for brands in experiential marketing will become even more impactful in building a memorable connection with the audience, providing the ability to literally transport them into a brand’s world.
Each platform currently has different best practices for user interaction design, such as how to tap and drag or move an object. We expect new standards or protocols for user interaction in 3D contexts to emerge that everyone will converge on, similar to the evolution we saw with smartphone gestures like swiping and tapping that are now second nature for many. arfected is conducting internal research to determine these best practices.
In 2022, Google released the Geospatial API, which leverages billions of images from Google Maps Street View for its Visual Positioning System (VPS), allowing developers to easily anchor outdoor AR content in over 87 countries with just a location’s latitude, longitude and altitude. Last year, they released the Streetscape Geometry API, allowing developers to “interact, visualize and transform building geometry around the user” by providing a 3D mesh of nearby buildings and terrain from the user’s mobile location and classifying features in a scene. Similarly, in 2022 Niantic introduced Lightship VPS for Web on 8th Wall, bringing their location-anchoring technology to browser-based AR experiences.
These and other developments in accurate localization outdoors help developers understand the user’s location and what they’re viewing, making it easier than ever to build AR experiences that layer on real-world locations with centimeter-level accuracy.
Discovering and interacting with AR is getting faster for users, with AR interactions precisely matched to what they see in the real world for more delight and utility. Brands at outdoor event venues like sports stadiums or large parks can use AR layers that improve the attendee experience, such as virtual directions that help fans find their seats, concession stands and restrooms, or an AR game to play while waiting.
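As a simplified illustration of the wayfinding idea, the distance from a user's GPS fix to a set of geolocated venue anchors can be computed with the haversine formula, and the nearest anchor surfaced as an AR direction. The anchor names and coordinates below are made up for the example; a production geospatial SDK would handle the actual localization and anchoring.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 coordinates."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical venue anchors: name -> (latitude, longitude).
anchors = {
    "concessions": (40.75010, -73.99345),
    "restrooms": (40.74980, -73.99310),
}

def nearest_anchor(user_lat, user_lon):
    """Pick the anchor closest to the user's reported position."""
    return min(anchors, key=lambda name: haversine_m(user_lat, user_lon, *anchors[name]))

print(nearest_anchor(40.74985, -73.99315))  # -> restrooms
```

GPS alone is only accurate to a few meters; the centimeter-level precision described above comes from layering VPS (matching the camera feed against imagery of the location) on top of a coarse GPS fix like this one.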
Current implementations of environment-wide AR experiences like Niantic’s Peridot deploy small-scale AR on a worldwide scale. This is a possible precursor to applications that go beyond localizing small AR experiences to a system that works at a much wider environmental scale, with a wider experience radius anchored by GPS.
Advancements in AR hardware and software are creating exciting opportunities for experiential marketing. We partner with brands to develop bespoke AR experiences that achieve specific campaign goals. Contact us here to explore the potential of AR for your brand, or have a look at our work to see all the possibilities with immersive tech.