Meta rolls out new tools & capabilities for Spark AR creators


Meta has released the latest version (136) of Spark AR Studio, which adds new audio capabilities, tools for creating depth-responsive effects, a depth colour overlay, and more to enhance the creation of immersive AR experiences.

Version 136 of Spark AR Studio also includes streamlined access and controls for occlusion, and can be used alongside Meta's recent updates such as hand and body tracking.

Audio Capabilities

Meta has launched a new audio engine that enables advanced audio integrations in effects, along with improved audio processing that lets creators fuse multiple audio sources into layered audio effects. Voice effects, sound effects, and music tracks can now be processed to create effects for Reels.

The audio engine, currently available only for AR effects on Instagram, introduces six new patches – Mixer, Gain, Oscillator, Vocoder, Filter, and Compressor – as well as a collection of new asset patches in the Spark AR Library, including Audio Fade, Pulse Limiter, Loop Player, and more.
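To make the Gain and Mixer patches concrete, here is a minimal sketch of what they do conceptually: scaling a signal's amplitude and summing several signals into one layered track. This is plain Python operating on lists of samples, purely for illustration – Spark AR's patches are visual nodes, and the function names here are assumptions, not its API.

```python
# Conceptual sketch of Gain and Mixer patch behaviour on audio samples
# in the range [-1.0, 1.0]. Not the Spark AR API.

def gain(samples, factor):
    """Scale a signal's amplitude, like a Gain patch."""
    return [s * factor for s in samples]

def mix(*tracks):
    """Sum signals sample-by-sample and clamp, like a Mixer patch."""
    length = max(len(t) for t in tracks)
    mixed = []
    for i in range(length):
        total = sum(t[i] for t in tracks if i < len(t))
        mixed.append(max(-1.0, min(1.0, total)))  # clamp to avoid clipping overflow
    return mixed

# Layer a quieter voice track over a music track (toy 3-sample signals).
voice = [0.5, 0.5, 0.5]
music = [0.8, -0.8, 0.8]
layered = mix(gain(voice, 0.6), gain(music, 0.5))
```

Chaining the two operations this way mirrors how patches are wired together in the patch editor: the output of each Gain feeds an input of the Mixer.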

The engine lets creators mix music tracks, add voice notes, and combine multiple audio sources, and is designed to customize and control a range of audio elements in AR effects. To help creators experiment with and discover audio ideas, Meta has also released two new templates, Piano Project and Audio Visualizer, which can be used to start a new audio effect or to create multisensory effects.


Depth Responsive Effects

Meta has released a new depth-mapping capability called Camera Depth Texture that allows creators to detect the relative distance of surfaces and objects from the camera and extract this data as a texture. The technology is similar to the way smartphone cameras interpret a scene to create a depth effect in portrait images. The extracted data can be used to create effects that respond to depth, or to drive post-processing and lighting effects.
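The idea behind a depth-responsive effect can be sketched as follows: treat the depth data as a 2D grid of camera-to-surface distances and derive a mask that selects only surfaces within a chosen range. The function name, grid values, and thresholds below are illustrative assumptions, not the Camera Depth Texture API.

```python
# Hedged sketch: derive a depth-range mask from a tiny "depth texture".
# Not the Spark AR API; values are illustrative.

def depth_mask(depth_grid, near, far):
    """Return 1.0 where a pixel's distance falls inside [near, far], else 0.0."""
    return [[1.0 if near <= d <= far else 0.0 for d in row] for row in depth_grid]

# A toy 2x3 depth grid: each value is a distance from the camera in metres.
depth = [[0.5, 1.2, 3.0],
         [0.4, 2.5, 4.0]]

# Select only mid-range surfaces (1-3 m) for an effect to respond to.
mask = depth_mask(depth, near=1.0, far=3.0)
```

In a real effect, a mask like this would gate where a colour overlay, blur, or lighting treatment is applied, so the effect tracks the scene's geometry rather than the flat screen.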

Occlusion Controls

The occlusion feature, which lets creators blend virtual objects into real space to create mixed-reality experiences that place augmented elements against a real-world backdrop, has been streamlined to give creators easier access. With occlusion, creators can give virtual objects a more believable sense of space by partially obscuring them with other objects or by hiding them entirely from a user's field of view.
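At its core, occlusion is a per-pixel depth test: a virtual object is drawn only where it is closer to the camera than the real surface at that pixel. The sketch below shows that generic logic under assumed names and values; it is not the Spark AR occlusion control itself.

```python
# Generic per-pixel depth-test logic behind occlusion. Illustrative only.

def composite_pixel(real_depth, virtual_depth, real_color, virtual_color):
    """Show the virtual pixel only if nothing real sits in front of it."""
    return virtual_color if virtual_depth < real_depth else real_color

# A virtual cube placed 2.0 m from the camera.
visible = composite_pixel(real_depth=3.0, virtual_depth=2.0,
                          real_color="wall", virtual_color="cube")
occluded = composite_pixel(real_depth=1.5, virtual_depth=2.0,
                           real_color="pillar", virtual_color="cube")
```

Where a real pillar (1.5 m) sits in front of the cube (2.0 m), the cube's pixel is hidden; against a more distant wall (3.0 m), the cube shows – which is what makes a virtual object feel anchored in real space.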

Depth Colour Overlay

As part of the new updates, Meta has also released a new template called Depth Color Overlay, which can be used to create a colourful sonar-like pulse effect.
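One plausible way to build a sonar-like pulse from depth data – an assumption about the general technique, not the template's actual implementation – is to sweep a "pulse front" outward over time and light up each pixel according to how close its depth is to the front's current radius.

```python
# Hedged sketch of a depth-driven sonar pulse. All names and constants
# are illustrative, not taken from the Depth Color Overlay template.

def pulse_intensity(pixel_depth, time_s, speed=1.0, width=0.3):
    """Brightness in [0, 1]: highest where depth matches the moving pulse front."""
    front = (time_s * speed) % 5.0      # front sweeps 0-5 m, then repeats
    distance = abs(pixel_depth - front)
    return max(0.0, 1.0 - distance / width)

# At t = 2 s the front is at 2 m: a surface 2 m away glows fully,
# one at 2.15 m glows at about half intensity, one at 4 m stays dark.
```

Multiplying this intensity into an overlay colour per pixel produces the ring of colour washing over the scene's geometry that characterises a sonar effect.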

The new features and tools can also be used in combination with one another to create experimental AR effects and experiences.