With rivals like Meta making major investments in AI, Snapchat, the popular social media platform owned by Snap, has announced its own artificial intelligence-powered features. The company, known for pioneering and promoting augmented reality (AR) technologies, will bring AI-generated special effects to users through Lens Studio 5.0.
AR overlays computer-generated effects onto real-world videos and photos, and Snapchat's new feature promises a more immersive and engaging user experience through its real-time image model. Users will be able to type in prompts and apply effects to their videos and images, producing realistic AR experiences in real time.
What are Snapchat's real-time AR experiences?
Snapchat's planned AR experiences use generative AI to create special effects, called lenses, that will be available to AR developers. Developers can then use these lenses to craft unique experiences, which Snapchat users can apply to their own content. For instance, users will be able to change their surroundings and clothing in the videos they shoot.
When you apply an effect to your photo or video, it will automatically adjust itself to match the colors and the lighting in your content, resulting in a seamless experience. You can generate custom lenses using different text prompts.
Lens Studio is getting an upgrade
Snapchat's developer tool, Lens Studio, is also getting an upgrade in line with the new AI features. The platform allows developers and artists to create AR features for applications and websites, including Snapchat itself. With the new update, Lens Studio gains several generative AI tools, including an AI assistant that can help answer developers' queries.
Additionally, developers can now generate 3D images directly from text prompts, removing the need to model them manually. This can significantly reduce the time and effort involved in creating AR effects while allowing artists to produce more complex ones. The generated 3D images can mimic users' expressions, and creators can even generate face masks and textures.
While the earlier version of Lens Studio supported simple effects, such as placing a hat on a person's head, the new version lets that hat adapt to the lighting in the video or image and even move along with the person's head.
Besides this, the new version includes Immersive ML and face effects, which let creators alter a user's face based on an uploaded image or a written prompt. Immersive ML can transform a user's surroundings, face, and body in real time.
With its new generative AI-powered features, Snapchat aims to take on its rivals and attract more users with unique effects. It is also targeting developers by making the process of designing AR features easier and faster. The company plans to bring more AI features to the platform in the future, such as designing entire outfits in AR.
As for availability, the user-facing features will roll out to regular users in the coming months, while those designed for creators will be released by the end of the year.