Motion Portal - An Experience Powered by Kinetix
Kinetix Motion-Capture for Metaverse Engagement.
Jan 2023 - 2 sprints // Kinetix // Metaverse, Web3, UGC, Marketing.
Role: UX/UI Designer & Researcher, Product Owner/Manager (Scrum Leader).
Tools & Methods: Double-Diamond Method, Figma, Jira, Agile, Notion, Slack, Kinetix Tech, Pen & Paper!
— Partners & Project
In January 2023, Kinetix, alongside four other young start-ups, was selected to take part in the ‘Creativity in the Metaverse’ programme run by Meta, L’Oréal and HEC Paris at Station F in Paris, France.
The Metaverse, however seemingly immature, is still considered promising, and through this programme L’Oréal were (and still are) focussed on pioneering different approaches in the service of beauty tech.
This initiative was an opportunity to experiment, explore and create synergies between Kinetix technology and L’Oréal’s brand objectives.
L’Oréal has also been keen to hone in on the possibilities that Web3 has to offer, an effort spearheaded by Camille Kroely, L’Oréal’s Head of Digital Innovation. Kinetix, too, has been present in the world of Web3, having launched the first NFT Emote Marketplace in 2022.
A mutual partner has been Ready Player Me, a company that enables users to create personalised avatars and integrate them seamlessly across more than 4,000 platforms, and which is commonly used by Virtual World builders.
In November 2022, two of the Group’s leading brands, Maybelline and L’Oréal Professionnel, premiered exclusive makeup and hair styles for avatar creation on Ready Player Me. This provided Kinetix with a wonderful opportunity to experiment with our proprietary motion-capture technology and enable a new kind of Brand Activation showcasing this collaboration.
— Brief
The core objective of the brief was to curate and design an immersive digital escapade, enticing L’Oréal’s brands’ communities to partake in an engaging experience. The goal was to create an interactive journey that not only fostered content creation for social media dissemination but also signalled the brands’ prominent presence within the Metaverse.
Following discussions between Kinetix’s Head of Product and the L’Oréal team, it was decided that Kinetix would create a mobile-first progressive web app (PWA). This app, featuring a Ready Player Me character as the brand's avatar, would enable users to generate and share videos of their moves performed by the avatar on social media. A progressive web app offered a mobile-first approach with cross-platform compatibility, as well as the potential for future expansion into a standalone application.
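For context, a PWA is a regular web app made installable and app-like via a web app manifest (plus a service worker). Purely as an illustration, and not the actual Motion Portal configuration, a minimal manifest for this kind of experience could look like the following (shown as a typed object for readability; in practice it is served as plain JSON):

```ts
// Hypothetical web app manifest for a mobile-first PWA; all values are placeholders.
const webAppManifest = {
  name: "Motion Portal",
  short_name: "Motion Portal",
  start_url: "/",
  display: "standalone",      // launches full-screen, like a native app
  orientation: "portrait",    // mobile-first experience
  background_color: "#ffffff",
  theme_color: "#000000",
  icons: [
    { src: "/icons/icon-192.png", sizes: "192x192", type: "image/png" },
    { src: "/icons/icon-512.png", sizes: "512x512", type: "image/png" },
  ],
};

export default webAppManifest;
```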
Although initially designed and branded for L’Oréal, the Motion Portal was targeted as a B2B2C solution for other brands wishing to partner with Kinetix. The objective was that Kinetix would ultimately be able to offer a white-label version of this experience, which could then be adapted (using a .JSON configuration file) and re-branded depending on the client and their campaign/business objectives.
This opportunity would, and indeed did, allow Kinetix to carry on exploring the use of its technology within a mobile context and to draw lessons which could then be applied to the Kinetix SDK’s Companion Web-App (a product which would allow Virtual World builders to let their players create custom Emotes in-game).
The main Kinetix tech features (based on Kinetix Studio’s existing functionalities) to be showcased in the Motion Portal were the following:
Motion Capture: via a video recording or uploading a video file.
Video Rendering: deliver a video of an animated avatar performing the moves recorded or uploaded by the user, in a custom-branded backdrop.
Potential Additional Features: avatar and environment personalisation, adding a Style filter to the move (drunk, robot, old, etc.), and minting the movements, as .glb and/or .fbx files, into an NFT.
* Technical Constraints *
Ensure users submit clear videos so that Kinetix tech can accurately extrapolate their movements and apply them precisely to 3D avatars. This is crucial, as non-conforming videos can degrade the quality of the final video output.
Processing the final output is resource-intensive and impacts video rendering times, especially when several processes run simultaneously.
Steps include: extrapolating the user’s movements from the video, applying the movements to a 3D character, and generating the final video output (the avatar performing the user’s move in a branded environment); see the sketch after this list.
➡️ We would therefore ultimately need to test processing times and design variant versions and flows of the experience to accommodate the results of those tests and provide users with the most enjoyable version of the experience.
FBX & GLB animations define skeletal joint positions, but require retargeting for different character rigs to ensure accurate movement.
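To make the processing constraint concrete, here is a toy sketch of the three sequential steps behind each render, with a cap on simultaneous jobs. It is purely illustrative: none of these function names exist in the Kinetix stack, and the durations and concurrency limit are arbitrary placeholders.

```ts
// Toy model of the three-step pipeline and why simultaneous jobs stretch waiting times.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Stand-ins for the real, resource-intensive steps (durations are invented).
async function extractMotion(videoId: string) { await sleep(3_000); return { videoId }; }
async function retargetToAvatar(motion: { videoId: string }) { await sleep(2_000); return motion; }
async function renderBrandedVideo(anim: { videoId: string }) { await sleep(5_000); return `renders/${anim.videoId}.mp4`; }

const MAX_CONCURRENT_JOBS = 2;        // placeholder: rendering capacity is limited
let running = 0;
const waiting: Array<() => void> = [];

export async function processUserVideo(videoId: string): Promise<string> {
  if (running >= MAX_CONCURRENT_JOBS) {
    // Extra submissions queue up, which is what lengthens end-to-end times at scale.
    await new Promise<void>((resolve) => waiting.push(resolve));
  }
  running++;
  try {
    const motion = await extractMotion(videoId);        // 1. extrapolate movements
    const animation = await retargetToAvatar(motion);   // 2. apply them to the 3D character
    return await renderBrandedVideo(animation);         // 3. render the branded video
  } finally {
    running--;
    waiting.shift()?.();
  }
}
```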
— Process
01/ Understand L’Oréal and the End-User
Given this product was destined to be a B2B2C solution, it was necessary to adapt our UX Methodology to address “the problem(s)” facing both the brand (“B”usiness - L’Oréal) and the end user (“C”onsumer).
➡️ L'Oréal (Brand): Establish brand presence and credibility in the metaverse with a high-performing solution.
➡️ Users: Engage in a shareable, rich, and novel metaverse experience.
02/ Identify the problem
Created a persona (Lucy) based on our user research to help empathise with the user.
Determined the technical constraints of Kinetix tech which would impact the overall solution to the “problem”.
Defined a problem scenario based on getting Lucy engaged with the brand activation:
😕 Problem: “In a landscape flooded with ads, users are searching for ways to identify L'Oréal's presence in the Metaverse, ensuring their experience remains focused and meaningful.”
🤷 How might we use Kinetix technology to provide users with personalised and relevant interactions with L'Oréal's brand within the dynamic landscape of the Metaverse?
03/ Ideate Solutions & Test
Tech limitations (i.e. generating 3D videos) were a major determinant in crafting an appropriate solution. We also needed users to sign in to our services and agree to the cookie policy, for example.
Undertook Product Discovery, carrying out a competitive and comparative analysis of mobile-first digital experiences which use the phone camera and avatars. These included Animaker (a VTuber interactive anime character builder Android app), Avatar Cam (a mobile application leveraging augmented reality to create customisable digital avatars from users' real-time camera input) and Geenee’s experience created for Prudential and the NHL team, the New Jersey Devils.
➡️ We specifically looked at how these experiences dealt with flow, onboarding, creation process, customising the character & environment, as well as sharing (share-kit integration).
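As an illustration of the share-kit pattern we were comparing (not our production code), the Web Share API lets a mobile browser hand a rendered video straight to the native share sheet; the file name and copy below are placeholders:

```ts
// Sketch of native sharing from a mobile browser via the Web Share API.
async function shareRenderedVideo(videoUrl: string): Promise<void> {
  const response = await fetch(videoUrl);
  const file = new File([await response.blob()], "my-move.mp4", { type: "video/mp4" });

  if (navigator.canShare?.({ files: [file] })) {
    // Opens the OS share sheet (Instagram, TikTok, Messages, etc.).
    await navigator.share({ files: [file], title: "My avatar's moves" });
  } else if (navigator.share) {
    // Fallback: share a link when file sharing isn't supported.
    await navigator.share({ title: "My avatar's moves", url: videoUrl });
  }
}
```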
The Product Discovery Process
Campaign Scenario
Created a scenario for a brand activation for L’Oréal, which would structure the flow for the product. This first version served as a demo of how L’Oréal could use this product in a real-life brand activation. The scenario was as follows:
L’Oréal plans to launch a competition where users animate their Metaverse Mascot (by Ready Player Me), Anya, using Kinetix technology and share the video on social media for a chance to win a trip to Paris Fashion Week.
➡️ The guiding strategy is to turn Kinetix’s weaknesses in terms of relatively long processing times into opportunities for brands to sell products/services or deliver a message.
Designing & Testing User Flows
Created a flow with two variations depending on video processing and generation times: a synchronous and an asynchronous version of the flow.
After testing the video generation at scale, we ultimately opted for the asynchronous version so as to mitigate any frustration for the user: we were unable to give users accurate processing-time estimates in real time, and did not want to provide inaccurate feedback which would degrade the experience. A simplified sketch of this asynchronous pattern follows below.
Created Lo-fi wireframes for these two variations of the app flow, keeping in mind the white-label nature of the product: the design needed to be simple and adaptable to any future brand partner via a configuration file (JSON) customising visuals, fonts, the competition scenario, brand avatars & environment, etc.
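As mentioned above, the asynchronous variant means the client fires the job off and lets the user move on, with the result delivered once it is ready rather than behind an on-screen countdown we could not estimate reliably. A simplified sketch of that pattern, with hypothetical endpoint names:

```ts
// Simplified sketch of the asynchronous flow; the /jobs endpoints are hypothetical.

// 1. Upload the user's video; the server answers immediately with a job id
//    instead of keeping the connection open while it renders.
export async function submitMove(video: Blob, email: string): Promise<string> {
  const body = new FormData();
  body.append("video", video);
  body.append("email", email); // a link to the finished video can be sent here later
  const res = await fetch("/jobs", { method: "POST", body });
  const { jobId } = await res.json();
  return jobId;
}

// 2. Optionally, if the user stays on the page, poll occasionally for the result
//    rather than promising a precise progress bar.
export async function waitForResult(jobId: string, intervalMs = 10_000): Promise<string> {
  for (;;) {
    const res = await fetch(`/jobs/${jobId}`);
    const { status, videoUrl } = await res.json();
    if (status === "done") return videoUrl;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```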
A snapshot of one of the first versions of the Motion Portal flow
➡️ After making some corrections to the final flow based on feedback from the Head of Product, the developers, and some guerrilla user testing with colleagues, the flow was finalised and it was time to jump into the Hi-fi prototype.
A snapshot of some of our Lo/Mid-fi wireframes
— Motion Portal: Final Flow
— HI-FI Prototype
The Design System for the Motion Portal Experience was based on Google’s Material Design System. It was then modified to match L’Oréal’s brand identity.
🔀 Fork Between Sync. and Async. flows:
🛣️ Sync. and Async. flows reunite:
— Outcome
01/ Presentation to L’Oréal Group
Our Head of Product & CEO presented a demo of the PWA to the L’Oréal group. They were very pleased with the result, and we began working on a version of the product for one of the Japanese brands within the group (I must keep this brand confidential).
➡️ Ultimately, we did not go ahead with a campaign, for both technical and commercial reasons.
02/ JSON Configuration file
Created a document for future clients to fill in so that the Motion Portal could be configured to their brand identity and campaign objectives.
Created a .JSON configuration file for the Motion Portal in order to minimise any future design work and allow developers to handle a campaign with minimal back-and-forth; a hypothetical sketch of its shape is shown below.
A snapshot of the JSON configuration file
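To give an idea of its shape, here is a hypothetical sketch of such a configuration (written as a typed object for readability; in practice it is serialised to JSON). The field names and values are illustrative and do not reproduce the actual file:

```ts
// Hypothetical shape of the white-label configuration; not the real Motion Portal file.
interface MotionPortalConfig {
  brand: { name: string; logoUrl: string; primaryColor: string; font: string };
  campaign: { title: string; description: string; ctaLabel: string; termsUrl: string };
  avatar: { readyPlayerMeUrl: string };   // the brand's Ready Player Me mascot
  environment: { sceneUrl: string };      // branded 3D backdrop for the final render
  share: { hashtags: string[] };
}

const demoConfig: MotionPortalConfig = {
  brand: {
    name: "L'Oréal",
    logoUrl: "https://example.com/logo.svg",
    primaryColor: "#000000",
    font: "Helvetica Neue",
  },
  campaign: {
    title: "Dance your way to Paris Fashion Week",
    description: "Animate Anya with your best move and share it to enter.",
    ctaLabel: "Record my move",
    termsUrl: "https://example.com/terms",
  },
  avatar: { readyPlayerMeUrl: "https://example.com/anya.glb" },
  environment: { sceneUrl: "https://example.com/runway-scene.glb" },
  share: { hashtags: ["#MotionPortal"] },
};

export default demoConfig;
```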
03/ Motion Portal for The Sandbox
Released a version of the Motion Portal for popular game The Sandbox, a Kinetix partner and investor.
The Sandbox used the Motion Portal for promotional purposes on their stand at Paris Blockchain Week 2023. We were technically able to go ahead with the Motion Portal as there would be very few simultaneous processes taking place.
We received some very insightful user feedback from their team which pushed us to design a simplified and stripped-down version of the app for events. This version of the app would allow staff at the stand to handle the whole process (film visitors’ moves) and guests would receive the result via e-mail. The event version of the app was used by The Sandbox at an event in Hong Kong.
04/ An opportunity to learn for future Kinetix (mobile) products
The entire Motion Portal project was a tremendous learning experience in many regards: designing for mobile, usability, our AI (powered by Machine Learning) and, most importantly, our backend architecture.
In the video demo, you can see how we used some of our learnings in developing the user-generated emote feature on the Kinetix Emote SDK.
— View the Demo
— Conclusion
Key Learnings
Our backend, in its state at the time, could not handle the amount of simultaneous processing and video rendering which would have been necessary given the scale of the audience the brand was seeking to reach.
This type of campaign made little commercial sense to Kinetix given our computing costs and the budgets which clients would usually be willing to spend.
Onboarding and video recording instructions are crucial for successful animation extraction. Users tend to gloss over or ignore text so we were forced to re-design a more visual and less text-heavy introduction process.
We understood that we also had to optimise our re-targeting capabilities in the backend and the Kinetix Emote SDK, in order to cater to a greater variety of avatar types (at the time, bipeds only).
Next Steps
Kinetix as a company was focussed on developing and commercialising the Kinetix Emote SDK for virtual world builders so allocating both material and human resources to what essentially was an advertising format made little strategic sense.
Work on the Motion Portal was put on hold so we could re-focus on products complementary to the Kinetix Emote SDK (such as the front and back office to manage the SDK), which showed greater market potential.
We kept a simplified event version of the Motion Portal live for demos and any future events.