Enhance the virtual makeup experience

I have been testing a lot of the capabilities of [ DeepAR Studio & effect tester app ] over the past few days. I am still learning the software, and I have been comparing my experience with it to other platforms' custom software such as Snapchat and TikTok. [DeepAR] is clearly powerful and easy to use at the same time, but I see that the face mesh tracking is not as accurate on mobile as on the computer. On mobile (iOS) it seems very shaky, and 2D assets such as lips and eyes are also unstable compared to other platforms.
I think this will not be exciting for the app users.
Could you check that out please?
@jelena @Zak

Can you give us more info on what exactly is behaving incorrectly and in which circumstances?
The 2D meshes are used in the Beauty3000 app and in our experience are pretty stable

I tried the makeup effects in the Beauty 3000 app on iOS.
There is a noticeable difference in the performance of the effect and the stability of the face tracking.
For example, with the beauty effect template in the DeepAR effect tester app, the lipstick and eye makeup look shaky (vibrating in place).
In Beauty 3000, by contrast, the effects are stable and the performance is perfect.
Please check it yourself @jelena

Will do, which platform for the tester app were you using?

iPhone 11
iOS 16.4.1(a)
Thanks @jelena, I hope this issue is resolved quickly.

That should be the exact same tracking then. What effect were you testing?

demo0.mp4.zip (5.6 MB)
Hi @jelena, I hope you’re doing well.
I made this video as a comparison between the Beauty 3000 app and the DeepAR tester app.
I still think that Beauty 3000 is better in terms of the stability of the makeup parts and face tracking in general, even though it has not been updated for a long time.
Please look into this issue.

I came here to ask the exact same question. The beauty filter in the DeepAR tester app feels shaky, like 2015-era technology. But(!) when looking at the official YouTube tutorial it looks fine: Editing the Beauty Template in DeepAR Studio - Tutorial Shorts - YouTube

So I guess we’re doing something wrong. I’m using the web preview function in Firefox 115.0.3 x64 on Win10 x64.
What I also noticed is that the tutorial uses an older version of DeepAR Studio (the UI looks a little different).
So could it be an issue between an old beauty template and a newer DeepAR version?


@jelena please check it

These differences are likely caused by the different SDKs. The Web Preview uses the Web SDK, while the preview used in the tutorial is the native macOS preview, which uses the macOS SDK. We’re always working on improving tracking, but some minor differences between platforms still exist.
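
If you want to check whether the jitter comes from the Web SDK itself rather than from Studio’s Web Preview, a minimal sketch like the one below loads an exported effect with the Web SDK directly in the browser. This assumes the publicly documented `deepar` npm package; the license key, canvas id, and effect path are placeholders you would replace with your own.

```ts
// Minimal sketch: load an exported .deepar effect with the Web SDK directly,
// outside DeepAR Studio's Web Preview, to compare tracking stability.
// Assumes the `deepar` npm package; license key, canvas id, and effect path
// below are placeholders.
import * as deepar from "deepar";

async function runPreview(): Promise<void> {
  // Render target; the page is assumed to contain <canvas id="deepar-canvas">.
  const canvas = document.getElementById("deepar-canvas") as HTMLCanvasElement;

  const deepAR = await deepar.initialize({
    licenseKey: "YOUR_LICENSE_KEY",      // placeholder
    canvas,                              // where the camera feed + effect render
    effect: "./effects/beauty.deepar",   // exported effect file (placeholder path)
  });

  // Optionally swap to another exported effect for comparison.
  // await deepAR.switchEffect("./effects/other-effect.deepar");
}

runPreview().catch(console.error);
```

If the effect looks just as shaky when loaded this way, the difference is most likely down to the Web SDK tracking rather than the template version or the Studio preview.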

@jelena But regarding the videos provided above :arrow_up: (inside the zip file, since uploading videos is not available on the forum):
The face tracking accuracy of the old version used in the Beauty 3000 app (last updated 3 years ago) is better than the latest version of the iOS tester app, even though both run on the same OS (latest iOS update).
The tester app issue also appears on Android and in the web preview on iOS.
But I noticed that the macOS preview from Studio is as stable as Beauty 3000!

Hi @jelena
I want to mention that the accuracy of face tracking on the iPhone has become much better in the effect tester app after the latest SDK update. Thank you and the DeepAR team, and of course I hope for more development and functionality in the SDK.


Great, thank you for the feedback. We are always working to improve; some tracking improvements take longer to develop while also maintaining the speed needed to run in real time. I’m glad you are seeing a difference with this update.

Good day! How can we improve the facial tracking? Is there anything we can do on our side (the user side), any tricks to make face tracking more reliable, or is it all on your side? I did a comparison between DeepAR and TikTok tracking, and I find TikTok's tracking is more advanced. How can we achieve better facial tracking?

Please see the link below for your reference.

On the left is DeepAR and on the right is TikTok. As you can see in the video, I did a simple test of drinking a coffee. The glasses on TikTok are more stable even with the mug in front of the face. I also tested it with a female co-worker playing with her hair, using the Beauty Lite effect that I bought in the DeepAR shop hoping to get better face tracking. But as soon as her hair goes in front of the camera, the makeup instantly tracks the hair, and when the hair falls back the makeup snaps back to the face.

Do you have a time frame for the release of an improved version of DeepAR with regard to facial tracking? Thank you!

Our SDK is optimised for live tracking on a wide array of devices, including some lower-end ones.

We are constantly working on improving tracking, but tracking in edge cases such as when the face is mostly obscured is not our focus. There is no way to change the tracking from the user side.