11/24/2023

Actor who plays Thanos in Endgame

On Infinity War, Digital Domain had used machine learning to up-res the motion capture data on a per-shot basis to ensure the performance included movement at the micro-level. What was extended here was the use of machine learning to simplify other procedures surrounding tracking the face. For this, in particular, Digital Domain relied on a new piece of software called Bulls Eye that uses machine learning techniques to automate the 3D marker tracking from the head-mounted camera. It lowered turn-around times significantly (hours instead of days) and allowed the team to dive even further into the detail on Thanos.

"We wanted to really take what we had achieved on the first one, and just find more nuances to show on Thanos," notes Digital Domain head of animation Jan Philip Cramer. "And, I mean, we thought the last one was really good, but then once you really study the details of the lips, we started just introducing lots more controls for the animators to get absolutely the micro-level of the performance out."

"In the yurt sequence, for example," adds Cramer, "this was a sequence where we were able to really showcase this because it was a bit of a different Thanos. He was just at home, no longer this aggressive being that we had met before, but rather more humbling. And he's in lots of intimate performances, where you get really close on his face, where you could really show the mechanics of how it looks."

As noted, Weta Digital retained its Infinity War Thanos approach of using an 'actor puppet', an intermediary stage of the digital Josh Brolin, to deliver Endgame Thanos shots. What this means is that they would first solve the tracks from the head-mounted camera capture of the actor, and check this was delivering the right performance on their actor puppet, before migrating that motion to their digital Thanos model. Disney Research Medusa scans of Brolin were also used to validate the work.

In terms of extra Thanos development for Endgame, Weta Digital introduced a few new approaches. "Firstly," says Weta Digital visual effects supervisor Matt Aitken, "we had felt as we were doing the work on Infinity War that we could improve our ability to control the corners of Thanos' mouth in a more detailed fashion. We ended up having to patch that for Infinity War, but once the first film was done, we had the time to circle back and fix that."

"Another thing we did on Endgame that we used for the first time," adds Aitken, "was implement some tech called Deep Shapes, a methodology for adding another level of complexity to the facial performance in an analytic way. When the facial performance is going from one expression to the next, the end points of that transition aren't changed at all, but the transition itself gets more of the sense of the tissue in the face. It's not a simulation, it's intermediary shapes."
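The Bulls Eye tool described above is proprietary and its internals are unpublished, but the general idea of using a learned model to automate 3D marker tracking from head-mounted camera footage can be sketched. Everything below is a hypothetical stand-in, not Digital Domain's code: a simple linear regressor is fit on a few hand-tracked frames (2D image positions in, 3D marker positions out), then used to solve new frames automatically instead of by hand.

```python
import numpy as np

# Hypothetical sketch of ML-assisted marker tracking (NOT the real Bulls Eye,
# which is proprietary and far more sophisticated). We fit a linear model
# mapping 2D head-mounted-camera marker positions to 3D marker positions,
# trained on a handful of hand-tracked frames.

rng = np.random.default_rng(0)

n_markers = 10
n_train_frames = 50

# Synthetic "ground truth": a fixed linear 2D-to-3D mapping plus noise,
# standing in for real hand-tracked training data.
true_W = rng.normal(size=(2 * n_markers + 1, 3 * n_markers))

def features(frame_2d):
    # Flatten the (n_markers, 2) image positions and append a bias term.
    return np.concatenate([frame_2d.ravel(), [1.0]])

train_2d = rng.normal(size=(n_train_frames, n_markers, 2))
X = np.stack([features(f) for f in train_2d])   # (frames, 2M + 1)
Y = X @ true_W + 0.01 * rng.normal(size=(n_train_frames, 3 * n_markers))

# "Training": least-squares fit of the 2D-to-3D mapping.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "Automated tracking": predict 3D markers for a new frame in one shot,
# rather than an artist hand-solving it frame by frame.
new_frame = rng.normal(size=(n_markers, 2))
pred_3d = (features(new_frame) @ W).reshape(n_markers, 3)
print(pred_3d.shape)  # (10, 3)
```

The payoff matches the article's point about turnaround: once the model is fit, each new frame is a single matrix multiply instead of a manual solve, which is where the hours-instead-of-days saving would come from.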
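Aitken's description of Deep Shapes — endpoints of a transition unchanged, extra tissue-like detail only during the transition, via intermediary shapes rather than simulation — can be illustrated with a toy blendshape sketch. This is an assumed reading of the quote, not Weta Digital's actual (unpublished) method: a corrective in-between shape is weighted by w*(1-w), so it vanishes at both endpoint expressions and peaks mid-transition.

```python
import numpy as np

# Toy illustration of transition-only corrective shapes (an assumed reading
# of the Deep Shapes idea, not Weta Digital's actual implementation).
n_verts = 4
neutral   = np.zeros((n_verts, 3))
smile     = neutral + np.array([0.0, 1.0, 0.0])    # target expression
inbetween = np.tile([0.2, 0.0, 0.1], (n_verts, 1))  # transition-only detail

def blend(w):
    # Linear blend between the two endpoint expressions...
    base = (1.0 - w) * neutral + w * smile
    # ...plus a corrective term weighted by w*(1-w): zero at w=0 and w=1,
    # maximal mid-transition, so the end poses are not changed at all.
    return base + 4.0 * w * (1.0 - w) * inbetween

print(np.allclose(blend(0.0), neutral), np.allclose(blend(1.0), smile))
# True True: the endpoints are untouched, only the in-betweens differ.
```

At w=0.5 the result is the naive linear blend plus the full corrective shape, which is the "intermediary shapes, not a simulation" behaviour the quote describes.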