The Future of Rendering – RenderCon 2025
By Render Network
Summary
## Key takeaways

- **GPU Rendering's 20-Year Rise**: Jules Urbach and Ariel Emanuel recount starting GPU rendering in 2004 with early demos like a Star Trek ship, facing skepticism but launching OctaneRender on AWS in 2013, now powering major productions as Nvidia's value grew a thousandfold. [01:02], [04:22]
- **Actors Owning Digital Likenesses**: Scanning actors' likenesses lets them save time and money by reusing assets across movies and games, own their image for future use, and control it amid AI's rise, as seen with multiple scans of stars like The Rock. [07:20], [08:20]
- **Render Network Powers Iconic Projects**: The Render Network has rendered shots for the Star Trek: The Motion Picture re-release, the Severance openings, the Sphere's 16K LEDs, and even NASA projects, demonstrating its scalability for high-end spatial and panoramic content. [20:04], [22:30]
- **Blender and Redshift on Render**: Redshift is now fully launched on the Render Network for motion graphics, and Blender Cycles is in beta with two million users, backed by investments from OTOY, Apple, and Nvidia to enhance open-source 3D creation. [22:57], [23:17]
- **Neural Rendering Extracts 3D from AI**: AI video models like Sora contain implicit 3D understanding in latent space, allowing extraction of neural radiance fields from single frames for relightable 3D assets, enabling text/image-to-3D for historical recreations like Apollo 11. [29:26], [31:09]
- **Ethics in AI Likeness Revival**: Using SAG agreements, the Roddenberry Archive digitally rebuilt Star Trek with Leonard Nimoy's likeness on actors like Lawrence Selleck as Spock, emphasizing frameworks for consent and royalties to preserve legacies ethically. [09:27], [40:17]
Topics Covered
- GPUs Unlock Parallel Creativity
- Actors Must Own Digital Likenesses
- Decentralized GPUs Democratize AI Compute
- AI Enables Real-Time 3D World Models
Full Transcript
My god. RenderCon. We did it. I can't believe this is real. It's not rendered. It's real. Wow. It's so amazing to see so many people here, and to culminate the last 20 years, at least for me, in this amazing event. More to come, hopefully. But we're here to talk about the future of rendering, and of many other things. The world today is just moving so fast, and there's a lot to cover today. I'm excited by the speakers we have, the things that we're showing; the venue itself is beautiful. Hopefully you'll get a chance to see everything. But before we talk about the future of rendering, it's important to consider how we got here. And a big part of my journey here, in fact a huge part of it, has been with a close friend, colleague, and mentor for 21 years: Ari Emanuel. Let's bring him in. Ari, are you there?
Hey, everybody. Ari. Ari. So yeah, I hate that you said 21 years. It is 21 years. It was 2004, August of 2004. You showed up at my mom's house, and I think you remember the story better than I do. Yeah, a Saturday night. Yeah. And I was having dinner with Rod. I remember getting a call saying something like, Ari is going to show up at your house. I had no idea who you were. I'd never watched Entourage, nothing. And then the start of a beautiful friendship began. Yeah, 21 years ago. And what a journey it is. Twenty-one years ago, you had computers in one room with the shag carpeting in the valley, stacked all the way up to the ceiling, two banks of them. And you were showing me a Star Trek ship — yep — floating through just a web, like a portal. I don't remember exactly what it was. Yeah, a portal. And at that point you started schooling me about CPUs and GPUs.
Yeah. You know, I sort of onboarded already to the GPU world back in 2004, and I took out a patent — my first patent was on the Enterprise going through one window to another — which was one of many things I was working on 20 years ago. But yeah, that's where it began. And I remember also, early on, I wanted it to work in films and things like that, but I told you I needed two years of hard coding to get GPUs, cloud rendering, all this stuff to work. And then in 2006 I told you, I'm ready to really get OTOY started, and that's when we began bringing in more people and everything. But it's been crazy — I mean, how the movie business, of course, has changed so much, everything that we were doing to try to get all that to change. How do you see the film business right now? Like, what state is it in from your perspective? Well, let's just go back a little bit. Let's go back, because then you started making little shorts for Transformers, and at that point I think I sent them to three directors. I think I sent it to Michael Bay, David Fincher — yeah — and J.J. Abrams, and they were all, I don't know if you're going to show them, but they were all blown away by what you had done. I didn't know if those were on GPUs, or how you — That was all early GPU rendering. Yeah, 2006, 2007. And then right after that, who was the guy that had stitched together all the computers, that we thought was competition, and then it didn't — remember that? Yeah. OnLive. Yeah, 2009, that was a thing. That was a joke.
Well, I'm not going to say it, but — no, it's crazy, the sort of battles that we've been through, and the things we've seen come and go over the decades. Yeah, I mean, poor Perlman. He was friends with Fincher, actually, and he had another — he had a capture system like Light Stage, a little different. But it's funny — yeah, you know, 15 years later people don't even remember it, really. But the thing that's funny is that the GPU was always at the center of it, and I think in 2004, when we first started this, Nvidia was worth 3.7 billion. It's like a thousand times that now. And you know, the world is now GPU — and they started out, I think, doing video game chips. Yeah. You know? Cool.
And it was Fincher, actually, that got us going, because the way I ended up talking to AMD first was through David Fincher. He had contacts at AMD, and I started doing demos, and then Jensen at Nvidia got wind of that and pulled me over to the Nvidia side, and that's how we sort of got going with him. But it began, really — I mean, all the doors you opened at the very beginning really led to all the things that came after, through this weird, circuitous path. You remember, we also talked to — I was in Abu Dhabi, and we talked to the CEO of AMD at the time, and then we met with the one group there and tried to help them redesign their GPU, their kind of processing power. Yeah. It was Dirk Meyer, who was then running AMD. Dirk Meyer. That's right. Yeah. And I was on stage with him, and he got fired a week later, two weeks later, you know. So the original plan with AMD was that we were going to build the GPU cloud. In those days, it was unheard of. People thought I was crazy. GPUs — what, in a server? No. They didn't see it. I mean, he saw it, and then the board — I still didn't really understand everything you were saying. I just went with it and put you in rooms with people. Yeah. And they kept on saying, this is never going to happen. Never going to happen. It's kind of a crazy path to where we're at now. Yeah. No, there was a lot of "that'll never happen," and you put me in the room, and I remember some early Pixar guys saying you cannot do ray tracing on a GPU, it can't be done — and now it's done, right? So many things like that.
But yeah — and of course, now we have AI, and people are so accustomed now to the transformative power of technology and how it's affecting everything. But I still love — for me, all of this started because I wanted to make games and movies with technology that was, like, magical, and the GPU has helped with that. I mean, if you think about taking an old system on a CPU and putting it in parallel on a GPU, you unlock all these problems, including AI, including rendering. And we've been doing that. And you remember, I did a chatbot in 2008. It was very similar to ChatGPT. It didn't use any of the newer stuff, but the idea was that it could be on a GPU. And I had to pick between two paths: do I focus on rendering or AI? And I was like, I'm going to do rendering, because it's just easier and the world's probably not ready — you know, I don't have enough data to feed anything in — but that was an early decision. So I feel like we did — I remember that. Yeah. I mean, one of the other things I remember is we had a lot of meetings with studios. Yeah.
And we talked to them: you could get captured once — if you use the Light Stage, you could capture all these assets. Yeah. And then you could continually use them. So, one, it would bring down costs. Two, you could then use it for the movie and simultaneously build the game around it. Yep. And they could come out almost simultaneously, as opposed to the structure now that takes so much time, and they would control those assets into the second movie, third movie, if the movie worked. So I think people are getting there now. I think with AI and with your processing right now, people are understanding what the potential is — especially, I don't know if you're going to show the Star Trek stuff or any of the other stuff we've done — I think people realize that these assets can last over time. You don't have to recreate them, and the cost structure can come down, which is actually crucial for Hollywood. Yeah. No, it is. And you know, it's funny — the Light Stage has done every Marvel and DC movie. We scanned in The Rock five times, and he's your client, right? And by the fifth time he's telling us — also Robert Downey, yeah — I mean, everyone, multiple times. So before COVID, I remember we were talking about, why don't we just have the actors own their likenesses, because that'll matter even more with AI and all these things. And we started on that process, and we were talking to everyone from Tyler Perry to Kim Kardashian. But I do think that now, this is the year where AI just explodes — it's everywhere — and I think people are really sensitive to that, and all this groundwork that we've been laying I think is going to play out in a pretty relevant way. So everybody also realizes they don't know how to control their destiny, but they know that they have to. Yeah. And especially for actors — we're not there yet with writers, but I think we'll get there; I think it's less about using Render — but for actors for sure, and probably eventually directors, owning their image and being able to control it, and their estates controlling their image into the future, I think is crucial. Yeah, it is.
And we're — you know, the Star Trek project that we're showing — I mean, we have one actor who passed away, which is Leonard Nimoy, and we have his son Adam here, who's going to be talking about his father's legacy. But it is interesting, the Star Trek project we just did. We had the SAG agreements done, finally, and we brought that to Shatner and to the Nimoy estate, and we did it, and it was one of the first tests of how digital replicas can work in this new world, and it went really well. And I think you opened one door, which is pretty crazy — you showed it to David Ellison, right? And we did, and I think the response there was really positive. But also, talking to him, it was interesting. Clearly he's so tuned in to all the different things that are happening, and he honed in on the fact that what you're doing with characters and faces is something you can do in real time — I've never seen that. But obviously, just given the transformative things that are happening, I think even studio heads are really aware now that you have to think about all this new technology. So it just seems to be playing out in these different ways across the board. I think it's also the case — because last week I started talking to, I'm going to mispronounce his name, maybe Sheree, from the White House, who's dealing with how to handle copyright, trademark, et cetera. And the White House is thinking about this because entertainment is such a big export for the country. Well, you used to represent the president, right? Yeah. Yeah — how they balance between the technology companies that are doing AI and Hollywood, and how to factor in that line between everybody, and the process and the royalty structure. I think it's hard, but they're starting to think about it. They're talking to a bunch of lawyers about it, and I'm going to help them get in front of the guilds to think about it, so that there's a system by which everybody can benefit from it. But for sure, into the future, actors have got to go through your system so that they can control their destiny.
Yeah. I mean, that's what Render is all about. OTOY and Octane were where we began, but then Render was: let's use all these GPUs even if it's not Octane, our renderer — OTOY's renderer — let's bring it all together. And one thing the DeepSeek stuff showed is that you can run even the stuff that OpenAI was doing on much lower-end GPUs. The whole premise of a lot of our talks today is that, yeah, we can run on these lower-end GPUs — made mostly by Nvidia, but also Apple — and distribute all this compute power, and maybe the whole, you know, Sam thing of trying to build a $500 billion data center doesn't have to happen three or four years down the line as these GPUs get better. So that's definitely our plan. And on that note, I think I want to get into the slides for everyone. I love you, thank you so much for being part of this amazing day. I mean, we're opening with you. So I'm so happy it's only 21 years, not 25, so I can actually think that I have a couple more years left in me. Wow. No, I think we've got decades left in this adventure. I love you, Ari. Thank you so much for joining us. Thanks, Ari.
Yeah — this journey really began with Ari, and it has been 20 years. So I want to take the next — I've got 43 minutes, which hopefully is enough to cover my tech talk. This journey I'm going to take you through will be a mixture of some tech, some hardcore stuff, and also just themes, some of which we touched on with Ari and will be talking about with the other speakers and panelists, many of you. I think, frankly, the world is just familiar now with the transformative things you can do with generative AI and how a computer can create images. It wasn't always so obvious that this would be something the masses could touch, you know, 20 years ago, or even 10 or 5 years ago. But the way we look at things, at least from my perspective, has been: we want to create tools that allow people to create at all levels. If you want to create a high-end film, great. If you're a motion graphics artist — which has been our bread and butter at OTOY, where we built Octane and provided Cinema 4D plugins and things like that — great. But no matter what you're doing, we want the tools to be easy to use, and we want them to be photoreal, right? That's been my goal from the start: to power these experiences holographically.
That 20-year journey — putting GPUs in the cloud began really on stage with — let's see, it's skipping through these slides, but let me go back to this. Twelve years ago I was on stage with Jensen at the GTC keynote. He had seen what we were doing and had tried to do with Dirk Meyer at AMD in terms of putting their AMD Radeon cards in the cloud, and he said, okay Jules, come on stage, I want you to launch your product — which was OctaneRender in the cloud, but really it was GPUs in the cloud, which had never really been taken seriously before. We had just gotten a deal with Amazon to put Nvidia GPUs in the cloud — I had helped design those systems, the G2 instance on AWS — and we were going to launch Octane as a service on it, and Jensen brought me out. And we showed — Ari was asking about the Transformers clips — so this was 2013, and we had done the Transformers piece about six years before, and we ran it on 100 GPUs. Now it can be run on one, even an iPhone. But that was the genesis of us officially launching cloud rendering. A year later we launched it as OctaneRender Cloud, and today it's the Render Network. The other part that's important to consider is that the relationship with Nvidia has continued and persisted. We're still working with them, right, on their GPUs. They just put out a new RTX Pro GPU with 96 gigs of RAM — that's crazy. And to the point earlier, you're going to see GPUs come out that are able to do a lot of the stuff that's been done in a data center. It is also important to note the rise of Nvidia.
This slide really represents Jensen's rise, right? I was saying that Nvidia has become a thousand times more valuable over the course of the last 20 years, and that's because AI has driven that — AI and the cloud in particular. And I think that has fundamentally driven a lot of angst. So a lot of our discussions today will talk about: how does AI affect us? What's fear-based that's real? What isn't? Some of these headlines are just from a week ago, really. You have luminaries, many of whom I think have good takes on it. You have James Cameron saying that, effectively, policing AI is really about the output, not the input — not what it's trained on. Because the truth is that genie is already out of the bottle. There are companies in China and Singapore where there are no laws about using any IP to train an AI model. Those things get put online; anybody can create content with them. And my premise is that those tools are just new tools. They're not necessarily going to replace the highest-end art that you can do. And from the perspective of a studio head, right, which we were discussing with Ari, I think the point is it's still a human-centric world — humans creating art for humans is what matters — and I agree with that 100%. Now, some people think that maybe that's not going to be the answer either, and that anything done with this can be useful even if it's not really thoughtful, or is just focused on memes, maybe. But one thing that was really clear: of all the things GPUs were being pushed for with OpenAI's work, it wasn't until they started doing image generation — and that became massively popular, right, with this meme factory that came out a week ago — that you started to get Sam saying, my GPUs are melting, I need more GPUs. It does point to the fact that it's not just artificial intelligence, ChatGPT-style, that's going to drive GPU demand; it's image generation, and that's why Render was started: to provide GPUs — distributed, decentralized, DePIN — that can handle all this demand and not have it be centralized, not have it be done just through Amazon or Google or Microsoft or OpenAI. And I think the future for decentralized — and this is a huge part of why Render exists — is very likely, and trending our way.
The first thing is, you do have GPUs from Apple, for example, that have 512 gigs of RAM. You could stick two of those together and you've got a terabyte that can run the full DeepSeek model, which is as good as anything OpenAI is doing. You have Nvidia putting out cards that have 96 gigs; you can chain them together and get 192, and that starts to give you enough to run a lot of these newer models.
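To make that memory math concrete, here is a rough back-of-the-envelope sketch. It only counts model weights — no KV cache, activations, or runtime overhead — and it assumes the publicly reported ~671B parameter count for DeepSeek-R1, with a generic 70B dense model included for comparison.

```python
# Back-of-the-envelope estimate of the memory needed to hold model weights
# only (no KV cache, activations, or runtime overhead). Parameter counts are
# publicly reported figures used here purely for illustration.
def weight_gb(params_billion: float, bits_per_param: int) -> float:
    """Return approximate weight size in decimal gigabytes."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

configs = [
    ("DeepSeek-R1 (~671B params) @ FP8", 671, 8),
    ("DeepSeek-R1 (~671B params) @ 4-bit", 671, 4),
    ("Generic 70B dense model @ FP16", 70, 16),
]

# Memory pools mentioned in the talk: two 512 GB Apple machines chained,
# or one/two 96 GB RTX Pro cards.
pools = {"2x Apple 512 GB": 1024, "2x RTX Pro 96 GB": 192, "1x RTX Pro 96 GB": 96}

for name, params, bits in configs:
    need = weight_gb(params, bits)
    fits = [label for label, cap in pools.items() if cap >= need]
    print(f"{name}: ~{need:.0f} GB of weights; fits: {', '.join(fits) or 'none listed'}")
```

The takeaway is simply that a couple of high-memory consumer or prosumer devices can, at reduced precision, hold weights that until recently implied a data-center deployment.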
The other thing that's changing the world is where a lot of the compute power goes. Render is there to take on as much work as possible — the more work that goes on the system, the better the Render economy does. And things are trending very clearly towards not spending all this money and time pre-training a model, but doing the work while the model's actually running, at inference time. All the stuff with reasoning that you're seeing with o1 and DeepSeek-R1 is using the GPUs you're talking to, right, rather than having it all pre-trained. So that means the hardest thing to do with decentralized training is something that we can move to the endpoint. And then there are even newer techniques. This one — I don't know if you guys saw this Tom and Jerry one-minute AI generator. This is crazy because it's doing the shots, right? It's planning out the one-minute video and it's picking the shots. And the way this works — the breakthrough here — is that it's not pre-trained on it so much as it's training itself while it's running: doing test-time training while you're running the job. Basically, it's learning while it's thinking, versus doing all of this in a pre-step. That's a lot more like how we think as humans, and that's leading to these interesting breakthroughs. But all of that, again, is something that can run more on a single GPU node versus having it be locked into some sort of massive pre-training step.
And then you have something called AI agents. I don't know if people who are not in the AI world are tracking that, but basically you have an AI that connects to these services and does things. And for 3D content creation, it can talk to Blender, right? There's an MCP — the Model Context Protocol, the new API for AI — that can run Blender 3D jobs. And this is great, because one thing AI could be useful for, instead of just generating a single image, is: can it help assemble a scene, can it be given certain tasks that go right into the 3D tools that we as artists use? And for sure, that's already somewhat possible with these tools today.
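As a concrete illustration of the kind of scriptable task such an agent ends up driving, here is a minimal sketch using Blender's own Python API (bpy). It assumes it runs inside Blender's bundled interpreter (for example via `blender --background --python assemble.py`); it is not the Blender MCP server itself, just the sort of scene-assembly call an agent would ultimately issue through one.

```python
# Minimal Blender scene assembly via bpy (runs inside Blender's Python).
import bpy

# Start from an empty scene.
bpy.ops.wm.read_factory_settings(use_empty=True)

# Assemble a trivial scene: a cube, a sun light, and a camera.
bpy.ops.mesh.primitive_cube_add(location=(0, 0, 0))
bpy.ops.object.light_add(type='SUN', location=(4, -4, 6))
bpy.ops.object.camera_add(location=(6, -6, 4), rotation=(1.1, 0, 0.78))
bpy.context.scene.camera = bpy.context.object

# Render with Cycles and write a still frame to disk.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.render.filepath = '/tmp/agent_scene.png'  # hypothetical output path
bpy.ops.render.render(write_still=True)
```

An agent layered on top of an MCP server would generate or parameterize calls like these from a natural-language task description, then hand the resulting scene off for a proper render.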
We're seeing this trend accelerate, and I think also, in order for that to work, you don't want a single polarity of compute; you want multiple GPU services out there that can interact, and that trends towards decentralization. And that's really where Render comes in. We built this system and launched it eight years ago to handle the workload for rendering, to leverage all the millions of GPUs that Nvidia and others have been putting out as consumer GPUs. And I want to showcase some of the things that have been done on Render, and also a little bit with Octane, for those that aren't that familiar with it. It is used for film, CG, and movies, of course. One of my favorite things that has been done on the Render Network was the recreation of certain shots for the 2022 re-release of Star Trek: The Motion Picture — my favorite film. This was done in Octane; we didn't even know about it until there was a support ticket, and it was rendered on the Render Network. So: final-frame film work for studios — you can see the timestamp, 2021. We have also been doing, as Ari mentioned, the Star Trek project. It's been featured by Apple in a couple of the keynotes. We can even take these assets and render them on iPads now, and then send those to the cloud. And another huge driver of what we've been seeing with Render is doing these high-end renders for things like the Apple Vision Pro, also related to the work we're doing with the archive for Star Trek — and those renders are just massive. I mean, to render something that's panoramic, that's spatial, pushes things even beyond 16K sometimes.
Also, I want to note this — this is beautiful. This is done in Octane: the opening of Severance, for season one and, we just found out, season two.
[Music] [Applause]
Yeah, it's beautiful. It's always a great pleasure to find out that your software has been used on something you love, and I do love Severance — I think it's a great show. Also the Sphere, right? That's a 16K-by-16K giant set of LEDs. In fact, MSG was an investor in OTOY because, before they had the Sphere, they had similar large-format venues, and this is the kind of stuff that Render is absolutely awesome for. We've also had NASA use Render, which is something I'm very proud of.
And just looking at some of the recent technical achievements: I mentioned earlier it's not just our Octane, OTOY's renderer — it's also Redshift, which is now fully launched on the Render Network, and there's a huge chunk of the motion graphics industry that uses it. So we're very proud to work with Maxon on supporting that. And we have also recently put out a beta for Blender. Blender is really interesting because it's got two million users — I think there's a huge market. So we're pretty excited that we've launched Blender Cycles on the Render Network. In fact, I want to take a minute to talk about our relationship with the foundation. We've been backing the Blender Foundation for two years. That's Ton, the creator of Blender. I think the future of a lot of what we're going to be talking about connects to Blender, because it is open source. It's pretty straightforward, for example, to build AI connections to it. You can modify it. You can run it in all these different environments. So for us — even though I think there are many more steps — investing in Blender, making it work better with the Render Network, even with Octane, possibly other renderers, has been really important. And we're going to continue to invest in the Blender Foundation. Later this year, maybe towards SIGGRAPH, we'll have some bigger announcements along those lines. But our goal is to support Blender for as many things as we can. We have 26 different 3D tools we run on, including Unreal and others, but Blender has a really important place because it's so accessible and there are millions of these users. And it is crazy the way the foundation runs — the way that Ton set it up, it is truly like Wikipedia, but for 3D rendering and 3D content creation. And we're using it all over the place. We've mixed Blender seamlessly with other things like Cinema 4D and Autodesk 3ds Max, and beautiful work is being done in it. It didn't used to be the case — Blender wasn't where it is now. But over the last few years, with investment from us, Apple, Nvidia, Unreal, and others, it's become a much more powerful tool.
Now I want to think about where we take rendering further. What does the future look like? If you've noticed, I've been trying to predict these things for a while, and I had predictions from 10 years back. If I look at my GTC talk from 2015, for example, there were certain trend lines — a lot of things where I was thinking about what's going to matter in the 2020s. And how we spent the 2010s pretty much went according to plan, starting with Octane changing the way GPU rendering worked, from images to full productions, which of course we're well past. And I was trying to think about the things that would come by 2025: immersive — we have the Apple Vision Pro; real time — of course, we have Unreal driving virtual production; photorealism, I think, has largely been achieved. And I would always start my talks with quotes from Tim Sweeney or John Carmack. Actually, Epic invested in OTOY, and Tim Sweeney's been a great supporter and proponent of our work — same thing with Carmack. But you look at that last piece, right, from 10 years ago: yeah, it was going to be important, and not just for things like generating images, but in terms of real time, and for games, and everything that matters to the world of how we do CGI and video gaming. And now that's come home, full circle, 10 years later. I'm still talking to, and about, Tim Sweeney and Carmack. And just recently Microsoft put out this crazy demo where you're running Quake, but it's completely generated — the video game, not just the image. You're playing it live in a browser.
And there are a lot of people who just absolutely reject anything that AI is connected to. So there was a fan of Quake saying, "This is absolutely disgusting." And Carmack replied saying, no, it's not — I created the game, I think this is amazing, it's a step forward; these are tools that we can all use to build something even more interesting. And Tim Sweeney basically said the same thing. And this is a week ago. So this is right now, this moment in time. I think this is so representative of what's going on in the world, where you've got camps that are against AI because of how it's been trained or what it's worked on, but you're going to see things where the creators of these works — where there's such outrage — are absolutely in favor of this happening. And if they understand, like Carmack and Tim Sweeney do, that there are better tools for all of us on the other side of this, then that's the side I'm leaning towards. I mean, there's still a ton of bad, horrible things that can be done with AI, especially against artists and creatives. But there's a huge piece of technology here that feels a lot to me like CG and 3D and real time did back in the day, when that was taking on a bigger place in content creation pipelines.
One of the comments on this article, though, which is really interesting, was: well, you're going to need this kind of technology. You're going to need an AI that can basically generate a 3D world on the fly, because that's how the Star Trek holodeck worked. And we all want that. And that's been a big theme of what I've been talking about from the beginning. The Star Trek holodeck is truly — I mean, this is 1987, this is "Encounter at Farpoint" on the left; on the right is 2005, the last episode of Enterprise. They're almost bookends to the Star Trek story. And the way the holodeck works, you go into a room, you get anything you want — it's rendered, right? So for me, it's been a dream to actually see that thing happen. Not in 300 years, but today. And I'm not the only one who thinks about this. In fact, you have Jensen talking about the Nvidia Holodeck, and you have everyone from the New York Times and others. It is sort of this representative goal of how we want to see all this play out. Because when we watch Star Trek — and I love Star Trek, it's a huge part of my life — I think of a better world. I think of a future where all this stuff's been figured out, where AI is only there to help, with a better human experience. And it's also something that, experientially, we want to try to achieve. So getting to these world models is really important. Even Sam Altman — I mean, just all these different ends of the spectrum of people approaching AI — quotes the holodeck, right? Straight up says this is what we want to do. I don't even know how much of a Star Trek fan he is, but it is in the mindset of the world out there that this is something we want to try to achieve: technically, emotionally, philosophically.
philosophically. Um, and I think a year ago, right, so it's it's really interesting to check the year-by-year progress. Sora was announced and this
progress. Sora was announced and this was the really the crux of my talk a year ago where I made these slides and I was showing that for the first time people can't tell the difference between generated AI video and real video. And
this one in particular was something that that caught people's attention.
It's crazy now that a year later like sore is that out of 20 video transmission models is the probably the least interesting one but at the time that I made these slides a year back it was the only one and it wasn't even out but one thing that I mentioned back then
which was super interesting was if you look at what these video generation models are doing you're seeing that there's 3D consistency and even more so I was like well wait a second let me grab this sort of thing and let me see
if I can extract a 3D model out of this because I think the AI has a model in latent space and I did that's me taking the you know, basically generating a nerf, you know, a neural radiance field
from just one frame or a few frames of it. And you can absolutely see that the
it. And you can absolutely see that the AI does have a 3D model. And this points towards the future where yeah, you can have AI render a scene. Is it traced or or is it is it handled with neurons? You
know, that's a big question. That's a
technical question. I think it's going to be a mixture of both those things.
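For anyone unfamiliar with what "a NeRF" means concretely, here is a toy sketch of the underlying idea: a small MLP maps a 3D point (with a positional encoding) to color and density, and a pixel is rendered by alpha-compositing samples along its camera ray. This illustrates the general technique only — it is not OTOY's extraction pipeline, and fitting it to real frames would also require camera poses and a photometric loss against those frames.

```python
# Toy neural radiance field: encode a point, query an MLP, composite along a ray.
import torch
import torch.nn as nn

def positional_encoding(x: torch.Tensor, n_freqs: int = 6) -> torch.Tensor:
    # (N, 3) -> (N, 3 + 3 * 2 * n_freqs)
    freqs = 2.0 ** torch.arange(n_freqs, dtype=x.dtype)
    angles = x[..., None] * freqs                      # (N, 3, n_freqs)
    enc = torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)
    return torch.cat([x, enc.flatten(start_dim=-2)], dim=-1)

class TinyNeRF(nn.Module):
    def __init__(self, in_dim: int = 3 + 3 * 2 * 6, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),                      # r, g, b, sigma
        )

    def forward(self, pts):
        out = self.mlp(positional_encoding(pts))
        return torch.sigmoid(out[..., :3]), torch.relu(out[..., 3:])

def render_ray(model, origin, direction, near=0.0, far=4.0, n_samples=64):
    # Sample points along the ray, then volume-render them front to back.
    t = torch.linspace(near, far, n_samples)
    pts = origin + t[:, None] * direction              # (n_samples, 3)
    rgb, sigma = model(pts)
    delta = (far - near) / n_samples
    alpha = 1.0 - torch.exp(-sigma.squeeze(-1) * delta)
    trans = torch.cumprod(torch.cat([torch.ones(1), 1.0 - alpha + 1e-10], dim=0)[:-1], dim=0)
    weights = alpha * trans                            # contribution of each sample
    return (weights[:, None] * rgb).sum(dim=0)         # final pixel color

# Example query: one ray through an untrained field (random colors, but it runs).
print(render_ray(TinyNeRF(), torch.zeros(3), torch.tensor([0.0, 0.0, 1.0])))
```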
And that gets back to this big trend line towards what everybody says they want to do with their generative models: world models. In other words, everything is generated in real time, on demand. And that does feel a lot like a video game — and ultimately, as a video game developer, as a content creator who uses CG, this is something that feels important. The paper Microsoft put out in conjunction with showing Quake running in real time was talking about their early steps, but I think it's important to talk a little bit about the core pieces of technology for how this would work for artists creating 3D. So instead of having a 3D model as we traditionally think of it, you train an AI on what the model is — maybe different viewpoints, maybe you send it the mesh. In our case, when we were doing early tests, we would just give it a hundred images. It would then create not a Gaussian splat but an actual latent-space model — in memory, in the neural network, in this tissue of connected things. It knows what the Enterprise looks like, and you can ask it for a view and it can render it. If it's trained on just that, it looks a lot like a 3D model. And that's the kind of stuff that is much more interesting for all of us as content creators than just hitting a prompt and getting back an image or a video. So this is a huge part of the future of rendering.
One of the coolest parts about how this works is that you can take a photo — any photo from history, like the Apollo 11 moon landing — and today you can go to ChatGPT, which just put out this image generator two weeks ago, right? What's more interesting to me is that you can test how good that image generator is at understanding what a 3D model of the photo you're sending would be. It's really good, actually. You can give it this photo and it'll give you a wireframe of it. It has a basic understanding of the scene, which is what I was saying a year ago, right? There's a 3D model somewhere in latent space. If only you could extract that and use it as a 3D model in a game, or in a 3D pipeline for content creation, you'd have something amazing. And that's what OTOY is working on, and that's what we want to build for the Render Network as a tool. The problem, though, is that AI is a little bit messy. Even the ChatGPT thing, right? It starts to lose track of what the original image is. It drifts. It adds a finger here and there.
It's not great, but there is progress being made. So this is something that's a huge technical breakthrough, I think: we can take an AI generation — this was just generated from Higgsfield, a new service that I use, which can generate an image — and I can bring that into Octane, our software, on the bottom, and run it in a traditional 3D scene. This is Octane — it's a little grainy because it's running real ray tracing — and I can move it around and treat it like a 3D object, with the reflections and all that, and that's a huge breakthrough. If you can generate something — not just a simple 3D mesh that doesn't look like the original, but anything the AI can do — and it can come into the scene as a 3D object, that's a big deal, and that makes creating things with, let's say, precision and control much more interesting. So if we wanted to recreate a scene in history, for example, from the '40s and World War II, you can give it the mesh of the boat. You can basically mix and match different pieces together, but ultimately the idea is that it could be rendered like your traditional CG pipeline. And this is a much more useful piece of kit than anything I've seen from the pure generative models. I mean, it's great for image generation when you can type in text and get something back. But even talking to the guys who created Flux, the image generator — and there you go, you can do a lot of crazy things with a mixture of neural rendering and traditional 3D — I think even the people creating these AI models don't want text-to-image or text-to-video to be the way forward. I think that's maybe okay for concept art.
So I think we should look at some of the things that are, and aren't, part of this future pipeline. For example, text to image to a 3D mesh. It's possible. Here's an example that works, successfully, from two photos — you can see here this is running on Krea, which does some really good work, using a model called Tripo. It's able to extract a 3D model, it takes 32 seconds, and I can then bring that model into Octane. Is it great? No, but it's not bad. So if you're just doing blocking or concept art and you want to give it an image and get back a 3D model, this works maybe 25% of the time, and it's okay — but it's interesting: this is so much better than it was like three weeks ago. That's how fast things are moving in this space; three weeks from now I'll probably have a whole different set of slides if I were to give this talk again.
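As a small illustration of what "bring that model into Octane" usually involves on the artist side, here is a hedged sketch using the open-source trimesh library: once an image-to-3D service hands back a mesh file, the work is mostly sanity checks and format conversion before a DCC tool or an Octane host plugin imports it. The file names are hypothetical, and this is not the Krea or Tripo API.

```python
import trimesh

# Load whatever the image-to-3D service returned (file name is hypothetical).
mesh = trimesh.load("tripo_output.glb", force="mesh")

# Basic sanity checks before handing the asset to artists.
print(f"{len(mesh.vertices)} vertices, {len(mesh.faces)} faces")
print(f"watertight: {mesh.is_watertight}, bounding-box extents: {mesh.extents}")

# Recompute consistent normals, then export to a format the traditional
# pipeline (Cinema 4D, Blender, Octane via its host plugins) imports directly.
mesh.fix_normals()
mesh.export("tripo_output.obj")
```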
And there's a lot of value, maybe, in having a single part generated for a larger 3D scene you're building. This is an example of somebody on the team testing building a single piece that goes into a larger part of a 3D scene you're building traditionally. That may be more useful. There's also work being done with photogrammetry, which is now best exemplified by Gaussian splats, and those are really good — they're much closer to the neural rendering pipeline that we're imagining, and that we're about to show more work on — but you can't really relight these things. That's one limitation. And I would say the text-to-image stuff that we're using and seeing online in all these different services — where does that fit in? The thing is, that's neural rendering, sure, but it's 2D, and it's hard to control because it is 2D, right? But here's me testing: we had a shot in our recent Star Trek piece that didn't work on set, and we didn't even end up really using it in the piece, so I ran it through every single video model — this was like last week — and they all sucked. None of these things was usable for production at all. In 2024 we worked with the Luma guys, Runway, many others, and as of a week ago — I mean, every single model: Sora, Google Veo, all of them — just not usable. It was only two days ago that I was finally able to get one, off of Higgsfield, that actually worked pretty well. So these things are moving forward. But what I really want is: just give you the 3D scene and let you mix and match it like you would as a 3D artist, so that the artist is in control. And that's basically what we're working on. And there are a lot of parts of the production pipeline, even outside of CG, where the technology OTOY has been working on, and that we want to also bring into Render, makes sense.
On the Render Network, effectively, we're adding all of these services. So if you want to use one of these generative models, even in Octane, and launch something and bring back the data, you can do that — we've added those. But I think we're going to be building on that and shipping many more interesting pieces in the coming months. Here's an example of something that goes right back to the practical world of prosthetics and makeup. If you've seen anything we're doing on the Star Trek pieces, I call the face replacement "digital prosthetics," because it truly is something that comes from the world of makeup and the physical. So I had the team test this.
I've been showing different slides of makeup transfer. I love to show The Penguin: Colin Farrell gets transformed into the Penguin. It's an actual physical prosthetic — he spends four hours in the makeup chair. And one of the things we wanted to ask was: is this necessary? Can we take something like this and simplify it so it doesn't take four hours? So this is just an internal test we were doing to see if we could even build that model. We just basically did photogrammetry on pictures of the sculpt — there it is, back in Octane — so we have the 3D model of it. And I had the team give me a digital prosthetic that is mapped — no dots on my face — so that I could just try it out and see how it worked. And this is in real time, by the way. This is running on a single 4090, and I am now the Penguin. And it's a makeup thing: it's not giving me the Penguin's face, it's putting the actual physical prosthetic, that 3D model, on me and matching it in real time — even giving me the teeth, right, which we also had the team build. And it's pretty crazy. That's all in real time.
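To give a sense of what markerless ("no dots") face tracking looks like at its simplest, here is a hedged sketch using the off-the-shelf MediaPipe Face Mesh model: it pulls per-frame 3D landmarks from a webcam, which is the kind of signal a renderer could use to pin a prosthetic mesh to a performer. It is emphatically not OTOY's tracker, and the real pipeline drives Octane rather than an OpenCV preview window.

```python
# Markerless per-frame 3D face landmarks from a webcam using MediaPipe.
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        h, w = frame.shape[:2]
        # 468+ landmarks, each with normalized x, y and relative depth z.
        for lm in results.multi_face_landmarks[0].landmark:
            cv2.circle(frame, (int(lm.x * w), int(lm.y * h)), 1, (0, 255, 0), -1)
        # In a prosthetic pipeline, these points would instead drive the pose
        # and deformation of the scanned prosthetic mesh inside the renderer.
    cv2.imshow("landmarks", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```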
And one of the things we'll be talking about today is that, as an actor or performer, you can just seamlessly have things like that happen and be creative. For the work we're going to be showing with Star Trek: going back a few years, we have actor Lawrence Selleck, who's been playing Spock. We started with nothing digital — we just put him in a physical prosthetic, and again, it took hours; it was complicated and a lot of work. But we started there, and then we slowly had the pieces where we could replace those prosthetics with even better pieces that were digital. This was three years ago — we were doing tests where, again, this is live, running on a GPU, so Lawrence could see himself as Spock on the monitor. And this was with the digital prosthetics, without the physical ones — I think one of the first tests — and it looked really good. Really good. So from there we've also been looking at: can we take scenes and scan them in and bring them into Octane as really high-quality 3D assets, and relight them? We could sort of do that for static stuff that was scanned in the Light Stage. And now what we can do, for example — this is a 6K render. What you guys saw with me in the Penguin mask was real time, but this is a full head replacement, and it's beautiful. It really does hold up well, and that, I think, is something that's pretty unique with this tech: you can use it for full production and it looks absolutely amazing. The other part is, this is not a deepfake, right — it's a 3D model. So I can take the actual thing that I'm seeing, and I can even include it with the scan of the set and the actors I'm doing it with, and I can move this thing as a 3D object in real time. That feels like a traditional CG pipeline, or even a video game.
And by the way, this is real time. In the last piece we did, Spock was old — he was, basically, 80-year-old Leonard. And we didn't have any really great reference for that, so we built a 3D model. We scanned in his life mask, and that was the basis for how we did the Spock prosthetic. But you can see this is a full 3D model that I can look around, and I can bring it into Octane — this is the latest nightly build of Octane, in real time — and it's incredible. So this is not just for final-frame rendering; this is something that could be used in the future for games. This is running on a single laptop, and it's incredibly powerful. We can also relight. Remember, as I said, Gaussian splats are really great, but you can't relight them easily. This can do that. And so this is a great example of where the tech is going — something that effectively empowers artists who have existing tools and existing workflows with something I think is pretty magical.
And of course, a lot of this touches on the ethics of what we're doing with this kind of technology, where we can take Leonard Nimoy's likeness and put it on Lawrence Selleck and do these pieces of content and video games. And I think there are a lot of philosophical discussions, some of which we're going to have in our talks today, about what this means. I mean, you have people who are looking to start relationships with AI and have even, you know, committed to that lifestyle. There are so many crazy things that this affects. But I go back again to the world that Gene Roddenberry created and imagined, which was beautiful. There's a moment in my favorite movie, Star Trek: The Motion Picture, where there's this decision where Decker, one of the characters, decides that what the AI is missing is love and human connection, and merges with it and goes into the singularity. And it's this beautiful moment that I love in the film. It's deep and profound, and it's the unknown. And it does, to me, almost represent an ideal introspection on what it means to be human — what all of this can possibly mean for us in a positive, optimistic way. And by the way, Kirk doesn't join. He just keeps going and living his life.
And in Gene Roddenberry's book about the movie — the one and only book he wrote on Star Trek, right — Kirk is there almost to show people: you have to go out there in the world. You have to explore. You have to bring in new data. And on Earth, in the Star Trek novel that Gene wrote, there are people called the new humans. They live in VR. And there's a cycle where they don't have any experiences that really add anything — there's a kind of stillness to their lives. So, lots of great philosophy in Star Trek, a lot of it encompassed not just in the motion picture but in the novel that Gene wrote, as Ari mentioned. And here's someone who will be speaking on the next panel with me: standards are important, and working with SAG and all these other organizations is really critical, to create the frameworks where there are clear lines between what you can and can't do with AI, especially with people's likenesses. But we're just at the beginning of a pretty crazy journey, I think, into that world.
I've talked a lot, of course, about the impact that Star Trek had. I want to actually talk about the project that we're doing, which really started with my best friend Rod Roddenberry — sitting in the front row here — 20 years ago: to document, in any way we can, the life and work of Gene Roddenberry. Rod's been doing this for years with documents and photos and things like that. And even 20 years ago, I remember — I think the Enterprise that Ari saw was actually something that Jeff Hullman, an artist who's been working with me for 20-something years, had created for Paramount. We'd been playing with these pieces for a while, but about three years ago we decided to go full tilt and have OTOY work on just rebuilding everything digitally — everything from the world of Star Trek, right? So a lot of the things you're seeing about this project came from that intention. And so, yes, we have these incredible documents — Gene Roddenberry writing letters to Steve Jobs, photos of the cast and crew during the filming of Star Trek. So the archive itself was created — Rod invested in OTOY 15 years ago, and I consider that an endowment, and that did help start this project, where we were starting to build 3D models. We focused a lot on the Enterprise because it is, in some ways, a character in the show — and not just how it appeared in the show, but also in production: this is the set where it was filmed, and this is a 3D model of it, so you can explore that. And one of the goals we had from the beginning is: let's build the actual motion picture Enterprise. Why? Because we have deck plans for it — that's the only ship we have every single piece for — and it's taking us years, but we eventually want to have a full, complete digital double of that ship. Life-size, right?
that's a passion project and we're doing it. I mean, you know, we have uh almost
it. I mean, you know, we have uh almost all the the interiors that have been you've seen on screen built. That was
actually three years ago. Um we're
building the entire universe around it.
So, it is in some ways a virtual world simulator that's in the universe of Star Trek. Uh and you can see these beautiful
Trek. Uh and you can see these beautiful cutaways of the ship. Uh it's it sort of drives for a Star Trek fan, I think, a lot of, you know, strong feelings of imagination and and just seeing this world come to life has been just a
pleasure. both I think to you know to
pleasure. both I think to you know to have the experience from a as a fans perspective but also creating it. Uh
This is the interactive portion, where you can go onto the set and see and experience it. But there's a lot more than just that original Enterprise. There are all the designs that went into it; when we talk about the meta version of it, we're talking about the history of Star Trek, which has 14 Enterprises, right? So if you look at some of the earlier work, you can even do time lapses through the world of Star Trek. And
that's the thing: I think the future of content is multimodal. Of course you want to see movies and have the story told; that's why these things matter. But if you love what you're seeing, like many Star Trek fans do, then being able to explore that world with almost the same experience, which is what the archive represents, is beautiful. And, you know, to Ari's point earlier about the goal of reusing what you film and create for movies in video games, we effectively have that working today, and we're using it for this project. These sets you're seeing, we can run on a volume, on an LED wall, which we have in the venue, and you can try it yourself, and we can actually film things on there.
When we launched the archive, people loved it. We had something like four million visitors in the first week, and, you know, the Smithsonian, everyone was delighted by it. We eventually worked our way toward building a VR version of this thing; we have an official license from Paramount for this immersive experience, and we launched that when Apple launched the Vision Pro in February of last year. If you're able to try the Vision Pro we have set up there, you can see this experience: the world of Star Trek brought out of the screen and into the real world. It's a beautiful and amazing experience, and it speaks to my point earlier about things maybe moving toward a holographic future. I
think this is a really important way of testing that. I mean, the Vision Pro is still early days, right? There are under a million of these out there. But if you want to experience what it would be like to see things in your space at the highest fidelity possible today, that's the device to do it on. And it's been a real challenge to build the right pipeline for this and to do the content. I mean, imagine if you actually had the holodeck; the UX for that would work. It would be pretty crazy, right? But we're going through that. And of course there are the challenges of rendering things at this level. Even Apple, I mean, I think they're still working on their immersive media format. The experiences themselves are going through a lot of iterative changes, but in our case we figured, well, the one thing we should do if we're going to do the holodeck is render holograms, which we can do on the Render Network, right?
And then it gives you back an actual cube-sized hologram you can move through quickly. This is one of those cubes. It's a light field, effectively a really fast lookup table, which is like a digital hologram. And we can run that on the Vision Pro. You can see it in the archive, and we can do that for anything. This takes a lot of GPUs to render, but again, that's easy to do on the Render Network.
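To make the "lookup table" idea concrete, here is a minimal sketch, assuming a light field stored as a simple grid of pre-rendered views (not OTOY's actual hologram or ORBX format): the GPU-heavy step is rendering the grid up front, and playback is just an interpolated lookup, which is why it can run on a headset.

```python
# Minimal sketch, assuming a light field stored as a U x V grid of
# pre-rendered RGB views (not OTOY's actual hologram/ORBX format).
# Rendering the grid is the GPU-heavy step; viewing is just a lookup.
import numpy as np

class LightFieldCube:
    def __init__(self, views: np.ndarray):
        # views shape: (U, V, H, W, 3) -- a grid of pre-rendered images
        self.views = views
        self.U, self.V = views.shape[:2]

    def sample(self, u: float, v: float) -> np.ndarray:
        """Return the frame for a normalized view direction (u, v) in [0, 1]."""
        fu, fv = u * (self.U - 1), v * (self.V - 1)
        u0, v0 = int(fu), int(fv)
        u1, v1 = min(u0 + 1, self.U - 1), min(v0 + 1, self.V - 1)
        du, dv = fu - u0, fv - v0
        # Bilinear blend of the four nearest pre-rendered views.
        top = (1 - du) * self.views[u0, v0] + du * self.views[u1, v0]
        bot = (1 - du) * self.views[u0, v1] + du * self.views[u1, v1]
        return (1 - dv) * top + dv * bot

# Hypothetical usage: 16 x 16 views of a small asset, sampled as the
# viewer's head moves -- no ray tracing happens at playback time.
views = np.random.rand(16, 16, 128, 128, 3)
frame = LightFieldCube(views).sample(0.4, 0.7)
```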
Going back to the parts of the archive that sort of led us to where we are today: in addition to these beautiful documentary pieces and letting people explore the world of Star Trek in 3D, and Gene Roddenberry's other works as well, we started to put out little vignettes that would accompany these other pieces, and we've done four of them. The third one, which I'm going to show here, we put out right after Star Trek: Picard season 3, where they show the Enterprise-D. It turns out this thing was recovered from the planet; in the novels, you know, at the moment of Kirk's death, and even in the comics, Spock goes to visit Kirk's grave. This is something that fans knew about, but it was never shown on screen. So we figured, as part of the archive, let's do something that touches on all these elements, and let's put it out there shortly after that episode airs. I'm going to play this next and we can take a look at it. It's two minutes long.
[Music]
Thank you guys. Yeah, we're very proud of that piece. I mean, you know, this was 2023, so those were early days for a lot of the tech and tools that we had then, but every CG scene was done on the Render Network, right? I mean, we absolutely depend on that for production, and on Octane as well. And of course, our early work on digital prosthetics for faces was put to even greater effect in those couple of scenes with Spock. People really loved it; the feedback was great. And when we were doing the piece that followed, which we're going to have a whole panel to discuss, the cast and crew will be there to discuss it. It's, you know, eight minutes long, so I'm not going to show it here. We'll be playing it before the panel that's coming up next.
It's called Unification. This is a beautiful poster done by a fan. It was an incredible piece, and I think you've seen some snippets from it. I'm going to talk a little bit and at least show some pieces ahead of our talks, so you can see some of the work and the behind-the-scenes stuff that went into it. There's a lot of beautiful things that were created for it, and it was touching. I mean, that piece with Spock that you just saw, we had a few million people, I think, see it. This one is part of the immersive experience, right? So it's inside the Vision Pro app, inside the web portal. But the link from YouTube, which was unlisted, 26 million people watched it. That's crazy. It was beautiful. It was so well received. For those that haven't seen it, please either go watch it in the theater or come back right before the talk to see the full thing. But the making of it was crazy. And also the day that we put it out there, right, so, you know,
spoiler alert, Kirk's back. You know, we had William Shatner, who is an executive producer on it, watching it with everybody. There's Robin, and Larry, who plays Spock, and my wife May, who plays Colt and who was there at the very beginning of these things with me and inspired me to get into filmmaking at all. But Bill, you know, he'd seen some early tests, but he hadn't seen the whole thing until that moment, the day this came out, and he was sitting in the captain's chair, with all of us around him watching it. I'm going to play something that Larry filmed of Bill at the end, when the piece was done.
[Applause] Yeah, it was emotional for all of us, and it was very, very nerve-wracking for me, because obviously bringing Kirk back in any way, shape, or form was crazy. And it took a little bit of time to even get him to agree to do this at all. But we had been working with him for years; we'd interviewed him for the archive. And in fact, what inspired Unification for me was the last minute of the interview that he did in March of '23, where he said this. We were
brothers and I loved him. A few months before he died, I don't know, something happened, and I don't know what it is. And I've been assured by both of his children that he loved me. But he stopped talking to me and I couldn't break through. And towards the end, I was told he was dying. I wrote him a letter saying how much I loved him and how I regretted not seeing him. And that was it, and he died. And I don't know to this day. I'm told stories by people who said to me, you know, that happened to me too, and it was because of the illness; in their minds they weren't thinking exactly the same way. I'd like to think that that was it, that as he got further and further along with his emphysema, it wasn't something I did, because I loved him. Yeah. So I called him shortly after that, with Ari, and
I said, listen, I want to do a piece, because you made me feel that the closure between Kirk and Spock never happened, right? At least Spock died off-screen, and the characters never got to say goodbye. So for me, even though of course we're documenting the story of Star Trek, emotionally that was an important piece that I knew was missing. And so, after almost a year of conversations, he agreed to let us do this. He was even given the option of coming in and being de-aged, which we can do. In the end, he effectively chose to have an actor play a younger Kirk. He did provide guidance on using his voice, but Sam Witwer, who's an incredible actor, had been working with me for two years before we put out this piece, to be Kirk and to help us with the actual technology for the digital prosthetics you guys are seeing.
It is amazing, and in the end I'm very proud of the piece. We show Kirk at different ages. Initially we were going to have Spock and Kirk at the different ages; that was cut, but you can see here there's Sam on set without the digital prosthetics and with them. "I was 26, 27 when I came in. How do you account for that? The age, and am I still recognizable? How would you do that? I mean, science fiction's filled with magic. So you could do it, because we live in science fiction." So, yeah. I
mean, while he was giving that earlier interview, he was talking about how to bring Kirk back, how to make him younger. I also reminded him that this is something we can do. So the first thing we did was just de-age him to 60, the age Kirk would have been, and when he saw that he said, that looks perfect, but is there a way for me to look like this and not be on set? Because, you know, a lot of the filming was torturous, at the Huntington Gardens and things like that. So that's when we did a test of a de-aged Kirk talking. We didn't have them talking in the piece, but you can see that we can actually do voices and everything really well. And when Bill saw that, he said, just go with that; rather than de-aging me, have Sam effectively do the same thing and you're going to basically have a better result there. And that's how we proceeded. So the
crazy thing is, it's real time. So here I am; this is on a 4090 laptop, and this is the first time we're actually testing Sam as Generations-era Kirk, and it's just mind-blowing. I posted this on Twitter. For Spock it was a lot more challenging, as I mentioned before: we had to build a 3D model of Leonard at that age, you know, for Spock Prime, but we did it. And then we also brought in Robin Curtis, the legendary Star Trek actress, as an older version of her character, Saavik. There's a scene that was cut, drawn from the Early Voyages comic book, that was meant to describe what's going on in the piece, where she's de-aged, and this is from the set. It's pretty amazing.
She's precisely the young woman who vanished that day aboard the Enterprise.
However, there is a problem. Her
temporal integrity is decaying. She is
alien to this timeline and it is beginning to reject her. If she is not returned to her natural time frame soon, she will perish.
If you're a Star Trek fan, that's pretty crazy. So, you know, it's something where de-aging and being able to take CG characters and blend them together really comes into play. For example, Gary Lockwood, who is 87, we brought back as Gary Mitchell. There are a lot of really interesting things Unification pushed us to do, and we've learned so much from it; we're taking all of that and applying it to the next set of things that we're taking on.
Again, we come from a world of movies.
Physical props are important. We built Arex, one of the characters from the Star Trek animated series, who was really only ever shown as a cartoon. We built a CG version, and people went crazy when we showed that years ago. And so we did a physical model and tried to film it. We are now thinking about how we animate this physical model, right? So we're doing early tests with that. We do have a lot of work that can basically take a performance and map it onto a physical sculpture or makeup. It's getting there; it's still early days, but we're working on that. And as we were filming Unification, there again there were shots where we filmed it all live action, you know, with cranes and dolly operators, but there was a shot in particular where we
needed to redo it, right? But we had scanned in everything; I mean, we had cameras on set, so there was one shot, where Kirk meets someone, that we redid entirely in CG, and it worked great. So when we're filming things, we always do full coverage, and we bring that back into Octane. But then we also have a larger world that we're creating in Octane, and it looks gorgeous. These are all renders from the opening scene of Unification, and it's just beautiful. And we had one shot that no AI video generator was able to do, where we go from a full-CG shot of the sky down to Kirk's feet, which is a live-action plate. We tried Runway, we tried Luma, and even to this day it wouldn't work. So a traditional pipeline of rendering mixed with compositing was the only thing that got this to work, but it was just an amazing shot. It was very ambitious and it worked great.
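For context on what "rendering mixed with compositing" means at the pixel level, here is a minimal sketch of the standard Porter-Duff "over" operation used to layer a CG element with an alpha channel onto a live-action plate; this is an illustrative simplification under my own assumptions, not the actual Unification pipeline.

```python
# Minimal sketch of the Porter-Duff "over" operation used when layering a
# CG render (with alpha) onto a live-action background plate.
# Illustrative only -- not the actual Unification compositing setup.
import numpy as np

def over(fg_premult: np.ndarray, fg_alpha: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Composite a premultiplied foreground over a background plate.

    fg_premult: (H, W, 3) premultiplied CG colors in [0, 1]
    fg_alpha:   (H, W)    CG coverage in [0, 1]
    bg:         (H, W, 3) live-action plate in [0, 1]
    """
    a = fg_alpha[..., None]              # broadcast alpha across RGB
    return fg_premult + (1.0 - a) * bg   # "over": fg + (1 - alpha) * bg

# Hypothetical frame: CG sky at the top blending down to a live-action
# plate at the bottom (e.g. a move from a full-CG sky down to a ground plate).
h, w = 540, 960
cg = np.zeros((h, w, 3)); cg[..., 2] = 0.8                    # stand-in CG layer
alpha = np.repeat(np.linspace(1.0, 0.0, h)[:, None], w, 1)    # fades toward bottom
plate = np.full((h, w, 3), 0.3)                               # stand-in plate
frame = over(cg * alpha[..., None], alpha, plate)
```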
There was lots of set replacement. We could do this differently now; the volume, of course, was popularized by The Mandalorian, but again, we're working toward being able to do much more of this in real time. And I think you'll see, as Richard and Caris, who'll be talking in our next panel, predict, 70% of films being shot this way. But we can also do multiple camera views for face replacement as well. And I'm going to end by just showing some of the tests we did. This
is from the middle of production on Unification, testing the TMP-era Kirk and Spock. This one is, again, just a test, an early thing, but it looks pretty cool. "I think I like that, Mr. Spock." "I commanded a ship named after an idea." Not bad for last year, right? So,
I think that says something. And remember, with everything I've shown you, you can move the camera around; you can basically experience it in the holodeck. That's the point of making films that leverage this technology with the artist in full control. You have Sam and Larry playing Kirk and Spock. You have artists building these sets. It's an artist-driven pipeline. And in whatever we're doing at Render, at OTOY, we want artists and humans to be at the center of that. I think that actually is our place in this future. So I have one last thing, one more thing before we close out, just a sort of moment of zen before we go on to the rest of the conference. I'll play this and then we'll wrap up.
Things that we see on Star Trek, like the holodeck: what kinds of things can you imagine that are partway there, that could be much better than the three-window iChat, that we might see in the next five or ten years?
Well, I don't think Steve's going to announce his transporter.
I want Star Trek. Just give me Star Trek.
Amen. You know, I think that's the future that we want. So, again, I'm so excited for you all to be here today. We have some incredible guests and panelists and an incredible venue. Thank you for being part of this important day and the journey of rendering. Thank you, everyone, and enjoy the show.