
NVIDIA GTC Live Washington, D.C. Keynote Pregame

By NVIDIA

Summary

## Key takeaways

- **AI Infrastructure Enables App Boom**: All investments have focused on infrastructure like semis, power, and large language models, but now value is accruing at the application layer, with breakout companies like Cursor in coding, Open Evidence in medical, and Harvey in legal delivering real productivity gains. [10:44], [11:31]
- **Software Transformed, Developers Essential**: AI transforms low-code drag-and-drop software, but highly technical development requiring trade-off understanding still needs professional developers, who are AI's primary users dollar-weighted; this is the first time software is being disrupted in 30 years. [12:25], [13:15]
- **AI Teammates Unlock $6 Trillion**: AI will manifest as digital teammates augmenting human capabilities in collaborative intelligence, targeting 20% of the $30 trillion global knowledge-worker spend for a $6 trillion opportunity; short-term displacement occurs, but long-term humans win, with infinite new creation avenues. [13:46], [14:37]
- **Power Limits AI Throughput Most**: To increase AI throughput, the one non-technical thing needed is easing regulations on breaking ground for new data centers and adding power; OpenAI calls for a Manhattan-like project for 100 gigawatts per year, as chips need power. [26:17], [26:32]
- **Drug Discovery Takes 10 Years Minimum**: It takes 10 years and $2.6 billion on average to bring a medicine to market; AI's deflationary effects will spread over a long period, with benefits from current medicines arriving in the 2030s at the earliest, as biology takes time. [41:03], [41:52]
- **NVIDIA Leads AI Factory Ecosystem**: NVIDIA has built complete AI systems and software stacks, giving it a leadership position few can match, curating an ecosystem from power generation to memory to algorithms and cooling; partnerships like the one with Cadence yield 80x improvement and 20x lower power. [03:06], [02:59:33]

Topics Covered

  • Infrastructure Enables Vertical AI Apps
  • AI Transforms Software Development
  • AI Teammates Unlock $6 Trillion
  • Open Models Democratize Innovation
  • Power Constrains AI Throughput

Full Transcript

[Music] Live from the Walter E. Washington Convention Center in our nation's capital. This is NVIDIA GTC Live, Washington, D.C.

Over the next 3 and 1/2 hours, industry titans from a cross-section of the AI ecosystem will gather to share bold perspectives you won't hear any place

else.

And it all builds to the main event, Jensen Huang's first-ever keynote address in Washington, D.C.

A historic moment in a monumental city.

Strap in. You don't want to miss a second.

>> Nvidia GTC Live Washington DC starts now.

[Music] [Applause] Welcome to GTC Live from Washington DC.

Over the next few hours, more than 5,000 guests will arrive to attend NVIDIA CEO Jensen Huang's much anticipated fall keynote address. Before that, we will welcome to our stage 20 guests from all over the world at the very top of the tech industry. That's right, Brad.

They're here to discuss the state of AI, its role in business and science, and the exciting future that Nvidia and their partners are building right now.

I'm Brad Gerstner, founder and CEO of Altimeter Capital.

>> And I'm Patrick Moorhead, founder, CEO, and chief analyst at Moor Insights & Strategy. AI is the force for the economy and is rapidly being integrated into every aspect of industry, security, science, and manufacturing. And that's

why we've extended the pregame show to include many of the CEOs, founders, and scientists who are changing the world driven by NVIDIA innovations. At Altimeter, we've been big believers in Nvidia's accelerated computing platform for years. They've been investing in this platform for more than a decade, and the moment is here. AI is taking off, and the world is responding. To

help understand what's driving all of this, we have put together panels featuring more than 20 guests and the lineup is absolute A-list leadership in technology, investing, and innovation.

So, I'm pretty stoked to be here with Patrick as we kick off this keynote.

>> And I'm excited too, Brad. And I've spent three decades in tech and, at Moor Insights & Strategy, 15 years tracking silicon, infrastructure, compute, and software. And what I think makes Nvidia so special and differentiated and so valuable is its central role in creating not just faster architecture, but what has become an entire AI ecosystem.

Nvidia isn't just selling chips, although chips are awesome. They've

built complete AI systems and software stacks that give them a leadership position few can match. Our goal today is to discuss with these leaders the

context and commitment, the wins so far and the challenges to come. So by the time Jensen hits the stage, we'll have a better understanding of what they're doing, why it matters, and what comes

next.

>> It is pretty incredible. We have all these guests under one roof. Brad, an extraordinary lineup of leadership from the world's biggest tech companies, investors, that's you, Brad, and innovators. We'll kick things off with the state of AI innovation, a look at new ideas, open models, and the power of open source that's shaping the direction of AI. The next panel, Agentic AI for every industry. Intelligent systems are beginning to plan, reason, and act to reshape how industries work. After agentic AI, infrastructure ecosystems. Behind every breakthrough is an unseen network of data centers, power systems, and partners, the backbone of the AI economy.

>> And then we'll move into AI and science and quantum. In labs and research centers around the world, AI is becoming a core instrument of discovery. Now it's converging with quantum computing, where together they're redefining what's possible in simulation and design. And

finally, an important discussion of AI and robotics and manufacturing. The

boundary between digital intelligence and physical action is disappearing.

Together, AI and robotics are re-industrializing America. This is Nvidia's fourth pregame show. What began with a handful of the best in tech huddled in a small corner of the SAP Center has grown into this: three and a half hours, 20-plus guests in this massive space here in DC.

>> It's pretty awesome. That kind of growth is just what you would expect from Nvidia.

>> No question. This is the place to be if you want to know where artificial intelligence is headed. The other place to be is on the floor, where CNBC's Christina Partsinevelos will be joining us throughout the day with even more guests. Christina, welcome to the show.

>> Thank you, guys. I have to say I am quite excited. Not because I get to, you know, learn a lot more about anything from quantum physics to agentic AI, but I get to come on the floor over here and really showcase what this event is about. You mentioned this being the fourth one this year. Can you imagine? I started in San Jose, went to Taipei, Taiwan, or I should say, sorry, Paris, and now DC. The difference with this one, yes, it's a little bit smaller, but that doesn't mean that we're not going to make a lot of news. You're going to hear from CEOs who break the news. I would say, just based off of the crowd, it's a little bit darker, a lot more people in suits. That's my only observation thus far. But it doesn't mean that it's not going to be as exciting. For example,

what is it, 8:30 in the morning? I've already spoken to the Infleqtion CEO, who demonstrated how atoms are staying in place, just explaining quantum computing. And this audience, honestly, all of you over here, we thought this would be empty at this point because it's still quite early, but it's incredible that the doors opened and there was already an influx of at least 200 people that swarmed in. I promise to break the fourth wall. You know, I'm not on CNBC right now, so I can just go wherever, show behind the stage.

You can see Brad over there looking.

show everyone, so that those watching at home or maybe in your office right now, streaming all day, hopefully not flipping to anything else, are really going to get a feel of what it's going to be like. And I say that because not only will I be milling about, you know, that's why I'm wearing the bright suit so you can see me, but I'll also be speaking with several of the guests that come off the panels, from the CoreWeave CEO to Cadence's CEO. And we're going to get into, I guess, the nitty-gritty details of not only what was discussed, but some other topical things at hand in today's day and age. But uh, this is just an example. You know, I would call it the red carpet, but Nvidia likes to do things differently. That's why it's black. But overall, I promise we've got some juicy details throughout this entire show, and I'll be going back and forth with the guys throughout the uh, the morning, I should say, before the

keynote at 12 PM Eastern. Brad, Pat,

>> Thanks, Christina. I feel like we're full Squawk Box this morning. Um

well, we'll check in with you in a bit.

In just a few years, CUDA has evolved from a programming toolkit into the backbone of the world's AI community.

What began as a way for developers to tap into the power of NVIDIA's GPUs has become a global movement. That's right.

Millions of engineers, researchers, and creators all use the same language of parallel computing to revolutionize AI and drive innovations in every field, from robotics and genomics to filmmaking and, of course, generative AI. And all of this is powered by Nvidia's vision for accelerated computing. Let's

take a look.

You can literally access CUDA wherever you are, whoever you are. We've

democratized high performance computing as we know it. It is the reason why we believe this is the beginning of the age of AI.

We now have the fundamental pillars necessary for us to advance artificial intelligence in a way that has never been conceived of before.

[Music] You are the stars. This is a celebration of your science, of your work, and your innovations.

[Music] [Music] [Music] And all of you, all of you made this possible.

Behind every breakthrough in AI are the innovators and builders turning possibility into progress. From open

models to agentic systems, they're accelerating the next wave of innovation across startups, labs, and markets.

Joining us to explore the state of innovation, some of the best: Thomas Laffont, co-founder of Coatue Management; Sarah Guo, founder and managing partner at Conviction; Martin Casado, general partner at Andreessen Horowitz; and Navin Chaddha, managing partner at Mayfield.

So Thomas, welcome. You know, Coatue has been investing at the heart of America's super cycles, from the internet to cloud to social. Help us contextualize how AI stacks up and how you think of all this investing in the middle of all the bubble talk.

>> Yeah, good morning everyone. Um really

thrilled to be here and um to be the warm-up show.

So I know we have a lot of exciting content today.

So look, I think that um all of the investments have really kind of been focused on infrastructure. So I think when people talk about AI, obviously it starts with semis, it starts with power, it starts with the large language models. I think um, you know, if you look at the private markets as an example, most of the value has been accrued at the infrastructure layer for the past kind of five or ten years, let's say. But to me, what's most exciting about the moment that we're um at right now is that we are seeing um a lot of value be accrued at the application layer, and a new class of companies start to emerge in different verticals. So as

an example, if you look at coding, um Cursor has become an unbelievable breakout company, one of the fastest growing companies ever. Um, Martin can talk about it as well. That's only been enabled to make coding better because of the investments in infrastructure, right? So starting with Nvidia. But we're also seeing it in medical with Open Evidence. We're seeing it in legal with Harvey. So to me, one of the interesting moments that we're at right now is all of the investments in infrastructure have enabled apps to come through that are delivering real productivity gains, which I think then reinforces the belief that it's worth making these infrastructure investments, because you can see the return um from these applications.

>> Well, well said. Martin, um, you know, you're one of the uh legendary software investors in Silicon Valley. You know, Satya was on uh my pod last year and made some news um when he said, you know, AI is potentially a real threat to software, that software may be this thin interface on top of this CRUD database. And so we've seen a lot of uh, I think, FUD there. Um, you know, talk to us a little bit about how you expect software to change, what's winning, and what potentially loses uh in the age of AI.

Yeah. So listen, I think formal languages came out of natural languages for a reason, and that is with natural languages you can't actually describe what you want. >> Right.

>> So we've always had kind of two stories of software, right? We have the low-code, kind of drag-and-drop, and that is definitely going to be transformed by AI. I think that's going to look entirely different. Um, but then we also have the very highly technical, where there's actual trade-offs. You have to understand those trade-offs. You know, you still need professional developers. If you actually look at AI, the primary use right now in development is professional developers.

It isn't kind of casual, if you do it dollar-weighted. And so, listen, I think we're in for a major transformation. I think we're going to see a lot more software than we had before. I think many more people will be able to develop than have ever been able to develop before. I think it's a great educational tool, but I don't believe it's going to get rid of software developers. I think this is very much a technical discipline where you have to understand the trade-offs to do it. But I will say, I mean, I've been in software for 30 years, and this is the first time we are being disrupted, right? And so it definitely is putting us on our heels to try to understand what's going on.

>> Yeah.

>> Makes sense. Navin, great to see you.

>> Absolutely.

>> So you made a bold prediction out there that knowledge workers will have what you're calling AI teammates, and it's a $6 trillion opportunity. And I don't know if that's by 2030 or what time frame this is. Can you tell us about what an AI teammate is and how it represents a massive shift in the way we work? >> Absolutely. So our belief is uh AI is going to team up with humans to get us to superhuman level, and we are

entering an era of collaborative intelligence. What's going to happen is AI will manifest itself in the form of teammates, which are nothing but digital companions that team up with us not only to accelerate productivity but augment our capabilities and amplify our creativity. If you look at it globally, the knowledge worker spend is $30 trillion. So if AI takes 20% of the market, five years, ten years, it's a $6 trillion opportunity. For the first time we're not going after IT budgets; this spend is coming out of the people spend for the knowledge workers, and the same is going to be the case for physical AI. So we are extremely bullish. It's the same size as the IT market. Now, it remains to be seen, is it five years, ten years, but it's going to happen. >> Let me ask a quick follow-up question to that. Is

this deflationary at the end of the day? Right? Does the $30 trillion market shrink because we're a lot more productive? Right? So the idea is replacement, that $6 trillion will go from humans uh to machines, but might it also just offer some deflation to the economy, some productivity gains along the way?

>> Absolutely. So I think in any new information technology market, there is displacement because productivity gains happen. There is some job loss, but in the long run, I'm an optimist, humans come out as winners. The more productivity gains you get, the more cost savings you get, the more profits you create. You're going to hire people.
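Navin's market sizing above is simple arithmetic; a minimal sketch, using the figures as stated on the panel (the 20% share is his stated assumption, not a measured number):

```python
# Back-of-the-envelope check of the AI-teammate market sizing quoted above.
knowledge_worker_spend = 30e12   # ~$30 trillion global knowledge-worker spend
ai_share = 0.20                  # assumed share captured by AI teammates

opportunity = knowledge_worker_spend * ai_share
print(f"${opportunity / 1e12:.0f} trillion opportunity")  # prints "$6 trillion opportunity"
```

The same sizing holds whether the share is captured in five years or ten; only the annualized run rate changes.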

>> By the way, >> the key difference is basically they won't be doing mundane jobs. They'll be doing things that weren't possible before. And one example we talked about: vibe coding. 30 million developers have been able to code, so creation has been limited, in the form of company creation, to 30 million people. Now with vibe coding, which is a teammate, a billion people can become creators, and it democratizes entrepreneurship, and we can't even figure out what people are going to do with this technology. So I'm very bullish. Short term there'll be pain, but long run the avenues are infinite on what gets created. >> It's a really important point. We're in Washington, D.C., and at

a time where I think there's a lot of confusion on Capitol Hill about the impact of AI. We have too many doomers running around talking about how it might shrink the economy, right? People are going to be displaced, jobs will be lost. It's important for all of us, and everybody who's going to come after us, to come here and educate, really, on the abundance to come. America cannot stay competitive unless we have our best innovators leading at the front. Right? That starts with federal preemption on state laws and other things that can happen here in DC in order to accelerate AI. Sorry to interrupt.

>> No, not at all. Listen, I'm a student of history, and what the history does show is with every major inflection in technology, uh whether that's uh engines, electricity, there has been displacement. Now, this curve is dramatic, I think more dramatic than we've seen uh in the other ones. We do need to keep an eye out uh for it.

Sarah, how are you?

>> I'm great. Thanks for having me.

>> Absolutely. Uh, so you're an AI-focused, AI-native shop, which is amazing. Um, and a pretty good place to be right now. Uh, let's talk about open. I'm curious about how does open uh drive, let's say, growth. I think we know where growth is, but things like national security, how the US can win by adopting open.

>> Okay. So I think there's two really important questions here. There's open as in open source, right, as an engineer, but even for everyone who isn't: the foundational principle of open source is that if you um allow more people to participate in innovation, you will get better ideas, you will get compounding innovation, and then, importantly, you will open up the market at the layers above that innovation, right? And so I think the idea of open source, and open models in particular, allows for um more democratization at the application level, where more entrepreneurs can build things that get to actual end users. Thomas mentioned,

you know, Harvey in law, Open Evidence in medicine, um Cursor, Martin mentioned, in engineering. We're like 1% of the way in, right? What about every other job and vertical? Where are the tools for that? And I think open innovation is going to allow for that. You know, on the national security front, my view as a big believer in America is that we have always won by being strategically open and at the forefront, right? And so I think these things can go together. Strategically open means, like, to me, attracting um the capital and the talent uh to create the technologies that lead and create abundance. Strategically open is deciding what pieces of that we really want to own, both from a supply chain perspective, and then what companies really matter and will create opportunities for Americans.

>> Yeah, it really seems like we should be embracing, at the model level, all models regardless of where they come from. And of course, we can go in, we can inspect the results to see if there's something going on. But I do think that there's uh the meme that says we shouldn't be importing models from other countries, namely China, uh to be able to drive innovation and stacks on top. And I do think that's a mistake.

I think that when I look at it from the application developer's perspective or the end user's perspective, they're going to choose tools that are good for their tasks. Right. Um, I don't think you're choosing, like, a philosophy of intelligence. You're looking for, honestly, efficiency and capability for most of these things. And so um I think efficiency and capability can come from all corners of the world.

And and let's be honest, you know, if it's driving American innovation, let's have a little confidence in the entrepreneurs that are adopting them.

That's right.

>> Um, you know, and so we had this DeepSeek moment earlier in the year that I think again led people to believe that we can live in this artificial world where we shut down all the open innovation in China and just drive it in the United States. And while I like to see, and it's been great to see, all the open source models coming out of US labs, the fact of the matter is DeepSeek accelerated innovation in the United States. I don't know, Martin, do you have a >> So listen, I think not having a policy is potentially dangerous here, right? So

let's imagine that models were 3D printers, and we allowed anybody to adopt any 3D printer that they wanted, and we decided to limit our own 3D printers. So if our entire manufacturing foundation is based on somebody else's 3D printers, there's a lot they could do, right? They could, you know, modestly shift what comes out; they could only release weaker 3D printers for everybody else. They keep the stronger ones to themselves. So when it comes to import controls with technology, we've had long-standing policies, right? Do you remember, like, the whole Huawei-Cisco thing in the early 2000s? And listen, it turned out to be very prescient, right? We shouldn't run our critical infrastructure that we rely on on something controlled by a foreign adversary, in every case. And so I would say, first, it's important to have a policy. That policy should be nuanced, to understand everything that you said. Yes, we do want to benefit from it. Does it go all the way down to critical infrastructure? Maybe, maybe not. Historically, we've done a pretty good job not doing that. Um, and so I think I would recommend being a bit more thoughtful of how we import and use uh open source.

>> Navin, do you have a comment on this? >> Yeah, I think I agree with what was said, but having been in the industry for a long, long time, my belief is the reason openness becomes important is it's an ecosystem opportunity. No one company can solve all the problems. So we have seen with prior platforms, whether it was Windows, Android, iOS, you need a whole ecosystem that thrives and just keeps innovating. So it's not only the geopolitical situation, but for costs and the problems to get solved, you need openness. And I'm a big believer what happened in the past is going to happen again, and companies which are platform companies, which enable ecosystems, will solve problems together. So it's a together thing rather than one company doing it. >> Thomas, I want to come back to a question I started with you. Probably

the number one question I get asked in venues like this is how can you invest when every day on CNBC they're talking about a bubble and you know we have had

to invest through all of this talk. So

maybe just help us understand where is Coatue investing the most today, public markets, private markets, early stage, late stage, um, and how mentally do you guys continue to forge ahead while all of this chatter is going on?

Yeah, I mean, I think um if you look at the public markets, for example, one of the different features of this particular market versus the market in the 2000s, which, by the way, we were investing then, we started our business in 1999, is first of all, stocks are valued very differently today than they were back then. So to me, one of the incredible things about this market is you get to buy some of the world's best, most exciting, most innovative, best-run companies, companies like Meta and Nvidia and Google and others, right? At P/E multiples that are on average in the, call it, mid-20s on a forward basis, depending on which company you're looking at. Still pretty incredible to get access to all of that innovation at that price,

right? So, obviously, what we've got to make sure is that we're right about the E, right, the earnings, not just the multiple. So what we look for is leading indicators, um, like ChatGPT as an example, right? And we've seen just off-the-charts usage. Um, one of my favorite uh things that Jensen talks about is the triple exponential that ChatGPT is benefiting from, which is more users making more queries and more deep research per query. Right. >> Right. So you kind of get this triple exponential of demand. So I think watching ChatGPT and how it performs across the globe is really critical.
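The "triple exponential" framing above can be sketched as a product of three factors; the function and the doubling figures below are purely illustrative, not actual ChatGPT numbers:

```python
# Sketch of the "triple exponential": total inference demand is the product of
# three factors, so independent growth in each compounds multiplicatively.
def demand(users, queries_per_user, compute_per_query):
    return users * queries_per_user * compute_per_query

base = demand(users=1.0, queries_per_user=1.0, compute_per_query=1.0)
grown = demand(users=2.0, queries_per_user=2.0, compute_per_query=2.0)
print(grown / base)  # prints 8.0: doubling each factor yields 8x total demand
```

This is why demand can outrun any single-factor forecast: even modest growth in each term multiplies through the product.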

But you know, you asked the question about inflation, right, or deflation. To me, one of the defining features of AI is it actually has a chance to bring deflation to sectors that really need it. So I think we've looked at healthcare, for example. You could argue that the internet really didn't do much to bring uh health care costs down,

right? In fact, they've continued to increase. I think AI really has the potential to dent the cost curve, right? Um, companies like Open Evidence, as an example, that we've talked about, that make the diagnosis much better. Um, industrials: how do we bring industrial back to America? Well, we've got to get more efficient. We've got to get more out of people. We've got to get more out of machines. AI can do that. So to me, all of those sectors, we're seeing it in defense, right? Look at new companies that are coming in with lower cost technologies. I think we're going to see that kind of across the globe. All of those things kind of give us the confidence.

That said, and there's always a but, we do think hypervigilance is really um the key. So we do watch these positions very carefully. We look at all these leading indicators very carefully. Um, I don't think it's a time where you can be kind of complacent. Right.

>> So, Martin, I've got a question for you. We're sitting here, we're talking about AI stacks, and, you know, it's gone from the GPU to a tray to a full rack, all the software, and now we're talking half the time about how are we going to power these? How are we going to build all these? Where is the most leverage in the AI stack uh right now?

>> I mean, honestly, let's just say from a national standpoint we wanted to increase AI throughput and there's one thing you could do. It turns out that's not technical. The one thing that we could do is ease regulations on breaking ground for new data centers and add power. Full stop.

>> Yeah.

>> I mean I think that the rest of the stack we understand very well. That

right now is what is limiting our ability to do massive capacity build up.

>> By the way, Martin, I'm sure you saw yesterday, but um OpenAI wrote an open letter to regulators, probably worth kind of touching on today, right? But essentially calling for a Manhattan-like uh project, right, around power generation in this country, >> right, and looking at all of these options. And look, they called for 100 gigawatts per year, which is a massive number, but I think directionally they're correct, right? Chips need power, and to Martin's point, we need to invest in it.

>> Yeah. I think we often underestimate, like, what all of this means. Like, we throw around terms like gigawatt all the time. Like, every time I'm in a pitch, we're like, I need a gigawatt data center. Like, we're talking like four football fields' worth of capacity, right? So this is a major national-level undertaking that we need to do. Definitely need public-private partnership and cooperation to do that. But if there's one thing that we can do, it'll be ease regulations on building out new data centers, especially power. >> Do you think it's also worth noting that um there are very few people who believe that America could not benefit from an updated grid?

>> Yeah.

>> Right. Every part of it, um, transmission, storage, new generation capacity, and that there's going to be a surplus to consumers from all of that if it's invested in. Mhm.

>> Um, and so the fact that there are large buyers of that power that want to front the capex to that improvement in American infrastructure, I think, is a really good thing.

>> I mean, I think uh, you know, Gurley and I did the pod from Diablo Canyon, the nuclear site in California. Uh, that, fortunately, I mean, amazingly, they were talking about shutting down. It represents 12% of the total energy in California, and the vast majority of the clean energy in California. We've gotten that extended, but the reality is the site is situated for four new reactors. And we suggested on the pod that Meta, Google, OpenAI, Microsoft could all, uh, you know, build their own reactor with a data center right next to it. Those are the type of out-of-the-box thinking that we need to have in the United States. There are 100 fission reactors under construction in China. Today in the United States, we have zero. That's right. Right. And if power is the primitive, if it's power in and tokens out, then we've got to get really serious about power. And it all starts here in DC.

>> I would also say, right, we're in the golden era of the semiconductor industry beyond the GPU and accelerated computing. Not only do you need this new supply of energy, but you can do amazing things on the power and cooling side with innovation that hasn't happened for 40, 50 years: what goes onto the board, air cooling, liquid cooling, the voltage, and the loss you end up getting. There are cutting-edge companies being created that weren't happening 50 years back. So it's not only the supply of energy; it's also how much of it is being lost.

>> You know, speaking of primitives, Martin, one of the primitives is data. And you have been one of the pioneers around the modern data stack, of course, Databricks being one of Andreessen's large investments. Help us understand: is AI a threat to data software companies, traditional database companies, the Snowflakes and the Databricks, or are these things essential ingredients in superpowering AI?

>> I mean, I think the best mental model for models is that they're data frozen in time. And you need a lot of machinery and plumbing to get data from the source, which tends to be the natural universe, to these models. And so, if anything, it's been a dramatic accelerant. That said, you're seeing kind of a tale of two worlds, right? There's the traditional analytics data stack, which is all structured and not very AI-ready. And then there's the new stuff, where you just kind of throw a bunch of data at the model and see what comes out the other end. So I would say it's a massive accelerant. However, if you're working on data, you need to catch the shifts that are going on, because, listen, it's data used in a new way, with different guarantees and different bounds.

>> I have a related question here. So the GSIs have literally millions of employees, and typically what they do is digital transformation projects, but they'll also modernize, right? And today, modernize means, let's say, taking 27 instances of SAP and putting them into one. Do you see a future with agents that would be able to do this? Because a lot of the training on those systems comes from PDF files and training classes, almost like knowledge bases. Do you see agents as a potential driver to do that with software?

>> I would actually love to hear Sarah's view on this, but here's my experience looking at a lot of companies adopting AI. Let's say 80% of your time is drudge work and 20% of your time is something else that actually involves agency: knowing what the business needs, or knowing what you want, or knowing what the customer wants, whatever it is. AI is pretty good with the 80%; it's horrific at the 20%. It just really is. And this doesn't matter what the task is. So sure, you can do document processing, but my experience over time is that the AI will make you more productive on the drudge work and let you focus on the stuff that really matters, and it will increase productivity. And I think we tend to conflate this with things like job loss, where really we're coming out of 2021, this kind of crazy time where you've seen a massive compression on value, and there's retooling in skill sets. The AI companies I sit on the boards of are hiring like crazy. If that isn't an indication that you need people and not AI, I don't know what is. And so, listen, clearly there's a shift in skills happening that's very important for us, and clearly there's some kind of macro stuff going on. But my view is these things, yes, will take the drudge work, to your question, but they will drive human productivity, not replace it.

>> I mean, I think that's such an important point. Again, I was on Capitol Hill yesterday having some of these conversations. There was news yesterday that Amazon is having, you know, a major RIF, and of course the question was, is AI causing them to lay off all these folks? And what I reminded them of is, coming out of '22, which we described as the age of excess, right, in COVID companies were hiring like mad; they thought everybody was going to stay home. Getting flatter and getting leaner and getting fitter, right, was important. It had nothing to do with AI.

>> The most dangerous thing about AI, in my opinion, is that it's an excuse to do slimming, not that it's actually changing productivity.

>> Right. So I think it's important, you know, that Amazon came out this morning and said, while we are going to get leaner in order to get more competitive, it's so that we can hire more people and double down on our bet in AI. One of the things I want to come back to: we talk a lot about Anthropic and OpenAI, but two companies that, you know, don't get nearly as much airtime when it comes to models are xAI, and what they're doing around physical intelligence, and then Meta. And you know, I think there was some commentary yesterday that Meta was kind of a surprise, how little they accomplished on the model front over the course of last year given their focus. So, you know, Thomas, Sarah, I would love for you guys to talk to us a little bit about the other models that exist out there. Were you surprised that Meta didn't have a bigger impact with Llama 4? And where do you think, you know, the next wave comes from for xAI, given that Elon's out there teasing that he may be the first to AGI?

>> Yeah, maybe I'll talk Meta, and then I'll listen to Sarah on the next generation. I think when we talk about Meta and AI, there are a couple of things, just to level set. The first is that LLMs are broadly not in use at Meta today. If you think about the family of apps, from Big Blue to WhatsApp to Instagram and Threads, LLMs, outside of small features in Threads, are functionally not in use today. So I don't think that the bet Zuck is making is, frankly, about today, and it might not even be about winning the desktop agent war. Let's even presume that maybe ChatGPT has kind of won that battle. So why is he choosing to invest so aggressively? Well, I think what they see is a world where one day LLMs are going to be in use inside of all the big apps. So you will see generative AI ads, right? Ads that are customized to the individual user, generated on the fly.

>> So we are going to see tremendously more LLMs in use inside of those apps, and they probably want to make sure that that's run on their technology, not someone else's. I think that's where the investments come from. It isn't just chasing, wanting to be a player on the assistant side, right? It's realizing that AI is going to be a core part of all of the infrastructure of their core products over the next decade. That's the time frame I think we should judge them on. They want that to be run on their own technology, but they're also capitalists, and if their own technology can't keep up, they'll turn to others.

>> So maybe you can tell us about what you're seeing from the others.

>> You know, I think your view on xAI depends on where you think we are in the model development war overall: whether the front is infrastructure, whether the front is new architectural breakthroughs, or whether it's capital raising, right? These are all very reasonable assumptions right now. A lot of people would say Elon knows how to build big stuff fast.

>> Yeah.

>> Right. And navigate the regulatory and resource landscape around that. If it's infrastructure, I think xAI has a very good shot. That being said, this is a period of time where you can't trust the narrative from any individual company that much in AI; it ricochets all the time. But there was a period where there was a lot of industry consensus among the leading researchers that scale was all you needed, and that if you just put more compute into pre-training, you would get more capability out. I think it's pretty clear now that the returns to that scale are slowing down, and there might be more efficient ways to spend the next gigawatt of power, if we can get it. If that's true, I think it's a much more open landscape, right? And you see really interesting companies like Thinking Machines and Reflection, these new labs staffed by amazing researchers, making a new series of bets on different capabilities. I'd also say ChatGPT is an amazing product benefiting from these three exponentials for consumers. But whether it's from ChatGPT or from new products, I think we're still 1% of the way there in terms of the experience that's possible. We're at the very beginning of multimodality, and of figuring out how to make reasoning cheap and efficient so people actually use it. Very few people use the latest models from OpenAI, or from any other vendor, all the time. And we talk about agents, and there are some instances of those being used in a business context, but they're not broadly used by consumers today, proactively. So I just think we're going to see many more experiences where the landscape of competition is not set between Meta and xAI and all these others. And so it kind of depends on what you think about the research.

>> I have a question. When it comes to ChatGPT and answers, I think that's where most Americans and most users interface with AI, and it's magic. You pull out your phone and you get an answer to almost any question. I feel like the next 10x moment is when that personal assistant can take actions: can book my hotel, can buy my black t-shirts, right? Can do all the things that a great assistant can do digitally. And so my question to you is, when is our agent, when is our ChatGPT, going to be able to book our hotel for GTC, where I simply say, "Hey chat, book me the Hay-Adams for next Tuesday in Washington at the lowest price"? Are we going to see that in the next six months?

>> I think six months is tough, but we're going to blink and it's going to happen, because that's the power of these models, right? Not only can they reason and plan, but they can take action. There are some missing things on the integration side, but there are no science problems; these are all edge problems, and it's just around the corner. And it is happening in the enterprise first, because there you can do manual work, you can do integrations. So we are seeing a lot of cases where some of these things are able to complete the loop, take action, and take it all the way. It is going to happen, because in the end, as I said, we're going to have multiple teammates. We are seeing it in sales, we are seeing it in legal, we are seeing it in coding in the enterprise; it's going to happen in consumer. I mean, Brad, to me, the most powerful leap will be when it's not just executing the idea but actually generating new ideas for you, and I think the Pulse product from ChatGPT is kind of a window into that. So that's going to be the really exciting part.
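The "complete the loop" pattern the panel describes, where an assistant reasons over a request, picks a tool, acts, and confirms, is how most agent systems are wired today. A minimal sketch, with hypothetical `search_hotels` and `book` functions standing in for real integrations (none of this is OpenAI's or Perplexity's actual API, and the inventory data is made up):

```python
from typing import List, Tuple

# Hypothetical tools; a real agent would call live hotel-search and
# checkout APIs behind the same function signatures.
def search_hotels(city: str, max_price: int) -> List[Tuple[str, int]]:
    inventory = [("Hay-Adams", 540), ("Capitol Inn", 210)]  # stub data
    return [(name, price) for name, price in inventory if price <= max_price]

def book(hotel: str) -> str:
    return f"confirmed: {hotel}"

def run_agent(city: str, max_price: int) -> str:
    """Plan -> act -> confirm: the loop the panel calls 'taking action'."""
    options = search_hotels(city, max_price)
    if not options:
        return "no options found"
    # Honor the 'lowest price' constraint from the user's request.
    hotel, _price = min(options, key=lambda o: o[1])
    return book(hotel)

result = run_agent("Washington", max_price=600)
```

As the panel notes, the hard part in practice is not this control flow but the integrations: authenticating to each service, handling failures, and confirming side effects before money moves.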

>> Well, we were lucky to have you guys, for everybody in the audience the absolute best in venture and now leading the charge in AI. It was a thrill to have you here. What a great start to the show. I think what we're going to do is throw it over to Christina and see what she's finding out there on the floor.

>> I'm going to turn to an expert and learn more about healthcare. And I'm going to start with Diego from Eli Lilly, chief information and digital officer. I actually want to piggyback off of the conversation they just had on stage. They started it by talking about deflation and how AI could bring deflation into health care. Could you speak about how far along that actually is in terms of lowering costs? Because I'm sure everyone here would love to hear about that.

>> Thank you, Christina. It is surprising when you take a look at how long it takes to bring a medicine to market and how much it costs. One of the things that surprised me when I came into this industry is that it takes 10 years on average and about $2.6 billion to bring a single medicine to market, if it makes it at all. So any effects of a deflationary environment are going to be spread over a very long period before we see that benefit.

>> But doesn't that make it very difficult for you, and even your investors and shareholders, to really understand the benefits of AI if it's something that's 10 years out?

>> You are absolutely right, because there's all this talk about AI and drug discovery today, and I think there's this expectation that in two years we're going to have a medicine that cures cancer thanks to AI. But the reality is, for the medicines that we're working on now, we're going to see those benefits in the 2030s at the earliest.

>> Could you share maybe one particular medicine where that timeline has sped up a little bit?

>> The timelines are still...

>> Okay. So they're still out there.

>> We're at about 6.7 years within Lilly, and we're doing everything we can to bring them down, but biology just takes time. It takes a while to see how a medicine works in somebody's body and make sure that it doesn't have any adverse effects. So it's going to be very hard to speed up the biology, even if you speed up everything else around it.

>> And in regards to your large language models at Eli Lilly, which companies are you working with to help build those?

>> We work with all of them, actually. We have over 45 models that we use internally, on a platform that allows us to run different models. And one of the things that's great about that is that when you have different models, you have different perspectives. So when you have the models interact with each other, they come up with actually very clever ideas, just like when you have a group of people with different backgrounds.
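The "different models, different perspectives" setup described here can be sketched as a round-robin exchange where each model answers in turn and sees everything said so far. The model stubs and names below are illustrative placeholders, not Lilly's actual platform or models:

```python
from typing import Callable, Dict, List

def panel_exchange(models: Dict[str, Callable[[str], str]],
                   question: str, rounds: int = 1) -> List[str]:
    """Each model answers in turn, seeing the question plus all prior answers."""
    transcript: List[str] = []
    for _ in range(rounds):
        for name, ask in models.items():
            context = "\n".join([question] + transcript)
            transcript.append(f"{name}: {ask(context)}")
    return transcript

# Stubs standing in for real LLM endpoints with different 'perspectives':
# the second model reacts to what the first one proposed.
models = {
    "chemist": lambda ctx: "propose scaffold A",
    "toxicologist": lambda ctx: "flag liver risk" if "scaffold A" in ctx
                                else "no concerns",
}
log = panel_exchange(models, "Which candidate should we advance?")
```

The interesting behavior is in the shared context: because each model reads the prior answers, later responses can critique or build on earlier ones, which is the "group of people with different backgrounds" effect.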

>> We were talking backstage, and one of the things you brought up is that you didn't really have experience in life sciences about four and a half years ago; you're a techie, because you came from Apple. You helped with the Apple stores, the tech, and the retail and online, correct? So how does that experience translate to Eli Lilly now, four and a half years in?

>> I still consider myself a tech guy, even though I don't get to dress like a tech guy.

>> You're wearing jeans right now; you're not wearing a suit, even though there are a lot of suits here. We are in DC, so we'll give everybody credit for that.

>> By the way, that's the first thing I did when I went into the pharmaceutical industry: change the attire. So my whole tech team now, they all wear jeans. But really, the thing that I noticed the most when I got there is that I came from a world where Apple, and companies like Apple, put the user at the center of everything they do. Literally at Apple, you're starting with the user and saying, "How do we build an ecosystem around them?" I came into healthcare, and, well, do you think healthcare is designed around the user, in my experience?

>> I know the answer. I think our audience knows the answer to that.

>> And so I think the answer to the health care system, and how we fix the health care system, is to start with the patient: how do we build an amazing experience around the patient, not around payers, not even around providers, pharmaceutical companies, or pharmacies and others, but the experience that's the right one for the patient?

>> But then what about patient safety? Because I know that's a major concern. And when we talk about AI, there's some hesitation with some health care providers, maybe, to, I guess, provide the AI at the patient level, because of those fears. So how do you mitigate that?

>> Well, patient safety is one of the most important things in medicine. Every single medicine company out there does incredible things to protect patient safety, all the way through their trials. So it should be no surprise that patient safety is a core thing that we think about when it comes to data. And so we're not going to do anything bad with data, but at the same time...

>> Says every company.

>> Says every company, but I think you can believe some companies more than others. And I would generally trust companies that are in this industry and really care about patient data. But I think, at the same time, we're missing a lot of data that AI could be taking advantage of. I've got a watch on here right now. Very few clinical trials take into account how much walking I'm actually doing, for example, or how much activity I have. Instead, they depend on patient-reported outcomes, like "How do you feel today?" and other things like that. There's so much opportunity to bring more data and more AI into our clinical trials and hopefully develop better medicines.

>> You just mentioned data, and that will be my last question, about the data that you're sharing with biotech firms. So you're opening up some of your models, right? About 17 to 19 of them. So how does that work? They get to use your models, but then you get some of their data. So it's almost like a freemium model, in a way.

>> Well, here's the interesting thing that I never thought about until I got here, which is that we've got so much data: millions of molecules that are really well described. Most of those molecules, almost all of them, don't work. But if you're going to train an AI model, you need the things that don't work. Everything that's published out there, if you go to the public data sets, is the molecules that do work. And so what you really need is a lot of stuff that isn't actually good and helpful. So we've got tons of data, but we've never figured out how to make it available to the rest of the biotech ecosystem. Now, with federated learning, the concept of sharing your data in such a way that you train a model while keeping the core data, the source data, private, you don't have to give away your trade secrets. And that's what we're doing with biotech: we're able to build a model together without exposing our data.
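Federated learning as described here works by shipping model updates, not data: each party trains locally on its private dataset, and a coordinator averages only the resulting parameters. A toy sketch of that flow; the two "sites", the objective, and all the numbers are illustrative, not Lilly's actual system:

```python
import statistics
from typing import List

def local_step(weights: List[float], private_data: List[float],
               lr: float = 0.1) -> List[float]:
    """One update computed on-site; the raw data never leaves its owner."""
    target = statistics.mean(private_data)  # toy objective: fit the local mean
    return [w - lr * (w - target) for w in weights]

def federated_average(updates: List[List[float]]) -> List[float]:
    """The coordinator only ever sees parameter updates, never source data."""
    return [sum(ws) / len(ws) for ws in zip(*updates)]

global_model = [0.0, 0.0]
site_a = local_step(global_model, [1.0, 3.0])  # e.g. one party's private assays
site_b = local_step(global_model, [5.0, 7.0])  # e.g. another's private assays
global_model = federated_average([site_a, site_b])
```

In production deployments the updates themselves can still leak information, so real systems typically layer secure aggregation or differential privacy on top of this basic averaging step.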

>> And that's actually an excellent way to end, right? All about patient safety and sharing data. Diego, thank you so much for being very honest about the timeline, too. I appreciate that. Guys, I'll send it back over to you.

>> Thanks, Christina. We'll check in with you in a bit. You know, when people talk about winning the AI race, it's not just about faster chips or bigger models. It's about scale, and deploying the American technology stack across the world.

>> That's right, Brad. From semiconductors to frameworks, from the cloud to the developers who build on it, I believe America is currently winning that race, and that dominance in AI is really fueling a new era where millions of AI agents will exist to help us in every part of business and in life.

>> So let's take a look at all the possibilities in the future.

>> There are probably going to be 10 billion digital workers. We'll have AI agents that are part of our digital workforce, working with us side by side. AI agents will be everywhere. How they run will be fundamentally different.

[Music]

>> AI is no longer a single application. AI systems now decide, design, and deliver. Across sectors, autonomous agents are transforming how work gets done, from strategy to execution.

>> To discuss how agentic AI is transforming every industry, we put together another incredible panel, starting with Aravind Srinivas, co-founder and CEO of Perplexity; Shiv, founder and CEO of Abridge; Scott Wu, founder and CEO of Cognition; and, of course, George Kurtz, the founder and CEO of CrowdStrike.

>> You know, Aravind, let's start with you. Nobody has innovated more on the chatbot, on search, and now on the browser in AI than Perplexity. You've consistently been a step ahead, although fighting maybe up a mountain against bigger incumbents. So tell us: what comes next for the agent? What do you see out there? What is hiding in plain sight?

>> Yeah, first of all, thanks for having me here. Our vision for the browser is really not to launch yet another browser. We think of Comet, our browser, as a personal assistant for all of us here, essentially a second brain to delegate all the mundane, boring work to, so it gives us a lot more time to explore and just be ourselves on the web. The internet is just a lot better if you can ask questions from wherever you are, whether you're on a web page, in a Google Doc, in your Slack workspace, or actually on an AI tool. It doesn't matter; you can just ask questions from wherever you are. That's what we learned first when we launched Comet. The number of questions a user asks on Comet is 6 to 18x more than what they ask Perplexity on other browsers, just because the AI is there with them everywhere. And they're starting to do a lot of awesome things, like setting up their own Shopify stores, setting up their own Facebook ads, listing items on Facebook Marketplace, all those sorts of things. So we're just beginning to see this explosion of people getting a lot more agency and autonomy on their own. I think we're just at the beginning.

>> We're going to come back and talk a lot more; I'm going to ask you some of the questions I asked the last panel about when I'm going to get my agent to book my hotel for me, and I know you have opinions on that. But Scott, Cognition is one of the fastest growing startups in history, building coding agents that are helping to power some of the country's largest enterprises. You know, there are a lot of people worried that the AI hype is ahead of the substance, right? You're on the front lines, selling to America's biggest enterprises a solution that's helping improve their business. So from those front lines, help us understand: where is the substance of AI coding today? How is it transforming these companies, and do you think it's going to keep up with all the excitement and hype?

>> Yeah, absolutely. I mean, I think right now, in code especially, you really feel this: you are just faster as a software engineer if you're working with the best AI tools and doing the most with them. And there's a range of productivity gains you see on different use cases. On some of the more gritty, very particular use cases, you might see speedups on the order of 20%, 30%, 50%, something like that. For a lot of what we call the engineering toil, things like migrations and replatforms and modernization, honestly, we're seeing gains in the neighborhood of 6 to 10x, where basically one hour of an engineer's time using the best tools corresponds to about 6 to 10 hours not using the tools. And so, you know, I think the gains are very clear. And the thing that's really exciting about it is that every team everywhere has so much more software to build. And that's the best part of it, right? You told me this, I think, a year or two ago: every engineering team out there has 50 projects that they want to go work on, but they have to choose four, because that's how things are with engineering, right? And so the ability to speed up and to do a lot more is really, really exciting for us all.

>> So, Shiv, I was fortunate enough to spend 10 years on the board of Austin's largest hospital, and I was the tech guy coming in, and, man, it was like time stood still. Things were slow, getting paid for things was slow. If there was a box sitting in the hallway, you would get JCAHO to come after you. I'm curious, though: with all those pressures inside of our health care systems, how is AI putting the patient at the front of the line here? Because sometimes that gets overlooked in the bureaucracy.

>> Yeah, absolutely. Well, it's been a wildly historic moment for AI and healthcare over the last few years, and part of the moment is that the problem is the pain point. Two out of five doctors don't want to be doctors in the next two to three years. There's a JAMA article that suggested that 30% of nurses don't want to be nurses in the next 12 months. So we have a public health emergency. We have to do something about it. And that's where AI comes in, because AI, in our case with Abridge, is able to unburden clinicians so that they can make eye contact, so that they can be fully present with their patients, knowing full well that a lot of the clerical work they've got to do is getting taken care of for them, so that they can just focus on the person, focus on the care that they're delivering.

>> That's incredible. George, great to see you.

>> You too.

>> Thanks for coming on. So at every major inflection point, whether it was mainframe to mini to client-server, PC, social-local-mobile, we fractalize our applications and our software. And what happens in every one of those is that security gets more difficult. How, in the age of AI, is the risk higher? And then how is AI being used to actually help us be more secure?

>> Well, when we think about technology, and this is the great part about where we are today, if you look at the slope of the technology innovation curve, security has to parallel the slope of that curve. So at every inflection point that you just mentioned, you have to have security. Thirty years ago, it was an afterthought; it was a bolt-on. Now, thankfully, it's being integrated into the stack. And I think what we've seen over time is that where technologies have seams, where you're trying to connect things together, that's where the adversary lives. So if we can build security in foundationally, we're going to be much better off. But from an AI perspective, what does it really buy you? My view is that data is the key to solving almost every security use case. The more data you have, the more use cases you can solve. And obviously, AI seems to be a good opportunity to deal with lots of that data. So from our perspective, what we've seen in the adversary universe is that the time for an adversary to find vulnerabilities, exploit them, get in, and pivot has been dramatically cut. It used to be months, then weeks, then days; now it's minutes. In one case we found that within 51 seconds an adversary had dropped onto a system and pivoted off. So the only way you're going to keep up with that is the automated SOC, the AI-native SOC, where you're driving AI agents to do the work of a security analyst who cannot keep up with the threats. And the challenge that we have right now is that AI in many senses is great, because we're able to deal with these threats, but it's also minting new adversaries, because it has democratized destruction and given a level of sophistication to a much broader group that is not as sophisticated.

>> So, George, the value chain of security threats, I'll call it, goes all the way from a $5 IoT endpoint to the hyperscaler data center and everything in between. I'm curious: is there a central place where we can secure everything, or do you need, at every step, every link, let's call it confidential computing?

>> It's not an easy problem to solve. I always jest, you know, I'm on all these panels every year, and for the last 30 years we've still been talking about bad passwords and identity; we still haven't solved it, right? I mean, that's kind of the state we're in. We're getting better. But you have to look at where compute happens. Obviously, cloud is a big element, but now with AI it's being pushed more to the edge. It was at the edge, then it was in the cloud, then it goes back and forth; now it's all over the place. So from my perspective, you've got to apply the right security technologies to each of those environments, and then you've got to connect the dots across them. There isn't one magic bullet out there. There isn't one company, there isn't one technology that can secure everything. So it's about using the right security for the right technologies at the right time.

>> Shiv, I love your story, right? A cardiologist by day, leveraging AI agents to solve healthcare problems by night. Today it's about translating those doctor-patient conversations into the health record. But when you look ahead and think agentically, where is AI going to have the most impact on your practice? What are you most excited about as the next step for Abridge, and what's happening out there?

>> Yeah, absolutely. I think a big part of our thesis is that in the next 5 to 10 years, we're not going to be able to fully automate a doctor or a nurse. And if we're not fully automating them, then the conversations they're having with their patients are really upstream of so many of the workflows that happen in healthcare. So if I see a patient in clinic without this technology, my back is turned to them. I'm not really paying attention to them. I'm typing the whole time. I'm not making eye contact. I'm not being present. With this technology, I'm fully focused on them, and my documentation is getting done for me. But also, in this country and around the world, we as clinicians are not compensated for the care that we deliver. We're compensated for the care that we document that we deliver. So essentially, these notes are bills in healthcare. And being able to generate the note the right way, in the most compliant way, one that checks off all the boxes for those billing and coding experts, as they call them, means you can keep the lights on for the health system. Now you can start to remove any amount of waste in the system that is basically taken up by inefficient offshore services.

>> Go ahead.

>> Scott, the meme out there is that developer tools are going to mean we don't need any developers anymore. And it's funny: if you go back to the old days, from machine language to COBOL, once COBOL showed up, the line was that we weren't going to need any more developers. But what happened is that every successive generation of tools, as we moved to IDEs and now to some pretty amazing products, including yours, that can do code assist, accelerated time to a good program and a lot more. I'm curious, what's the endgame here? How far can this be pushed? Is it like the iPhone meme, the one claiming no photographers would have a job anymore? I still see photographers out there. But what smartphones did do is democratize taking pictures that look pretty good. So, how far does it go in the programming space?

>> Yeah, it's a great point. And to your point, software was assembly and COBOL, and maybe a long time ago it was punch cards; obviously now we have Python, and things are running on the cloud, so there have been a lot of form-factor changes. But if you think about what software development is at the end of the day, all you're trying to do is tell your computer what to do, right? That's kind of how I would describe programming, software engineering, whatever you call it. It's telling your computer, here's exactly what I want it to do, and having it go and do that. And I think it will always be up to us as humans to decide what the computer should do. Getting to this kind of platonic ideal, where you can really just work with your computer, Jarvis-style, and tell it what to do, is where this is going. And people talk about the Jevons paradox, right? I think code is perhaps the best example of it. As you're saying, we've already made software so much more efficient over the last several decades. What we all love doing as programmers is automating processes, figuring out how to make this part faster, make this cleaner and simpler, and so on. And despite that, the number of software engineers has just gone up and up, because we have so much more demand for software.

>> As you think about it, one of the places I'm most excited is how this is going to transform the

consumer experience. I think all of us have come of age in the age of 10 blue links, right? And Google was such a breakthrough in terms of information retrieval; it transformed all of our lives. But my 17-year-old wouldn't dream of spending their time looking at 10 blue links when they could just get answers. So, where do we go? I asked the question on the last set: when am I going to be able to book my hotel in DC? Just talk to Perplexity or talk to ChatGPT and say, book it next Tuesday at the lowest price. It already knows what hotel I like to stay at. It already knows the room I want. How far are we from actions, not just answers?

>> A few months, to actually do that particular prompt you're asking for. So hopefully next time you come here, you can use our product and get it done. But here's the thing: 3 years ago, we started this transition from 10 blue links to answers when we launched, and it had a big impact, in the sense that even Google is starting to do the same thing. But the real transition, entirely from booking your hotel through a keyword on Google to just asking your agent to go do that for you, is only going to be possible with something like a browser agent. That's essentially why we wanted to build Comet. Essentially, what you're talking about is that it should have a deep understanding of you. It's your personal AI. So it should have a sense of what rooms you like, what kind of views you like, what your budget is, where you typically stay, and what the set of hotels you typically stay in is, in case it has to deal with constraints on availability. Then it has to actually go to the website of the hotel, check for those availabilities, and actually go and use your card and make the transaction on your behalf. So it's essentially advantageous for a company like us, which has access to the web. We have our own index. We have our own browsing infrastructure to be able to do all this with the help of the most capable frontier reasoning models. And as these models advance in reliability and capability every few months, it becomes possible to actually create this end-user experience. Our goal is that, even if it takes a few minutes to get done, you should be able to take your mobile app, speak out the task, forget about it, and delegate it. It runs in the background on the server asynchronously, comes back to you, elicits feedback whenever it's not sure, and actually gets stuff done. That's our vision for the mobile version of our Comet browser. It should be running in the background asynchronously, able to multitask and do hundreds of different tasks like these.

>> There's been a complaint about some browser-based agents that they're just slow, right? They go in, they kick the tires, and, you know, I needed to get groceries, and my gosh, it didn't fully understand what I wanted, and I could have done it quicker. Very related, I think, to your travel example. When will we get to the point where it's faster? Or is that not important, because it can be doing it in the background?

>> Yeah. So that's the key distinction between agents and chat. It's not necessary for agents to live in the chat UI. In fact, they should be more asynchronous, running in the background. Chat is synchronous interaction. When you give your coworker or an intern or your assistant a task, it's not like they finish it in one second. So why would you want the AI to?

>> Definitely not my interns.

>> Yeah, but the key point I'm trying to make is that it can parallelize. You can multitask. You can do hundreds of tasks. You can call five different plumbers at the same time and find the best option for you, whoever can come earliest at the best price, and get it done. It's basically humanly impossible for one person to do five calls at the same time. So that's the kind of thing we're imagining for agents. It's still not a paradigm everybody's used to. When you ping somebody on Slack and give them a task, you don't expect them to reply instantly. But on a chatbot, you want the answer fast, and you're like, "Oh, this product is pretty slow." It's slow because it physically takes time to get things done.
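The fan-out pattern described here, fire off several calls at once and then pick the best offer, can be sketched in a few lines of async code. The plumber data and the scoring rule below are invented purely for illustration:

```python
import asyncio

# Hypothetical quotes: (name, hours until arrival, price in dollars)
PLUMBERS = [
    ("Ace", 4, 180), ("Bolt", 2, 230), ("Clog-Pro", 1, 260),
    ("Drain-Rite", 6, 150), ("Eddy", 3, 190),
]

async def request_quote(name, eta_hours, price):
    """Stand-in for an agent phoning one plumber; the sleep models call time."""
    await asyncio.sleep(0.01)
    return {"name": name, "eta_hours": eta_hours, "price": price}

async def find_best_option():
    # Fan out: all five "calls" run concurrently, not one after another.
    quotes = await asyncio.gather(*(request_quote(*p) for p in PLUMBERS))
    # Example scoring rule: trade off earliest arrival against lowest price.
    return min(quotes, key=lambda q: q["eta_hours"] * 40 + q["price"])

best = asyncio.run(find_best_option())
print(best["name"])
```

With these made-up numbers the fastest plumber wins the trade-off even at the highest price; the point is that total wall-clock time is one call, not five.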

>> Great point. George, I think security is one of the most important areas in AI, partly for the reason you laid out: we've democratized the business for attackers, right? Two questions. Number one, if data really is the primitive for AI being able to help defend, does that mean the advantage goes to the security companies at the largest scale, the ones with access to the most data? I've heard your company, and I've heard Palo Alto, talk about platform systems now increasing the advantages of scale. And the second question: if you're a startup (we're an investor in XBOW, which is building an AI hacker to help companies offensively get ahead of this), do startups have the opportunity, given that scale disadvantage, to continue to be the disruptors in security that they've been in the past?

>> Yeah, two good questions. So I do think scale is incredibly important, now more than ever. When you look at competitive advantages and moats, one of them is scale: not only the amount of data you have, but the customers you can touch, right? And that's the whole platform play, as you know. From a data perspective, there's a lot of talk about who collects data, but it's how you collect it, how you curate it, how you label it, and what you do with it. It's not just a pile of data, and the context is really important. One of the things we've focused on since I started the company was never losing the context when we collect data, from the endpoint to the cloud. We've got a mini graph on the endpoint. We've got big graphs in the cloud. We never lose context. And this is the key. It's not about collecting a pile of data and putting it somewhere. It's never losing the context. That's number one. And I think that has served us well, because it allowed us to solve new use cases by creating new modules, with very little effort for each module we add on, because we've already collected the data once, and it ties into the business model. With respect to startups, I think it's one of the best times to be a startup. There are so many things you can do today in security that you couldn't do before. We had to build all this stuff, right? We were like pioneers in AWS. They didn't have all these services. We had to build the hard stuff. We couldn't use all these APIs and services that are out there now. And the startup you mentioned, XBOW, is really cool, because now they're top of the leaderboard for finding vulnerabilities with all the bug bounty programs. It's actually really cool technology. So if you're a startup in security, you carve out your niche. You have a lot of advantages that we and some of the bigger players didn't have. You have speed as your advantage, and you can connect to all these APIs that just weren't available. So if you do what you do really well, either you're going to get really big and expand horizontally, or you're going to be part of a broader company. Two good outcomes. But I think now is one of the best times to be a startup in security, because you can focus on the areas that really matter, and then companies like ours and others out there look at those and go, "Okay, we want that to be part of our platform."

>> George, I want to up-level a little. We're here in DC, obviously. Is there something Washington can do to help make our country more secure in this new age of AI?

>> There's a lot that we can do. I had some meetings already yesterday. I think there are two things. One is the technology piece, which I'll come back to. But the first is the procurement piece. In most cases, and I've been selling in Washington for the better part of 30-plus years, they're buying technology that's 5 years old, because their procurement cycles take so long. It takes forever to get through it all, right? So we've got to come up with a better way to procure. Most big enterprises that you guys deal with, you know what they do? They buy once for all their subsidiaries and their companies. With the government, it's a little piece here or there, and it's just piecemeal. So they've got to figure out procurement. Then on the technology side, they need to be forward-leaning. And I think we're in a position now, with this administration, where they are forward-leaning. They think like a business, right? Not like a government. And the key is they need to be implementing the agentic SOC. They need to be pioneering areas that haven't been done before, with companies like ours and others, to drive automation and implement technology that's future-proof for the next however many years, not technology from the last 5 years. So if you combine speed of procurement with the ability to deploy technologies, and partner in a public-private partnership, I think we can get to what I would call security AGI. That's my goal: how do we get to security AGI, and how do we create the autonomous SOC?

>> I think that's a really important point. And of course, our AI czar, David Sacks, is looking for ways he can drive further efficiencies in the administration, and I think that's a great example. Procurement's not sexy, but if we're buying technologies that are 5 years old in the age of AI, in AI dog years that's like 50 years.

>> Exactly.

>> And so that's a suggestion we'll certainly take to him. I want to come back to this question we had on the

last panel about power and the primitives and the cost of inference, right? All your companies are big consumers of inference. What do you see happening in the cost, which is effectively a cost of goods sold for many of your companies? What do you see happening on the inference side? Are we bringing down the cost fast enough? What might you be doing creatively, with clouds, with on-prem, and so on, to drop that inference cost? Or is it something you think about at all? Maybe starting with you, Aravind.

>> Yeah. So a lot of people predicted the cost would roughly halve every 3 months, or something like that. I don't know if that trend is still

continuing. Costs will continue to drop as good models continue to emerge; for example, Haiku 4.5 from Anthropic was pretty good. What we are doing is driving a lot of our own inference workloads. We work with NVIDIA on building our own inference libraries, and we use that to serve the best open-source models. We collect a lot of tokens from the several tens of millions of users we serve, and we use them to post-train our own versions of these open-source models and serve those, which helps bring down the cost a little bit. We're also really hopeful that GB200s will be much more efficient than the H200s we're currently serving on, and that's hopefully going to lead to some reduction in cost too. In addition, we're introducing a new subscription tier. We have a Perplexity Max subscription, which costs $200 a month, and we're introducing the concept of background agents there: agents that will reply on your behalf to your emails and draft your replies while you're sleeping. You can just add that agent to your email thread and have it schedule your meetings for you, or automatically tag your emails into different categories. Imagine that sort of agent booking your tickets, booking your flights, doing your restaurant reservations. I think $2,000 a year is pretty cheap for something that can do all this in parallel, at the same time, with all your personal context, right? And that's obviously going to need a really frontier reasoning model that's expensive to serve, but you're going to make way more in return, because people are going to use it to make their lives a lot better.
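The halving cadence quoted above compounds quickly. As a quick sanity check of that rule of thumb (it is a rough trend some people have observed, not a guarantee, and the $10-per-million-tokens starting price is hypothetical):

```python
def projected_cost(cost_per_million_tokens, months, halving_period=3.0):
    """Projected inference price if cost halves every `halving_period` months.
    The 3-month halving cadence is the rough rule of thumb quoted on the
    panel, not a physical law."""
    return cost_per_million_tokens * 0.5 ** (months / halving_period)

# Starting from a hypothetical $10 per million tokens:
for months in (0, 6, 12, 24):
    print(months, projected_cost(10.0, months))
# 12 months = 4 halvings, i.e. 1/16 of the original price
```

Even if the true halving period were twice as long, the curve would still imply an order-of-magnitude price drop within a few years, which is what makes expensive background agents plausible economically.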

>> Scott, maybe for you guys: what are you doing? I know a lot of the deep reasoning you're using consumes a lot of inference. What are you seeing out there, and what are you thinking about?

>> Yeah. So one big thing to call out is that agents especially are extremely compute-hungry. Maybe one way to think about it: you go to ChatGPT and you say, all right, who was the fourth president? It gives you the answer. That's one query, one answer. If you go to Devin and you say, hey, I've got this bug, can you go click through the product yourself, reproduce the bug, check the logs, see what went wrong, maybe try to make some fixes, and then go and test the code and make sure all of that works, that's hundreds or even thousands of queries coming from just one human ask. So, for better or for worse, they are extremely compute-hungry. With that said, the way I like to put it is that the productivity gains we're getting are so massive that obviously we're not going to look at the GPUs and say, "Oh, just turn off the GPUs, we're not getting enough value out of them," right? So I think a lot of what it looks like is optimizing along that spectrum. And models are getting smaller and faster and smarter all the time. But maybe one thing I would point out is that there's this curve of intelligence that always exists, where the absolute smartest model you could have is also pretty slow and bulky, then you have something that's almost as smart but much faster, and then you have a really, really fast one at the next level down. And one of the big things we have to think about with Devin, because Devin is a compound system that uses multiple models, is, at each point in time, always using the right model for the job. For the hardest step of a debugging problem, you want to put all the reasoning into it. You want to do the smartest thing. If it's just clicking around the website and doing steps one, two, three, something fast and efficient gets the job done. So it's finding that mix between them.

>> So this ensemble-model approach is a mechanism you're deploying in order to not only drive down the cost of inference, but drive down the cost of model use.

>> That's right. Yeah. So you're able to use this frontier of models, where the biggest and most expensive models are used only at the times you absolutely need them. And then for a lot of the other day-to-day tasks that don't need the maximum intelligence, you're able to run faster, cheaper, and so on.

>> There's a lot of chatter out there that Cursor may be building their own model, or attempting to. Is that something in the future for Cognition as well?

>> Yeah, so we do a lot of post-training of various models, and we produce models that are really well suited to our particular tasks. Especially when you get into the depths of software engineering: obviously a lot of the models have a lot of code data trained into them, and that's one thing, but if you said, hey, my Kubernetes pod is going wrong, could you please take a look at my logs and see what happened, that's a very specific task, and you can train a smaller and faster model to do it very well. So specialization by task is where we see a lot of this model training coming into play.

>> So, Shiv, in healthcare the expenditures are massive, but there's so much pressure. How does the cost of inference impact you and what you're doing at Abridge?

>> Yeah, it's a similar playbook for us. We're live in over 200 of the largest health systems in the country. We're touching well over 70 million patients every year. And we're going really, really deep on a very narrow use case, but it allows us to get a lot of edits from users on a daily basis. So our playbook is similar, in the sense that we might hit a model 19 or 20 times to create one set of outputs for any given doctor-patient encounter. If all of those hits went to an off-the-shelf commercial model, we'd be out of business, obviously. So much of our playbook is around distillation. It's around open models. It's around fine-tuning, and it's around post-training. And being able to even create that feedback loop in healthcare is easier said than done, because privacy and security are so important. This information is sacrosanct. So a lot of what we have to do is build pipelines that allow us to de-identify data, then certify that we've done it the right way, and then build the systems that allow our models to continually improve with every single encounter across the country.
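The de-identify-then-certify loop described here can be illustrated at toy scale. Real clinical de-identification has to cover many HIPAA identifier classes (names, dates, record numbers, locations, free-text mentions) and requires audited, certified processes; the three patterns below are a deliberately tiny, hypothetical subset:

```python
import re

# Toy patterns for a few identifier types. Real PHI removal needs far more
# coverage than this, plus human and statistical audit of the output.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "[PHONE]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
]

def deidentify(note: str) -> str:
    """Replace matched identifiers with typed placeholders before any training use."""
    for pattern, placeholder in PATTERNS:
        note = pattern.sub(placeholder, note)
    return note

def certify(note: str) -> bool:
    """The 'certify' step: confirm no known pattern still matches the scrubbed text."""
    return not any(p.search(note) for p, _ in PATTERNS)

raw = "Pt called from 555-867-5309, SSN 123-45-6789, email jdoe@example.com."
clean = deidentify(raw)
assert certify(clean)
print(clean)
```

Keeping typed placeholders like `[PHONE]` rather than deleting spans outright preserves sentence structure, which matters when the scrubbed notes are later used to fine-tune a model.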

>> Aravind, I'm going to give you just a softball question. Just kidding. There's been talk about agentic systems replacing operating systems. We've heard Meta talk about it. I've heard Qualcomm talk about it, and anybody with an XR headset as well. What are the chances that this could become a reality?

>> Could you repeat that? Sorry.

>> Yeah. Essentially, instead of having a full-blown operating system like we have with Windows or iOS, it's an agentic system replacing it.

>> Yeah. So, the way we think about it, one of the reasons we decided to work on a browser is that there's no other way to ship a personal agent on the phone, because there are only two operating systems you can have on your phone: iOS or Android. And while Android might appear to be an open operating system, technically it's not, because what actually gets shipped on the device is controlled by Google. So the browser lets you access third-party apps without actually having to do that at the OS level on the phone. We think cracking that problem is way more important than going out and building new hardware, because your new hardware still has to connect to your phone via Bluetooth, and you're still constrained by all the permissions that the OS on your phone allows.

>> I appreciate that. Jony Ive is working on that; he's at OpenAI right now. It's not necessarily a device like this, but who knows? Is it a headset? Is it a pair of goggles?

>> By the way, I'm really bullish on other hardware. The glasses are a really amazing form factor for visually seeing things and asking questions based on what you see. That totally breaks the interaction mode of just asking things through a stream of text. At the same time, I feel like access to the web, access to browsing, access to tools: these are not going away. So you've got to work on problems that are hardware-agnostic as a software company.

>> Makes sense. Browser-based.

>> Well, I would say we could sit here and spend another 30 minutes. There was some big news this morning out of OpenAI and Microsoft on what they're going to do in healthcare. So I know you guys need to get off set and pay attention to the stuff going on in the world, but it's been great to have you here. Thanks for spending time with us, and let's continue the conversation on the floor. In the meantime, let's go back to CNBC's Kristina Partsinevelos on the floor with another amazing guest.

>> Yes, one amazing guest: Michael Intrator, the face, founder, and chairman of CoreWeave, one of the three founders, I should say. Mike, we've been talking for quite a bit, and there's so much I want to share with our audience. But could you speak to just the relationship with the government? We're talking about these buildouts here, and you guys are clearly building out a lot of data centers. How is that relationship changing, even now and in the near term?

>> Yeah. So CoreWeave has been building the physical infrastructure, as well as the software stack, for the most demanding, most sophisticated consumers of accelerated compute in the world, and we are relied upon by the labs to deliver that infrastructure. And we're incredibly excited to be able to deliver that infrastructure to the government: the federal government, the state governments, the local governments that are going to need access to this infrastructure as they continue to integrate AI into every aspect of society, through defense and through security. And we just really think that a purpose-built stack like the one we have is going to be incredibly productive for the government, and we're excited to build that business.

>> Do you feel like you need more support, more help? We were talking about public-private partnerships. Getting the picks and shovels in the ground has really been an issue. Power has also been an issue. You are facing all of those issues. So how do you mitigate that when you're promising to build out 2 gigawatts here at Project Horizon in West Texas? How do you deal with that?

>> The challenges of building this infrastructure at this size and scale and speed are material. The government's role, for us, is to help facilitate the permitting process. Their role is to understand that this infrastructure is of national importance, a matter of national security and national priority, and to support us as we continue to do that, whether through public-private partnerships or through tax abatements. We need to build this stuff, and we need to be able to tap into the capital markets to raise the money to build it at a speed that has really never been contemplated before. So it's a really important role that government plays in coming in and supporting CoreWeave, and supporting the other companies that are driving the infrastructure that the country, and really the world, needs to allow artificial intelligence to achieve its potential.

>> You actually just mentioned tapping capital, and that seems to be a constant theme: how levered CoreWeave is. And you just recently announced this partnership with a Paris AI startup backed by NVIDIA, Poolside, and the goal is to provide 40,000 GPUs. How are you going to build that out? Because that requires a lot of upfront capex, a lot of spending.

>> We've been the tip of the spear, in many ways, in building debt structures that allow us to go out, find a client, structure payment from the client, go out into the debt markets, buy the infrastructure, build it, and ensure that our repayment of the loans occurs within the four walls of the contract. During the 5 or 7 years that we contract for, we're able to pay for the infrastructure, pay for the opex, drive a return to our company, and reinvest in the R&D associated with building the next generation of technology, and it's been incredibly accretive to our company. We've been pushing this debt structure out into the market because we think this is the way you build large-scale infrastructure over time. We're excited about it. Poolside is a great client. We're really excited about them as an offtake for our compute, and we're really excited about them as a partner as we continue to build physical infrastructure and data centers with them.
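The structure described here, repayment "within the four walls of the contract," reduces to a simple constraint: contracted revenue over the term must cover the capex, the opex, and a return. A toy check with entirely invented numbers:

```python
def contract_margin(annual_revenue, term_years, capex, annual_opex):
    """Margin left after a contract pays back the buildout within its term.
    All figures are hypothetical; real deals also layer in debt service,
    hardware residual value, and renewal assumptions."""
    total_revenue = annual_revenue * term_years
    total_cost = capex + annual_opex * term_years
    return total_revenue - total_cost

# Invented example: a $400M/yr, 5-year contract against $1.5B of capex
# and $50M/yr of opex.
margin = contract_margin(400e6, 5, 1.5e9, 50e6)
print(margin)  # positive, so the contract repays the buildout in-term
```

A positive margin within the contract term is what lets the debt be raised against the contract itself rather than against speculative future demand.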

>> But building over time, you're signing leases for some of these data centers for 15 years, and there are concerns that the customer contracts are much shorter. So that leaves a gap. How are you dealing with that gap?

>> Yeah, you're correct, right? Like, there is a timing mismatch between the current contracts. But the truth of the matter is, the world is going to continue to need compute. They're going to need the data center infrastructure. And as a company that has been built on our ability to quickly respond to the demands of the computing infrastructure globally, we need to have access to those data centers to be able to serve the needs of our clients. We will continue to do that. We will continue to build the infrastructure. We're really excited about it. We think it's an appropriate risk for us to have.

>> And I guess we should just end on that because it was a nice positive note, Mike.

>> Seven o'clock.

>> Okay. Thanks, Christina. AI isn't just transforming tech. It's becoming a new driver of the American economy. From manufacturing to medicine, it's rebuilding the industrial and scientific foundations that are driving new productivity and new jobs. Oh, it's me. Nowhere is that impact clearer than in the rise of data centers. They are the new factories of the AI age. Here's how this wave of innovation is fueling record growth across the US.

The world is going through a transition in not just the amount of data centers that will be built, but also how they're built. In just one year, you could see the incredible growth in AI infrastructure.

The computer has become a generator of tokens.

[Music]

I've said before that I expect data center buildout to reach a trillion dollars, and I am fairly certain we're going to reach that very soon.

[Music]

America's AI leadership depends on more than algorithms and chips, although those are awesome. It runs on infrastructure. Data centers, power grids, and supply chains are becoming the backbone of intelligence, the foundation of a national strategy for innovation.

Even when we assume we need, let's say, a thousand times more, and there are forecasts that go up to a million times more, the buildout is tremendous. And here to talk about building that foundation is Gio Albertazzi, CEO of Vertiv; Olivier Blum, CEO of Schneider Electric; Krishna Jonagato, CTO of GE Vernova; and Chase Lochmiller, co-founder and CEO of Crusoe.

>> All right, Gio, let's start off with you. And let's pretend for a second that people don't know how big of a challenge we're looking at right now. Can you talk about the scope of the power and the cooling that we're going to need by, let's say, 2030?

>> Well, think in terms of all the announcements that we have heard about investment in data centers, certainly in the US but also globally. We have an imagination of how much IT power is behind that. For every kilowatt, megawatt, gigawatt, there is an equivalent amount of power and thermal infrastructure. And that power and thermal infrastructure needs building at scale. Now, that has traditionally been a very construction- and labor-intensive industry, and we're all thinking about how we can change that and scale it to absolutely unprecedented industrial proportions. Because without that, and I believe everyone on this panel and beyond is working on this, without that we will never be able to scale AI at the speed that we have seen in the video and that you have explained to us. So I think we are at a very important inflection moment at this juncture.

>> So, Olivier, Schneider has been a hallmark inside of data centers. I get a lot of data center tours, and I see your logo on a lot of the boxes in there. One question I have for you: sure, it's build more and more and more, but this is also an efficiency play. Can you talk me through the balance of efficiency and just more power?

>> Yeah, you're absolutely right. You know what is very interesting? We are at a time where AI depends on compute, and compute depends on energy. But the very interesting part is that energy availability and efficiency depend on AI. Because, you know, at Schneider we have been an advocate of energy efficiency for many, many years, and we strongly believe that the combination of electrification, automation, and digitalization will solve the energy transition in every part of the world. Now, 10 years ago, frankly speaking, it was not possible. Electrification and automation were available, but the type of technology you have now through AI helps you make energy more efficient. So on our side, of course, we are excited, because all together we are providing the infrastructure to make the compute, and therefore AI, available. But we are even more excited to leverage AI to make our overall industry more efficient. And if you think it through, we have all been through the industrial and digital revolutions, and what we are doing, it was in your introduction, we are talking more and more about the AI factory, and we are implementing exactly the same principles: going through the full life cycle of the AI factory, starting from the design, building a digital twin where you can simulate power and cooling efficiency, to the development, the construction, the maintenance, the operation. This whole site life cycle is completely powered by AI.

>> Krishna, I think that, you know, Vernova's gas turbines are really a secret weapon in America's AI buildout, but you're sold out. Talk to us a little bit about how you're changing your own business processes to drive more production and get more online, and how you see that playing out over the course of the next few years.

>> Sure, Brad. So, like you said, the demand is massive. We're sold out through '28, maybe through '29, but we're investing heavily in capacity expansion of gas. We're quadrupling our capacity, the number of gas turbines delivered, by 2028 compared to 2020. And all types of power are an answer to this; it's not just gas. We are fortunate to have basically all kinds of power available to us: gas, nuclear, solar, wind, and hydrogen. So we are looking to leverage every type of power source that's available to grow the electrical infrastructure. Power generation is a part of it. A big part of it, which we'll talk about later, is the grid. Once you actually generate these electrons, how do you get them from source to use in a very complex environment with new grid resources and so on? But power generation, and the grid, is definitely a big bottleneck right now, and we're working on easing that.

>> Help us understand how your power generation capacity evolves from 2025 to, let's call it, 2029.

>> So if you look at the demand side, for about 20 years the total power demand in the US has been flat, no growth, 0.3% growth, but in the next 20 years it's expected to grow 50%, and about a third of that is going to come from data centers. To meet that demand, you essentially need to leverage all kinds of power that's available. Like I said, it's gas, it's nuclear. We have an SMR that's coming online in 2029 in Ontario. Our first SMR application at TVA went in this year. So there is significant growth in nuclear that's coming up. From a gas perspective, depending on the gas turbine, sometimes heavy duty is the answer, sometimes it's your aeroderivatives, sometimes it's different types of turbines, and we're also future-proofing some of these gas turbines so that they can run on hydrogen in the future. Gas is a fantastic bridge right now: electrify now, decarbonize later. So I think that's something we can do as well. And we're also looking at wind and solar. All of them belong in the energy equation, and that's what we're working on. It's what the market needs. Energy abundance is what we're chasing here, and energy abundance is really an all-of-the-above kind of answer right now.
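As a back-of-envelope check on those demand figures, 50% growth over 20 years versus the historical roughly 0.3% per year, the implied compound annual growth rate can be computed directly. The one-third split attributed to data centers follows the remarks; all inputs come from the talk:

```python
# Back-of-envelope on the demand numbers above: ~0.3%/yr historical US load
# growth vs. an expected 50% increase over the next 20 years, with roughly a
# third of the new demand coming from data centers (figures from the remarks).

historical_growth = 0.003     # ~0.3% per year
total_future_growth = 0.50    # +50% over the horizon
years = 20

# Implied compound annual growth rate going forward: (1.5)^(1/20) - 1
future_cagr = (1 + total_future_growth) ** (1 / years) - 1

# Share of today's load added by data centers (about a third of the increase)
dc_added_share = total_future_growth / 3

print(f"historical growth: {historical_growth:.1%}/yr")
print(f"implied future CAGR: {future_cagr:.2%}/yr")
print(f"data centers add ~{dc_added_share:.0%} of today's load over the horizon")
```

The step from ~0.3%/yr to roughly 2%/yr compounding is what makes this, as the panel notes, a change of regime for grid planning rather than incremental growth.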

>> You know, Chase, you guys have been at the forefront of what Jensen talks about: extreme co-design. Right? Moore's law is dead; the data center is now the unit of compute. This is about power in and tokens out. Crusoe is building some of the most capable data centers in the world, in Abilene and other places, in partnership with NVIDIA. Talk to us a little bit about what you see happening in that world of extreme co-design, and why you've been so successful, along with companies like CoreWeave, in pushing the frontier on how NVIDIA brings its chips to market.

>> Absolutely. So, you know, Crusoe is an AI factory business. And the thing about AI factories is this incredible composition of all of the greatest industries humanity has ever produced, from everything the gentlemen next to me are working on, from a cooling, a power generation, a storage perspective, all the way to everything that's been done in the silicon space and, most importantly, our partnership with NVIDIA. This process of taking electrons and turning them into tokens is just one of humanity's greatest opportunities and challenges over the next decade. And we see it as a very complex challenge in terms of the sheer scope and magnitude of these investments, and it's requiring an incredible amount of resources across the entire economy.

>> So tell me a little bit, and again, I'm a little out of my depth here, but the 800-volt data center architecture that I know NVIDIA has been pushing is just one example of this extreme co-design. What are the things you see happening and coming down the pipeline that are going to drive that equation in terms of power in and

tokens out?

>> Yeah, so I think the 800-volt rack design is a great example of thinking things through on a first-principles basis: okay, data centers are sort of built in this way, and that's how we built them 25 years ago. Is that really how we should be doing it today, at the scale of gigawatts? Maybe not. When we think about this data-center-scale computer, it's not just a single server, it's not just a single rack. It's really the entire system working together as one cohesive unit that's thinking as one giant brain. And how do you think about that from a networking perspective? From a cooling perspective? From an entire-system perspective? And on the power side, you get massive efficiencies from doing things like changing from 48 volts to 800 volts. That's a huge efficiency gain that all of us would benefit from. So it's just one example of the cohesive, collective, industry-wide partnership that's required to bring this infrastructure to life.

>> If I can reinforce, 100%. You know, this 800 volt is an example of how you

can make the whole system not only more energy efficient, but really scoop up the power that is available, because the system can be redesigned from scratch. Again, you were talking about the system: think system, not just components. We've come from decades of data centers being built a component at a time. I was talking about an inflection point in the way we think about data centers. This is a fantastic example, and there are many examples. Again, we have to think about the power train, the thermal chain, the whole prefabrication, as ways to industrialize the way we build data centers like we've never done before. So I believe the next 10 years, the next five years, three years, will be very different in that respect than they have been so far. We've seen some pictures of what we call Vertiv OneCore, which is exactly that type of modularization: system-level, all parts integrally designed to work together. And that is perfectly applicable to the 800-volt power train. So things are changing fast.
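The efficiency argument for the 800-volt transition comes down to Ohm's law: for the same delivered power, raising the distribution voltage lowers the current proportionally, and resistive loss falls with the square of the current. A minimal sketch, with assumed (not vendor) numbers for rack power and bus resistance:

```python
# Sketch: why higher distribution voltage cuts resistive ("I^2 R") losses.
# Rack power and busbar resistance are illustrative assumptions, not
# measurements from any vendor's design.

def i2r_loss_watts(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Resistive loss in a DC bus delivering `power_w` at `voltage_v`."""
    current_a = power_w / voltage_v         # I = P / V
    return current_a ** 2 * resistance_ohm  # P_loss = I^2 * R

rack_power = 100_000  # 100 kW rack (assumed)
bus_r = 0.001         # 1 milliohm of busbar/cable resistance (assumed)

loss_48v = i2r_loss_watts(rack_power, 48, bus_r)
loss_800v = i2r_loss_watts(rack_power, 800, bus_r)

print(f"48 V loss:  {loss_48v:,.0f} W")
print(f"800 V loss: {loss_800v:,.0f} W")
print(f"reduction:  {loss_48v / loss_800v:.0f}x")  # (800/48)^2, about 278x
```

Whatever the actual resistances in a production design, the quadratic scaling is why the same redesign also lets thinner conductors carry far more power, the "scoop up the power that is available" point above.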

>> So, China is building faster, and they're building bigger. You had mentioned the 10x on nukes, versus our zero, or one if you count Georgia. And we're behind. I mean, I worked with China starting in 1995, on buying sheet metal and plastic injection molding, and I said, hey, send me a picture of the site, and the picture had no roads, had no electricity. And I said, well, how are we going to be up and running in 60 days? They said, watch. And it happened. China was willing to do whatever it took to build infrastructure, and here we are 30 years later, and it has continued. What is it that we could do to go faster? I think we were both at the Winning the AI Race event here in DC. Chase, you were here as well, on stage. What can Washington do to get this moving faster? I'll start with you.

>> Well, look, when you see what has happened in the past months, AI has been big on the agenda for the government, and energy has been big on the agenda, and the two are really interconnected. And it's a bit what my colleagues have said: it will take an ecosystem at the end of the day. You might not be aware, but whether it's NVIDIA or the four of us, you have more and more an ecosystem of people working together to support the government to make it happen. We are working very closely also with all the hyperscalers. And what is very important at the end of the day, and Gio talked about it, is that we are living through a huge revolution in our sector. You know, I've been 32 years at Schneider Electric, and I think more has happened in the past two years than in the past 32. So you have this ecosystem working together, working closely with the government. I think we are having the good discussion, the real discussion, on what it takes to make it happen, and we are moving very, very fast. The US, by the way, when it comes to AI and data centers, is still by far number one in the world today.

>> Yeah. President Trump signed an executive order basically aimed at removing some of the barriers to that. Are you currently experiencing that?

>> Yep. Yeah. Well, we're working on all of that, for sure.

>> Good.

>> I mean, you nailed it. Secretary Wright and Secretary Burgum, every time I've seen them or heard them when we're in DC, they're actually asking us, "What is your recommendation? How do we go faster? What do we need to do?" I think it's a complete change in mindset, because now it's been elevated to a national security issue.

>> That's right. This is no longer just about economic security; it's about national security, and consequently we're moving faster than ever. Which brings me to this question: where can startups help? We're sitting here with some of the largest industrial companies in the ecosystem. What, if anything, do you see happening out there in the startup ecosystem, in Silicon Valley, in power generation? Obviously Crusoe, and the work of CoreWeave and others like Nebius in building out these new data centers, has been critical, but is there anything else you see as interesting that people in the crowd might want to pay attention to?

>> Absolutely. I think startups are the lifeblood of our innovation ecosystem. They're here. And when you look at what we're doing: GE Vernova is 18 months old, spun off as an independent company, and our

CEO intentionally headquartered it in Cambridge, Massachusetts, because of the innovation ecosystem with MIT and the universities and everything surrounding them, and we are investors in a lot of these startups. So let's go to the grid, for example. Our grid was designed 50 years ago for relatively stable spinning sources of power and relatively stable loads. It's a completely different world right now with AI workloads: hundreds of megawatts in milliseconds. And then on the source side, you have wind and solar, where these resources are changing constantly; they're fluctuating. So having a grid that can actually deal with this is really, really important. We are working on it, all our competitors are working on it, but the startups have a really important role to play there. There's the hardware aspect of it and also the software aspect. When it comes to hardware, we have power electronics, innovative solutions. I've seen companies that are cooling the transmission cables, just fascinating stuff, so that you can increase the capacity to transmit power. And how do we leverage power electronics, you know, STATCOMs, FACTS solutions, to make sure we can use the infrastructure more? Startups are playing a huge role out there. They're also playing a massive role in the software ecosystem: how do we predict what's going to happen, how do we make sure the right resources are there at the right time and the right place, and how do we manage in real time, from millisecond scale to minute and hour scale? How do we dynamically manage the grid across the country? These are all big, challenging problems we're working on, but a lot of the innovation in that space is coming from startups. I don't want to name a specific startup, but we're seeing lots of innovation coming from there. So I'm really excited to work with the startups and see how we can accelerate this transition.
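The fast swings Krishna describes, hundreds of megawatts appearing and disappearing in milliseconds, are at heart a buffering problem: a battery or capacitor bank can serve the deviation from the average draw so the grid sees a nearly flat load. A toy sketch with assumed numbers; the roughly 6 Hz square-wave swing follows the figure cited later in the panel for synchronized training:

```python
# Toy sketch of battery buffering for synchronized AI-cluster load swings:
# the cluster alternates between a compute phase and a communication phase
# several times per second; a battery serves the deviation from the average
# so the grid sees a near-constant draw. All waveform numbers are assumed.

peak_mw, trough_mw = 900.0, 500.0  # assumed swing of a gigawatt-class cluster
freq_hz = 6.0                      # oscillation rate cited in the panel
dt = 0.001                         # 1 ms simulation step
steps = 1000                       # simulate one second

# Square-wave load profile: high during compute, low during communication
load = [peak_mw if (i * dt * freq_hz) % 1.0 < 0.5 else trough_mw
        for i in range(steps)]

avg = sum(load) / len(load)
battery = [p - avg for p in load]              # + discharging, - charging
grid = [p - b for p, b in zip(load, battery)]  # what the grid actually sees

# Energy the battery must cycle in each half-period of the swing
half_period_s = 1.0 / (2.0 * freq_hz)
kwh_per_swing = (peak_mw - avg) * 1000 * half_period_s / 3600

print(f"raw load swing: {min(load):.0f}-{max(load):.0f} MW at {freq_hz:.0f} Hz")
print(f"grid-side draw: {min(grid):.0f}-{max(grid):.0f} MW")
print(f"energy buffered per half-cycle: ~{kwh_per_swing:.1f} kWh")
```

The notable result is how little energy the buffer needs per cycle at these frequencies, only a few kWh for a 400 MW swing, which is why power-electronics-fronted storage (rather than bulk storage) is the tool the panelists keep returning to.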

>> Maybe the same question for the two of you.

>> Yeah, maybe one point, if I come back to what I said before: AI depends on compute, compute depends on energy, and actually energy depends on energy intelligence. What I mean by that: at the end of the day, it's about data. Even if Schneider Electric is historically more of a hardware company, today we are more of a digital company. And back to your question about what we love to do with startups: the more startups come in on this data intelligence, looking at the different assets you can have in the factory, because it's a very complex one, you will have switchgear, you will have racks, and we are doing a lot of R&D, but the more they take on those use cases of how you can make it more efficient, how you capture the data. We are helping to structure the data, we are helping to create an ecosystem, but our customers are asking for even more. So the more startups come into our own ecosystem, which is completely open, by the way, not protected, the more we deliver solutions and the more we can accelerate energy intelligence in our industry. So we love having those startups take a strong interest. We need more and more smart people and capital coming from all the startups.

>> Yeah, absolutely, 100%. And if I look at it from the infrastructure technology side: data center infrastructure technology was static for many years, and then it started to accelerate like crazy in the last three to five years, and it is going to accelerate more. When you have this kind of acceleration, for players like us, scale is absolutely essential, but you cannot just scale in this market. You have to be an innovator and scale. Innovation happens organically. Innovation happens inorganically. There needs to be a base of startups, inventors, creative minds, engineers out there who define what the new technology will be, because new technology cannot be nurtured just within the Vernovas, Schneiders, and Vertivs. It needs to spring also from, let's say, the entrepreneurial fabric of a country. It's the two things combined, with scale, and that's why we see very auspicious times right now.

>> Chase, where are the most exciting areas of innovation you see? When you look ahead two or three years, what makes you say, "I can't wait to bring this into the AI factory"?

>> Yeah, you know, just adding to this sense of urgency, I think there's a huge opportunity for startups here: startups have the benefit of being able to move very, very quickly, and we're at a moment of incredible urgency across the world in terms of bringing this infrastructure to life. To your question about the things we're really excited about: it's unbelievable how much change is happening in the data center over such a short period of time, from the cooling architectures to the power densities of the systems. If you look 20 years ago, a single rack in a data center might have been 2 to 4 kilowatts. Today, with the GB200s, you're looking at 130-ish, 140 kilowatts, depending on peak demand. And then you play it forward and look at Vera Rubin, and you look at Feynman: these are one-megawatt racks, right? That's a thousand homes' worth of power in a single rack in a data center. That requires tremendous innovation across the cooling systems and the network systems, in terms of cabling these things together, where we're seeing a lot of tremendous innovation. And I think there's a lot to be said about overall memory optimization in these systems. One of the things Crusoe is very focused on is not just the hardware aspect of bringing these AI factories to life, but the software aspects: how do you actually operate the AI factory to produce intelligence very efficiently? That requires getting data and moving it from things like your object storage platform directly into HBM, and doing that ultra-efficiently. We're really excited about some of the innovations taking place across the networking stack, and the overall software innovations that are bringing data to compute way more

efficiently, so we can actually utilize the GPUs and get the most intelligence per unit of investment in compute.

>> You know, it's maybe appropriate, because we're here at GTC: NVIDIA has been an active investor across the space. They're forward-engineering, working with partners like you guys, to help drive this extreme co-design in the data center, to get us the benefits of Moore's law, now that Moore's law is dead, or even more. Can each of you give us an example of some of the things you're doing with NVIDIA in particular that are helping to drive this flywheel of innovation, and how this unique partnership might be accelerating the path that we're on?

>> Well, we can go first. We were talking about the 800-volt DC as an example, and that's an area where we are actively working together. How will we make that technology available? As infrastructure providers, we have to make the infrastructure ready ahead of the GPUs, the silicon, that will land, because that infrastructure needs to be in place. So that's one area. Another area of great importance, and still continuing to evolve, is everything around liquid cooling. Let's not forget the transition that has occurred, and is still occurring, in the industry from air cooling to liquid cooling; it's a colossal change, if you will. There's still air; it's a mix, it's a hybrid, but more and more loads are liquid-cooled. All these things require technology partners who are ahead of the pack, so that the industry can follow. But we have to pave those ways together.

>> Yeah, we have a ton of deep partnerships with NVIDIA across the stack, and it's amazing that this company spans so many aspects of the economy, from power generation to the most cutting-edge artificial intelligence development. We're working with NVIDIA very tightly on the AI factory reference design: how do you actually build these gigawatt-scale data centers, and do it efficiently from a power generation, cooling, and power perspective? That's one big aspect we're investing a lot of time in with NVIDIA. Another is the software layer: working with them on things like Lepton, which is basically a distribution for getting compute into people's hands so they can run their workloads very efficiently across various NeoCloud partners like ourselves, as well as things like Dynamo, which is basically about accelerating inference, getting more utilization, and a lot of these memory optimizations that Crusoe is focused on pushing and innovating on, so that people actually get more tokens per unit of compute from their systems.

>> Let me just interrupt by asking: do you see any other chip company out-innovating?

Right. There are a lot of other chip companies, and you guys work with other ones, but NVIDIA seems to be playing a really unique role in pushing the ecosystem forward. Maybe a little commentary on how it's similar or different from others.

>> Yeah, the thing I would say about NVIDIA, that they've done so incredibly well, and kudos to Jensen, is the ecosystem itself, right? I forget the exact headcount, it's 35,000 people-ish, something like that. The amount of leverage they're able to get from those 35,000 people, and then the ecosystem of builders, is absolutely incredible. And that's kudos to Jensen, in terms of building out all these different work streams and workflows, across energy, data center designs they've released, that have helped unlock this collective intelligence, from society and the startup ecosystem to big enterprises.

>> Just to build on that a little bit: by

the time you're done with all four of us, maybe you'll get a view of the scope and breadth of the ecosystem that NVIDIA is playing in. We talked about software optimization and AI, we talked about cooling, we talked about the 800-volt rack. We are more on the power generation and grid side. So we just released a white paper with NVIDIA, and we're working with NVIDIA on reference architectures for power generation inside the data center, and how you manage it. We just talked about the grid a few minutes ago, how complex it is, the different sources of power, load balancing, and so on. Now, the power is sometimes not available at the grid today when folks like you are building the data centers. So there's behind-the-meter power, there's bridging power, but when you do that, you're dealing with all the complexities of the grid inside the data center. So how do you design and architect it so that you can actually manage the power flow, with battery energy storage, with power electronics, FACTS, all kinds of different systems, working with people in the medium-voltage and low-voltage space, which my colleagues here are doing? So it's an ecosystem, all the way from power generation to memory to algorithms to cooling, and we're all working with NVIDIA on the entire thing. To me, it's amazing to see how they're curating the entire ecosystem and moving us all forward.

>> And maybe we'll end with Schneider's commentary on this. I don't see anything else out there that has this level of systems thinking, right? The breadth is extraordinary. That gives me a lot of encouragement about the gains yet to come in terms of America's re-industrialization around power and AI. But how is Schneider working with NVIDIA to drive this forward?

>> Well, look, a lot has been said, but for me, what is very unique in the vision

of Jensen is, first of all, to work with partners. What we love with the NVIDIA team is that they share a lot with us: they speak about the Omniverse, they speak about the next generation of chips, the next generation of GPUs. And when you can work as a technology partner up front, and in a very open manner, that changes the game. What I would say also: Jensen has this famous line about the speed of light, which is forcing the entire industry, in a very positive way, to work at a very different speed, and for us at Schneider, I can tell you it has changed a lot in the way we are managing the company. Gio mentioned the shift from air cooling to liquid cooling. We knew it would come; the question was when, and we discovered two to three years ago that it could come much faster than expected. If you ask me today what it changes: we have to completely rethink the way we design those data centers, and I like what Chase mentioned: he talked about design, maintenance, operation. You have to completely rethink the way you build those AI factories, from the design stage to the build stage to the operate and maintenance stages. So it's about building a digital twin, and what Jensen and the NVIDIA team are forcing us to do is to think about those next generations. And guess what: software is playing a big role, and software is powered by AI. So again, it's a loop on which we are all working together.

>> One

thing I want to touch on uh follow up there just in terms of uh the systemwide innovation uh needed to invent solutions to support this infrastructure at scale.

One of the conversations I was having backstage with Olivia is around this notion of how the utilization of the data-center-scale computer impacts power swings. When you have a gigawatt-scale data center that's operating as a single brain, you have this problem where it's pushing data to the GPUs, all the GPUs are running their backpropagation or their compute, their matrix multiplications, and then they're publishing the results to every other GPU on the cluster. And when you have that, you actually get these massive power swings, at something like six hertz, somewhere in that range, these massive load oscillations, and that creates a lot of problems for utilities and for on-site power generation. GE Vernova doesn't like it when we have these massive swings in the actual power utilization of the turbines. So it's actually created this system-wide partnership where we need to work with our on-site generation, we need to work with the utilities, and most importantly we need to work with folks like Schneider to innovate solutions where we can create a buffer through battery systems that absorbs a lot of the load oscillation we're seeing from these large-scale training workloads and system-wide workloads running on these massive-scale computers. I mean, one of the things that blows me away: if we think about the Manhattan Project, that cost about $4 billion over four years, right? Inflation-adjusted, that's about $40 billion. As a percentage of GDP, it was 1% of GDP, so in today's terms let's call it $400 billion. Jensen has said we're going to spend $4 trillion over the next five years. So private industry is going to spend 10x what we spent on the Manhattan Project.
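The speaker's back-of-the-envelope math can be checked in a few lines; the inflation multiplier and present-day GDP below are rough assumptions used only to reproduce the on-stage figures:

```python
# Sanity check of the Manhattan Project comparison made on stage.
# The inflation multiplier and GDP figure are rough assumptions.

manhattan_nominal = 4e9              # ~$4B spent over four years (1940s dollars)
inflation_multiplier = 10            # rough 1940s-to-today adjustment
manhattan_inflation_adj = manhattan_nominal * inflation_multiplier   # ~$40B

share_of_gdp = 0.01                  # the project was ~1% of GDP at the time
us_gdp_today = 30e12                 # assume roughly $30T present-day US GDP
manhattan_gdp_scaled = share_of_gdp * us_gdp_today                   # ~$300B

ai_capex = 4e12                      # the $4T figure cited for five years
multiple = ai_capex / 400e9          # versus the "call it $400B" benchmark

print(f"inflation-adjusted cost: ${manhattan_inflation_adj / 1e9:.0f}B")
print(f"GDP-scaled cost:         ${manhattan_gdp_scaled / 1e9:.0f}B")
print(f"AI capex multiple:       {multiple:.0f}x")
```

Both framings land in the same ballpark, which is why the "10x the Manhattan Project" line holds up regardless of which adjustment you prefer.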

And just listening to the decentralized coordination, where Nvidia plays this role in pulling everybody together and pushing everybody forward, it reminds me: the speed of that is very commensurate with what we think of as the speed of the Manhattan Project. I don't see it happening anywhere else in the world. I think that all of your companies are critical to the success of the American AI race. And it's great to have Nvidia helping push it all forward. I know we've got to jump, so I'm going to throw it to you.

>> No, that was a great ending. Gentlemen,

I just want to thank you so much for coming up here and I hope everybody out there listening has a better understanding of the challenge that is ahead of us and also Washington can

hopefully clear as many roadblocks as possible to let all of you cook.

Thank you very much.

>> Thank you.

>> So, let's go back down to CNBC's Christina Partsinevelos on the floor.

>> Thanks, guys. You're right. This is all about the buildout, the $4 trillion that Jensen has spoken about, and we're hoping that we will get more details about just that evolution. The crowd here is quite impressive. Many of these guys are just waiting in line to take photos. I didn't expect people to actually wait to get into the building right before 8:30, but they were here. And like all GTC events, considering this is number four this year, they've got a merch truck. Which I'm curious about: is it expensive, sir? Do you think it's expensive?

>> No, it's just the perfect price.

>> What are you buying?

>> Uh, a lot of gear for my kids and a little bit for myself. Some, you know, shirts.

>> Do your kids understand what Nvidia does?

>> Not yet, but they will.

>> They will. They will. Thanks, sir. You know, it's kind of funny. We have a lot of tech people here, policy leaders, you know, I saw members of the military as well, but one of the perks for a lot of investors here is just the stock climb for Nvidia.

I, no joke, last night a man at the Marriott, I was just getting some tea and cookies, and he decided to show me his portfolio and how much he's made on Nvidia shares, which by the way have climbed 1,400% just in the last year. Market cap right now is about $4.6 trillion. Pat and I were talking backstage, you know, that $5 trillion is just around the corner. We'll see about $6 trillion. But back to why we're here today: a GTC event in DC, the fourth one. Here's the, I guess, black carpet that we're going to roll up on. Oh, don't worry, you can be on camera, too. A lot of people are expecting more news about the quantum collaborations with Nvidia alongside public partnerships. We spoke to CoreWeave's CEO, too, about the depreciation cycle that often comes to mind for a lot of investors, but he's saying that they've been sold out of A100s, H100s, H200s, everything. So maybe this narrative doesn't hold true. We'll let the audience be the judge of that.

And speaking of the audience, look at these guys. You've been sitting here since what, 9:00 a.m.? All of you. And you didn't have to do anything special to get these seats, right? You just sign up and run. That's actually a great answer. The crowd is packed, which is to be expected, right, from GTC. In San Jose you could barely get a cab; I had to jump out of my cab and walk because it was so crowded with people. You, ma'am, what brings you here today?

>> I'm just here listening in to all the NVIDIA stuff. I work for the Department of Defense, so I'm just listening in to, you know, what innovation is coming from all the different industry partners.

>> And anything that stood out to you thus far from any of the panels?

>> Absolutely. Everything from what I heard from Ms. Martin on the guardrailing of AI and the policy type of stuff, to obviously infrastructure and how industry and government need to work together to enable a lot of that. So, absolutely.

>> Thank you. Thank you very much. I put you on the spot; we didn't test this at all. One of the interesting things that we look at from, I guess, an investor perspective is the return on Nvidia shares, because leading into GTC, and often after, let's say, the big GTC, the developer one in San Jose, there tends to be a nice little stock pop. On Friday, I was checking and there was quite a bit of options activity, which would, you know, lead some people to believe that we expect some big news today, and that could move markets. That's why we're here. And I do think the significance of being in DC cannot be overlooked, right? A lot of this is about changing the dynamics here, you know, really building out power infrastructure. We know that that's going to be a major roadblock here in the United States. And I'm trying to show you around, chat it up, and provide some insight while we're going through the crowd over here. You guys, are you Nvidia employees?

>> No.

>> No. Tell me what brought you here today.

>> Uh, we're just trying to network and meet people and learn more about AI.

>> And so this is I guess new to you then.

What have you learned thus far then in regards to AI?

>> Uh there's a lot going on with it.

>> You know, that's an honest answer, too, because it is very complicated, especially when you start speaking about acronyms and such. And we'll just keep going through the crowd right now before I get to my next guest. So, my purpose is not only to give you an inside scoop as to what's going on here on the floor; we're also doing some CEO interviews, hopefully getting some more insight into not only their collaborations with Nvidia, but more so how we move forward, because right now there are a lot of big topics. The AI bubble narrative, and whether that is the case despite all the hyperscalers being able to use their free cash flow to spend. The other narrative that is relevant right now is the efficiencies from AI, which are great, but then what does that mean for job displacement? Another major narrative. And then lastly, of course, power, which keeps coming up today; I know we have more panelists who are going to be speaking about the power infrastructure. He's smiling because he knows he's on camera. Sir, you look important because you have a pin. What do you think so far of this event?

>> Uh, I definitely think it's very interesting. I didn't know a lot about AI before I got here. Some of the concerns that were brought up were things that, you know, I've heard on the news and such. So it's good to hear it from the experts.

>> And are you from what department do you work in?

>> Uh, I work with the Marine Corps over at the Pentagon. Yeah.

>> And so, do you feel like this is a little bit different? You have tech and policy leaders, the Marine Corps, just more of the DC crowd merging together here. Is that something that's new, and are you guys talking about it back in your office?

>> Yeah, we're trying to to see how we can leverage AI to make our processes a lot uh faster and more efficient so we can get after what is, you know, our main job, which is defending the country.

>> Awesome. All right. Thank you. I know I'm putting a lot of people on the spot, and not everybody really enjoys that, so you did a great job. Thank you. And, you know, what's great here is that you have a lot of people who are really advanced in terms of technology, who can talk about putting atoms in a steady state and how that works, and then others who are just learning about it for the first time. Why? Because that's what GTC does. It brings all sorts of people together to not only mingle and listen to the CEOs, but hopefully catch a glimpse of the superstar of the day. Maybe we should start taking bets on Polymarket on which leather jacket he will be wearing, or maybe a t-shirt, I don't know. I know a lot of us didn't want to wear leather jackets, right? You're not wearing one. I'm not seeing any leather here, so perhaps that's done on purpose.

But we are going to head over now to Brett, who has been waiting patiently for me, from Figure AI. You are the CEO. Just now in this audience we had a lot of people who are new to AI, so let's start with what you do: humanoids. Because I feel like, to a lot of people, what is that? What does that even mean?

>> Okay. So Figure designs and manufactures humanoid robots. Humanoids are like mechanical humans. We have hands, legs, a head, sensors, cameras, an onboard GPU for AI processing. And the goal is to be able to do humanlike work in the real world: go out and do commercial work in manufacturing, logistics, and warehousing, and be able to put robots in the home to do cooking, cleaning, dishes, laundry, basically any activity a human does in the physical world.

>> And you are an example of a success story, because I think your company is, what, three and a half years old? You have 400 employees just within that short time span. And I was looking at the valuation: you went from $2.6 billion all the way up to $39 billion. What justifies that leap in valuation? What milestones have you hit just in the past year?

>> Yeah, it's a good question. I think the story here is that if we can create mechanical or synthetic humans, it'll create the largest economy in the world by a long shot; a little under half of GDP is human labor. Figure has probably made about five years of progress in the last 12 months. We have robots out now commercially, running every single day. We have a robot right now running commercially at a manufacturing partner of ours, and we have robots that can do end-to-end activities, say, fold laundry or do dishes, only with neural nets. So I think what's happening is we're approaching this kind of singularity point for humanoid robots, where in the coming years we're going to see humanoids out in the real world, at scale, doing end-to-end useful work for humans.

>> Which brings me back to that topical point I brought up about job losses due to AI efficiencies, and you're talking about somebody doing my dishes or working on the manufacturing floor. Is that just an inevitable outcome of this AI revolution, that obviously people are going to be displaced, much to the dismay of many listening right now? I'm not trying to take that...

>> I mean, do you like doing dishes and laundry?

>> No, but I like having my job, right? And do you see journalists on the list of jobs that could potentially be taken over by some type of AI bot or deepfake, etc.? So how do you mitigate that?

>> My view is that we've been in this automation exponential for the last couple of centuries, and AI is largely a progression of the computation we've seen over the last few decades, where more and more automation is taking place around the world. This is happening both in the digital world, where we'll have more and more agents doing work like humans, and in the physical world, where the humanoid is just the manifestation of a physical agent being able to do humanlike work.

>> And, you know, what great timing: we're talking about humanoids, we're talking about the collaboration with Nvidia, so why not just ask the leader of Nvidia himself to share some insights?

>> Thanks for having me.

>> Yeah, good to see you.

>> So, we were actually talking about AI and the concern right now about job displacement, and I know that you are...

>> What's everybody doing here?

>> Watching you at the moment, that's for sure. And us, Brett. Could you just speak to maybe the evolution right now with AI?

>> I want to be on TV with Brett.

>> I'm going to get to why we're in DC in a second, but I just wanted to get your insights on the concerns about the workforce and what AI efficiencies would mean for it, especially right now, given some jobs are starting to be spread thin.

>> We have labor shortages all over the world. If we had more workers, companies would make more money. If we were more productive, companies would make more money. When we make more money, we invest more money so that we hire more people. And so it's very likely that the companies that use AI first, that use robotics technology first, will be the most successful first, and they will end up hiring more people. You're not going to lose your job to a robot; you're going to lose your job to somebody who uses a robot. You're going to lose your job to somebody who uses AI. And so, if you look at Nvidia today, 100% of our software engineers use Cursor. 100%. And we're hiring faster than ever because we're creating more chips than ever, we're developing more software than ever, and we're more productive than ever. And it looks like the company is growing faster than ever. So there are a lot of great things that go along with being productive. Nobody ever says, "You know what, I want a company to be less productive." I've never heard that before. If you look at companies around the world, what is it, 70% of the companies in the world feel they're labor-short and have a hard time hanging on to employees, so robotics is really going to help with that. But here's the thing: I think, Brett, you're on to something huge. This is likely the next giant consumer electronics market.

>> And so I think he's on to something.

>> I love that. Brett, you're helping me with my job. See, I'm going to be displaced.

>> I'm here to watch him. Are you kidding me?

>> Can you speak about the significance of being here?

>> Brett just closed. You closed a round.

>> Yeah, we talked about it.

>> Figure is worth, what, $30-40 billion today.

>> Yeah, close.

>> 29 was it? Or 30?

>> 39.

>> 39. $39 billion. That is impressive. And congratulations. We did talk about that giant leap.

>> You're a whopping three-year-old company.

>> Three and a half.

>> Yeah. Well, that extra half really made a difference.

>> Incredible.

>> Which is a great example.

>> Three years after I founded Nvidia, Nvidia was still worth precisely $0 billion.

>> Yeah.

>> Perplexity. I was speaking to the Perplexity CEO. Same thing: just a few years, 300 employees, a valuation in the billions, too.

>> Doing great.

>> Which is a success story with AI. Could you speak to the significance of being here in DC? Because this is the fourth GTC, and it's a different group of people. I met some people from the military, some who don't really understand the interconnects of AI; quantum, forget it. Can you speak to why we're here?

>> The technology industry has really changed. You know, Brett, when I first started the company, quite frankly, nobody cared. Nobody cared about technology companies, to be honest. We were just startups in Silicon Valley doing our thing. But today, technology is at the forefront of almost every single geopolitical conversation. Technology is obviously one of the most important industries in the world. And what I can say, on behalf of both Brett and me, who have the benefit of serving in and being part of the American technology industry, is that it is our national treasure. This is our single most valuable industry, unquestionably one of the most important industries in the world, and we're really proud to be part of it. And so, anyhow...

>> What would you...

>> We're here to make sure that Washington, DC has a front-row seat to what Nvidia is working on and what Nvidia is going to be working on. So bringing GTC to DC is really exciting for us. You know, doing GTC in our nation's capital, it's incredible.

>> What do you think are the major roadblocks? Power has come up a lot in the conversation, and it seems like we do have a lot of headlines about building out X gigawatts, but the capacity to build that infrastructure is, essentially, a roadblock. So what do you say to that, that maybe these promises can't come to fruition in the near term?

>> Cash flow matters, power matters, land and shell matter, power generation matters, factories matter. These are all things that Brett and I think about all the time. And his business and my business are really very similar. But it comes down to one thing and one thing only: profitable token generation. The moment we have intelligence that's worthy of being paid for, as soon as that flywheel goes, then everything else will take care of itself. And this is no different from the moment that humanoid robotics starts to do productive things: once people are willing to pay for it and it's a profitable endeavor, you can scale up as large as you want. This is no different from when TSMC, in the old days, manufactured the first profitable wafer. The moment that happened, they could build factories as large as they wanted. And so that's really the moment we're looking for: that virtuous cycle where AI starts to generate productive tokens, profitable tokens. After that, the flywheel takes off.
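The "profitable token generation" test Jensen describes reduces to simple unit economics: revenue per token must exceed the energy (and amortized hardware) cost of producing it. A minimal sketch, where every number is a hypothetical placeholder rather than a real figure:

```python
# Hypothetical unit economics for "transforming electricity into tokens".
# All numbers are illustrative assumptions, not measured figures.

electricity_price = 0.08        # $/kWh, assumed industrial rate
energy_per_token_j = 2.0        # joules per generated token, assumed
revenue_per_m_tokens = 10.0     # $ per million tokens served, assumed

kwh_per_token = energy_per_token_j / 3.6e6      # 1 kWh = 3.6 MJ
energy_cost_per_m = kwh_per_token * electricity_price * 1e6

margin_per_m = revenue_per_m_tokens - energy_cost_per_m
print(f"energy cost per 1M tokens: ${energy_cost_per_m:.4f}")
print(f"margin per 1M tokens:      ${margin_per_m:.2f}")
# A positive margin, after adding amortized hardware and facility costs,
# is the flywheel condition: profits fund more capacity, which in turn
# generates more tokens.
```

The point of the sketch is the structure of the calculation, not the specific values: once margin per token is positive at scale, capacity expansion becomes self-funding, which is the "everything else takes care of itself" moment.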

>> An executive at another company told me that tokenization, or just profit per token, could be the next oil, with countries all around the globe really focusing on that output. Do you agree with that, just in terms of tokens? And maybe explain it, too, for our audience, because not everybody here is as advanced.

>> Computers speak numbers. Tokens are numbers. We tokenize almost everything. We tokenize images and words; Brett tokenizes motion, you know, a sequence of plans that a robot performs. And everything that we can tokenize, AI can understand. So we can tokenize proteins, we can tokenize chemicals, we can tokenize motions, actions, and behavior, all these things that Brett is trying to tokenize. We tokenize video, we tokenize 3D, we tokenize fluid dynamics. Anything we can tokenize, AI can understand. And everything that AI can understand, it can translate, and it can generate. That is the revolution that we're in, and that's why the token is at the core of everything that we do. Ultimately, what we fundamentally do is transform electricity into tokens. And that's what an AI factory does.
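"Tokens are numbers" can be made concrete with a toy tokenizer. Production LLMs use learned subword vocabularies (for example, byte-pair encoding), but the principle of mapping anything, text, pixels, or motion, to integer IDs is the same. A minimal byte-level sketch:

```python
# Toy byte-level tokenizer: text in, integer token IDs out, and back.
# Real tokenizers learn multi-character subword units; the principle
# ("tokens are numbers") is identical.

def tokenize(text: str) -> list[int]:
    """Encode a string as a sequence of byte-value tokens (0-255)."""
    return list(text.encode("utf-8"))

def detokenize(tokens: list[int]) -> str:
    """Invert tokenize: turn the numbers back into text."""
    return bytes(tokens).decode("utf-8")

ids = tokenize("AI factory")
print(ids)                               # [65, 73, 32, 102, 97, 99, 116, 111, 114, 121]
assert detokenize(ids) == "AI factory"   # lossless round trip
```

Once data is in this integer form, a model trained on one modality's token stream can, in principle, be trained on any other, which is the sense in which "everything that AI can understand, it can translate, and it can generate."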

>> We come full circle. I've been told that you have to keep going so we keep the show on schedule, because you have a keynote very soon. Yeah, an hour and fifteen minutes. Jensen, thank you so much. We'll see you soon. So we'll actually continue this conversation, because we did interrupt a little bit, and now you've got a massive crowd watching you. So, no pressure.

If we could, I want to reiterate the fact that in three and a half years, with 400 employees, you have done incredibly well. One of your customers is BMW. Can you just speak, and this is the last question because of timing, to the relationships, and any other customers you might be announcing soon, please?

>> Yeah. We basically have a few customers now that we're in commercial discussions with. The goal is, we're looking for the ability this year and next to deploy robots into the real world, run them every day, make sure we get useful performance out of them, make sure we get ROI out of the robots, and make sure we can bring humanoid robots into real life.

>> ROI is a big topic, but we have to end it there, because that seems to be something a lot of people are concerned about. But $39 billion, congrats on that. We'll send it back over to you guys. Thanks.

>> What a great surprise with Jensen coming in. Christina, thank you so much. You're doing great out there. So, science used to move at the speed of experiment. Now it moves at the speed of compute, crunching data as fast as we can collect it, modeling everything from the atom to the atmosphere. This just speeds up outcomes, no doubt. And as AI begins to supercharge quantum computing, we're on the verge of discoveries that could redefine physics, chemistry, and life itself. Here's how accelerated computing is unlocking the next frontier of human discovery.

We're going to take a giant step up in several areas of high-performance computing, for scientific computing as well as quantum-classical computing.

Quantum computing is reaching an inflection point. We dedicate ourselves to creating accelerated computing stacks to enable quantum computers.

[Music]

Now we can simulate data using a quantum computer and use that as ground truth to then train an AI model, amplifying the capability of AI to solve drug discovery, materials science, and biology applications.

It is clear now that we are within reach of being able to apply quantum-classical computing in areas that can solve some interesting problems in the coming years. This is a really exciting time.

Quantum computing is at the heart of the fastest acceleration that should unlock scientific discoveries. From molecules to the cosmos, AI is transforming how science models the world. And leading this conversation, we have George Church, chief scientist at Lila Sciences; Matt Kinsella, CEO of Infleqtion; Marc Tessier-Lavigne, co-founder, chairman, and CEO of Xaira Therapeutics; and Anirudh Devgan, president and CEO of Cadence.

>> All right, Matt, let's start with you here. Big day, big quantum day. Can you talk about how the mixture of, we'll call it, classical computing and quantum computing is really going to change the game?

>> Yes, I can talk about that.

>> That's an easy one.

>> First of all, thanks for having us. This is going to be a blast, guys. This is going to be a lot of fun. When we say quantum, maybe we should just define some terms, because not everyone in the audience might know what that means. When we say quantum, we're talking about the world of the very small, the atomic and subatomic levels. And there's a whole different set of rules that governs things down there, called quantum mechanics. And so when someone says quantum, they're taking advantage of those very strange quantum mechanical properties.
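The superposition being described here can be illustrated with a tiny state-vector simulation on a classical machine, which is, at small scale, what GPU-accelerated quantum simulators do. This sketch is a toy with real-valued amplitudes, not any particular vendor's API:

```python
import math
import random

# A single qubit is two amplitudes (a, b) with a^2 + b^2 = 1.
# Measuring yields 0 with probability a^2 and 1 with probability b^2.

def hadamard(state):
    """Apply the Hadamard gate, turning |0> into an equal superposition."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

def measure(state, rng):
    """Collapse the state to a classical bit."""
    a, _ = state
    return 0 if rng.random() < a * a else 1

rng = random.Random(0)
plus = hadamard((1.0, 0.0))                    # the |+> state
shots = [measure(plus, rng) for _ in range(10_000)]
print(sum(shots) / len(shots))                 # close to 0.5
```

Until it is measured, the qubit genuinely carries both amplitudes at once; measurement forces one classical outcome, which is the "very strange" behavior the panel is referring to.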

>> What?

>> Hi.

>> What's going on here?

>> Jensen Huang, everyone.

>> Oh, he brought us water.

>> Thank you.

>> That's what he does.

>> Yes, we do.

>> You need to hydrate.

>> Thank you. It's very important to stay hydrated.

>> What is this mumbo jumbo about quantum...

>> Superposition.

>> Entanglement.

>> ...entangling stuff? Like, this is just mumbo jumbo.

>> Hey, good to see you.

>> You know, once a busboy, always a busboy.

>> Appreciate it. From Denny's to DC.

>> Thanks for being here. Wow, this is an incredible game show.

>> Thanks for having us.

>> It's been an amazing morning, and on this panel I think we're saving the best for last, Jensen. So, you know, maybe I'll ask this: you caused a kerfuffle last year when you made some comments about quantum. Anything you would like to revise, you know, as we're about ready to launch our quantum panel here, about how Nvidia thinks about quantum?

>> You know, I've got to tell you, I'm afraid to say a word. If I just say "quantum," the stock price goes up. Quantum, quantum, quantum.

>> Here, here. Listen, listen. The work that we're doing together is really important, obviously, and I think the message, what I was trying to say, is that quantum and classical computing really need to work together so that we can bring in the usefulness of quantum computing. And it's becoming very clear that the two industries really need to work together as one, and that quantum-classical computing, quantum-GPU computing, could really help solve a lot of very sticky, very challenging problems associated with quantum computing. And the ecosystem, and we ourselves, are doing some amazing work.

It remains incredibly hard. I mean, the work you guys do, it's deep science. You know, we do deep engineering; they do deep science. And so I'm excited about the work that we're doing together. I won't ruin it by telling you guys what it is. But... this doesn't work.

>> Now it does.

>> Now it doesn't.

>> It's okay. No, no. This is about you guys. It is about you guys. I want to thank everybody for coming to GTC in Washington, DC.

[Applause]

So, I'll be honest: the original plan was that President Trump was going to be in Washington, DC, and so we brought GTC to Washington, DC. The administration was all going to be here. And then, literally two days ago, President Trump said, "Jensen, could you be in Korea today?" And I said I would love to be in Korea today, but I came to visit you in DC. So anyways, we're on two sides of the world. But anyhow, he wishes all of you well. He's going to see me in a day, and we're going to go and support the president on his tour through Asia. You know, he's our president. We want him to be enormously successful so that America can win. And so anyways, I want to thank everybody who's here working in DC. Thank you for your service, and thank you for coming to GTC.

>> All right. Great to see you.

>> Take care.

We can't wait to hear more. I mean, we said last night that we are in Washington; we've heard it on the panels this morning. So maybe we'll just jump to this, and then we'll come back, Matt, to that. You know, there's an extreme need to accelerate AI, and I think Jensen has been at the forefront, with Nvidia leading the charge with the president, the White House, and Capitol Hill. But we also see a lot of doomerism. You know, you guys are in what I would call long-haul investigation, discovery, and execution. Talk to us a little bit about the importance of having Jensen and Nvidia leading the charge for your businesses that require decades of innovation, and maybe how you're partnering with Nvidia to make that happen. Matt, we'll come back to you on that.

>> Absolutely. Well, I mean, we have looked to Nvidia as a trailblazer of how you bring deep tech out into the market in a staged, commercialized way. And so, as we've thought about bringing quantum technologies out into the market, we saw them point their GPU engine at graphics to start, and then at crypto mining, and they always found the next thing, until the crown jewel of large language models came around for them. And so the way we've thought about commercializing quantum is by pointing quantum at areas where there is true quantum advantage already, like timekeeping and then quantum RF sensors and antennas, on the trip to that crown jewel of quantum computing that will do things that classical computers can't do. So we've really used them as what we've modeled our strategy against. Jensen's led the way in many ways, and one of them is how to commercialize deep tech.

>> Right. Absolutely. Do you have any comments on this, George?
>> Well, I totally agree. I think another early-stage thing that we can do that roughly involves quantum is natural computing, where the best simulation of a particular system is the system itself. It's 100% accurate, and there are certain systems where we have the synthetic ability to make them even more cheaply than we can compute them. They're definitely quantum, and they're very complex, but such systems are possible for sure.
>> So, Anirudh, you could have been on any one of these panels, all five, given how broadly your company operates. But I do want to ask you: how is AI accelerating everything from the chip to the entire system and, as we heard from our panelists before you, also into electrical and cooling systems and the power grid?
>> Yeah, first of all, great to be here. For people who are not familiar with Cadence, Cadence basically makes products to design chips and electronic systems, and we have a long history of working with Nvidia. We have worked with Nvidia for more than 20 years, and it's remarkable to see what Nvidia has become under Jensen, and what it will be going forward. Now, I think the key thing for chip design, and now chips and systems are merging together, is that there's a lot of deep science in it. Basically, our products are software products, a combination of CS plus math plus physics, applied to chip design and system design, and the whole stack needs to be optimized. What AI can do is provide the next level of innovation, the next 10x productivity improvement, because chip design itself is exponential: by 2030, the chips will be 10 times bigger and the systems will be 30 to 40 times more complex. So we need the next level of productivity improvement that AI automation can provide. To me, it's a combination of AI, the basic science, the ground truth, and accelerated compute. This is what I've called for a while the three-layer cake, and all three layers have to work together: AI, the ground truth, which is still very important, and then accelerated computing. And the interesting thing with accelerated computing is that I always wanted, for a long time, the combination of CPU and GPU. Nvidia has fabulous GPUs, and GPUs have become much more general purpose with Nvidia's latest generations, with Grace Hopper and now Grace Blackwell. The fact that CPU and GPU are close together gives a very great foundation platform to build science and AI on top of.

>> Before we go to the next one, maybe just an announcement to everybody in this hall: Jensen is beating you over to the keynote. So, I've been advised to tell everybody in this hall that they need to make their way over to the keynote to grab their seats, so that Jensen's not over there alone. Meanwhile, we're going to continue this conversation. You know, one of the things that has been a promise for a long time is how AI, or how technology, is going to advance human genomics and advance drug discovery. And I would say that there's been a lot of promise, and there's probably been less success than the world might have thought 20 years ago in terms of the transformation that would come. It feels to me like we've been laying down the tracks, right? The groundwork, the primitives for these big breakthroughs. And so my question to you is: are we on the verge of really solving problems that we've been talking about for the last 25 years? Do you think the cycle time on discovery is about ready to change because of where we are in AI?

And you know, anybody who wants to chime in, but we'll start here.

>> Right. So, I think we're at an inflection point, absolutely. And it's for the reason you mentioned: the tracks have been laid down, and there have been advances just in the last few years that have made it possible to achieve liftoff. Think about the problem of drug discovery, and the conundrum that we face today: when you start working on a molecular target to make a drug, make the drug, take it through clinical trials, and get FDA approval, on average that takes 13 years. Nine out of 10 drugs that enter clinical trials fail. The overall cost, including the failures, is about $2 to $4 billion per drug, and it hasn't improved in the past 20 years. And that's because drug discovery is still very artisanal: a lot of science, yes, but a lot of art, intuition, empiricism, trial and error. At the highest level, the promise of AI is that with the right kinds of data and the right amounts, we should be able to transform this from an artisanal endeavor into an engineering discipline with much higher success rates and shorter timelines.

In terms of the inflection point, there are multiple applications of AI across the drug discovery platform. One is at the level of logistics, if you will: designing clinical trials, recruiting patients. There, the advances with large language models are already being put into action. We're seeing companies across the world implementing this to improve the efficiency of recruitment, identifying patients, filing reports to the FDA. All of those things are being accelerated in dramatic fashion.

The second area is molecular design. That's been touched on in the movie, and you've talked about it. Instead of screening for drugs, we're designing drugs. Just as you can ask Sora to make a movie of two puppies playing on the beach, you can take a protein and say, "I'd like to make an antibody, a drug-like substance, that attaches itself in just the right way," and have the AI design it. A lot of action there, but the advances are recent. The Nobel Prize a year ago went to two groups: the DeepMind group, Demis Hassabis and John Jumper, and David Baker, our co-founder, for advances just in the past few years, AlphaFold, RFdiffusion, RFantibody, that make it possible. We didn't have those technologies 10 years ago, even 5 years ago, so that's prompting that acceleration.

And then the last area, in some ways the hardest one, is understanding human biology and human disease, and getting the insights to figure out what we should make drugs against and who the patients are who are going to respond. AI is just entering that area. That's also, I believe, going to build over the next few years. But we're seeing very rapid acceleration in the first two that wasn't possible until a few years ago, and we're going to see increased acceleration in the third.
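The attrition math behind that cost figure is worth making explicit. Here is a back-of-the-envelope sketch; the per-program cost below is an assumed round number chosen for illustration, and only the nine-in-ten failure rate comes from the discussion above:

```python
# Rough expected-cost arithmetic behind "the overall cost including
# the failures is about $2 to $4 billion per drug".
cost_per_program_bn = 0.3      # $B per program, win or lose (assumed figure)
success_rate = 1 / 10          # nine out of ten drugs in trials fail
programs_per_approval = 1 / success_rate
expected_cost_bn = cost_per_program_bn * programs_per_approval
print(f"~${expected_cost_bn:.1f}B per approved drug")  # ~$3.0B
```

The point the arithmetic makes is that most of the headline cost is attrition: cut the failure rate in half and the expected cost per approval falls in half too.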

George, same question to you. Are we on the verge of a new cycle time in terms of discovery? You know, I look at discovery and invention, and these are nonlinear systems.
>> Yes.
>> Right. And it feels like we're at a moment, because of all the technologies that we're talking about, where a lot of this promise is going to come to bear. What are you seeing through the lens of Lila?

>> Yeah. So this is definitely an amazing moment. There's the intersection of exponentials that have been occurring both in computation and in biology. For example, a 20-million-fold reduction in the cost of sequencing, and that's something we use not just to study populations but to guide experiments and to interpret experiments. So that's just the analytic tool, but the synthetic tools are also getting better. I mean, even our confidence in having low toxicity and high efficacy is making the clinical trials shorter. The record now, as far as I know, is baby KJ, which was seven months from birth to diagnosis to cure with a gene therapy. And we are really actively using not just the dichotomy between screening and prediction, but putting them together. So you can use AI to design as well as you can, but then you make big libraries, because there's some modesty that you can't necessarily design the perfect thing. But you can make these big libraries, and that combination is much more powerful than either one of them separately. So for example, at Dyno and Manifold we've made proteins that target the nervous system 100 times better, and they de-target the liver, where you can get toxicity and absorption of the valuable drug. So those are just examples. This is happening now. This is not dreamy stuff off in the future.
>> Yeah, exactly.

>> So, before I go to the next question, I have a public service announcement, and that is that the pregame show is actually being broadcast inside the hall. So, as much as I'm sure you want to hear Brad and me and the guests, Jensen is a bigger show, and you want to make sure that you get your seat. Thank you very much. Okay, now back to your regularly scheduled program here.

>> Yeah. One thing I want to add on drug discovery: Cadence has a molecular science division, and we are working with a lot of the pharma companies, because a lot of the math and science in chip design, which is very nonlinear, is similar to drug design. And something to watch is how much of the work is done on the computer versus how much of the work is done in the lab.
>> Got it.
>> If you look at the three big areas that I'm involved in, one is chip design. There, 99% of the work is done on the computer, and you get first-time-right silicon. Then the second part is system design, the design of planes, data centers, and cars. Over there, about 20% of the work is done on the computer and 80% is still done in the physical world. And then you go to drug design, molecular design: I think only a few percent of the work right now is done on the computer, and most of it is done in the lab. That's why the design times are so long. What happened in chip design is that most of the work moved to the computer, and then you can do modeling, which is accurate. Then you do simulation and optimization, which is basically design. So the more accurate the modeling, the more simulation happens, and the more optimization with AI can make drugs happen. So I think it is definitely on the verge, but if you compare it to other established areas, it is still very, very early days in drugs.

>> Yeah. So I have, no, please, go ahead.
>> Just want to add, in terms of this transition to go from a few percent to 10%, 20%, 30%: part of that is having accurate simulations. Eric Schmidt has a saying that is, I think, very profound, which is that AI is to biology what math is to physics. Currently, for physical objects, you can model them using the equations of quantum mechanics or other things. For biology, we don't have equations. AI is the tool to make sense of biological data, because AI can see patterns where we can't with the human eye. Of course, you have to feed it the right kinds of data for that. So, what we're in right now is a moment where we, and others in academia and other companies, are generating massive amounts of data to train AIs that will understand biology, so we can have more and more of that stack in silico rather than in the wet lab. That, together with the automation that George talked about, is what's going to accelerate this.
>> So, a follow-up question on that, and this is a bit of a challenge. I saw a recent report that said that China has 10x more drugs in the hopper than the United States does, yet we have 10x the number of data centers that they do, and probably even a lot more compute power. What is that disparity? I'm not talking about trials; I'm just talking about the creation of it. Why are we falling behind, and what needs to be done to accelerate this?

>> Well, I wouldn't say we're falling behind, necessarily, because there's quantity of drugs and there's quality. A game-changing drug is one that reverses aging, because 90% of us are going to die from that; but you can make a whole lot of me-too drugs that don't really solve problems. That's part of it. In addition, I think they do have some advantages, like this IIT system, investigator-initiated trials, which are very streamlined, and they seem to not be running into problems with toxicity, with a focus on lowering that toxicity problem. So I think it's healthy competition. Yes. But I think we're quite a bit ahead, partly because we are integrating computation so smoothly with it right now.
>> I appreciate the clarity on that, for sure.

>> One of the questions, and feel free to chime in here as well, but one of the questions I have around quantum is, I think for the average investor, for the average observer: they see these stocks flying. They think it's part of the meme bubble that exists in the world. They have no idea when the benefits will come. So, for somebody leading a company like Infleqtion, where you have real customers that are leveraging this today, help the viewers at home understand what the important consequence of making these investments is. Are the stocks today in the quantum universe ahead of themselves? Do we have some of these mini bubbles where everything is getting pulled up? Because I think deflating that and providing a real set of context is important for the longevity of the industry.
>> Mhm. Well, Brad, just for some context, and you know this, but before I came to be CEO of Infleqtion, I was an investor for 19 years. And one of the things I did learn as an investor is never give stock tips to friends, because you're going to be wrong more than 50% of the time. So I will punt on any commentary on the stock prices of the publicly traded quantum companies. What I will say is that the reason this matters, going back to where we started, is that unlocking the power of quantum mechanics and turning that into products will result in orders-of-magnitude improvements in those types of products. We're not talking 50%, we're not talking 100%; we're talking 10x, 10,000x, 1,000,000x improvements in performance. And so, going back to what I was saying about how we followed Jensen's strategy on how to monetize and commercialize and build the market for ourselves in quantum, and actually show the world that quantum has real advantage today: we pointed this at areas where it already does have those kinds of 10x, 10,000x, 1,000,000x improvements in performance. So timekeeping, RF antennas, sensing, things like this. Those are real market opportunities, to your point, where there are real customers and products today. So part of it is, I think, you can look to areas other than quantum computing to see real quantum advantage.

Now, when will we see advantage in quantum computing? That's the big question everybody wants to know, right? It all comes down to logical qubits. Logical qubits are really the keys to the kingdom in quantum. And this is one of the ways that AI and quantum are really going to interact and be tightly coupled, because AI helps us get to logical qubits faster. Logical qubits are error-corrected physical qubits. They're sort of these pristine qubits that you can actually use to do computation. Up until 2023, the world did not have logical qubits yet; we weren't even sure if we could get them. In 2023, we saw the first logical qubits. Today, Infleqtion and a handful of companies have logical qubits. We announced that we had 12 as of a month ago. And it's generally believed that at about 100 logical qubits, you'll start to see quantum advantage in areas like materials science. When you get to a thousand logical qubits, you'll start to see quantum advantage in areas possibly like drug discovery. And as you scale those logical qubits, you'll see quantum advantage in areas beyond those. So I do think this will be absolutely game-changing technology. I left a great career at a firm called Maverick Capital to come do this full-time, because I see the opportunity ahead is absolutely huge.

>> So I have to ask George and Mark. I mean, you're using GPUs today. We'll call that, well, it's hard to call GPUs classical computing; I think that's CPUs. So, from CPUs to GPUs: how are you looking to quantum to help change what you're doing? Because, quite frankly, if I look at the algorithms that they're working on and what they do, it would seem like there would be an opportunity, maybe not immediately, but maybe in the future. Mark, do you want to start?
>> Well, in terms of which kind, going from CPUs to GPUs to quantum compute, or what?

>> Well, we're going to have a world, I mean, listen, as we know, where nothing goes away. We'll always have CPUs,
>> Right?
>> GPUs, and we'll have quantum, which will assist them.

>> So, you know, like many companies that are applying AI to molecular design, AI to develop foundation models of biology, we're big consumers of GPUs, obviously, in what we're doing. There are certain applications where really having the power of quantum computing would come in very handy, particularly molecular dynamics simulations, which right now require basically much more compute than what we'd like to deploy. So we see lots of use cases where we could leverage that kind of technology. For now, we can run with the technology that's being made available by Nvidia today.

>> One thing to add on quantum: we work with a lot of the quantum companies as they design their computers, and quantum is very promising. Especially for certain applications, it can give a huge speedup. So we are closely watching when it gets to scale, but I think the future will be hybrid. CPUs are still important, GPUs are phenomenal, and there are FPGAs and custom silicon. So I think it's not an either-or thing. Quantum will have certain big applications. It will give dramatic speedups, but all the other hardware platforms will work together to solve problems.

>> I completely agree with that. I think just as GPUs layered into the data center and enabled new capabilities of what we could do with compute, QPUs will start to slowly layer into the data center and expand what we can do with compute. And I think it'll result in more CPUs and more GPUs being deployed and sold, because we'll now be addressing problems that just weren't able to be addressed with compute historically.

>> Maybe one more, you know, as we begin to reflect on the day heading into the keynote with Jensen. In my world, the world of investing, the talk really is about a bubble and whether we're ahead of ourselves. This is where we started the conversation. I think it's super important that we kind of steel people's minds for the fact that this is not a straight line up and to the right as we look ahead. So, as you take yourself out of the position of running your companies, and just as an observer of technology ecosystems over a long period of time: when you hear all of the deals announced over the course of the past many weeks, trillions of dollars of compute getting built, what concerns you the most as you're running your company? As you look ahead, what do you think could upend us? What do you think might be the risk that causes this to slow down, or us to get ahead of ourselves? Are any of you concerned about, you know, the dark fiber and the GPU overbuild that you certainly hear talked about on CNBC every day?
>> Well, let me start. What is promising for me to see is that there is still a lot of demand from our customers, the big companies, to build more and more compute. And the way I think about it, a lot of people don't quite grasp that there are multiple phases of AI in my mind, and we are in the horizon-one, or first, phase, which is the infrastructure buildout: buildout on the cloud and on the edge. But there are horizon-two and horizon-three phases, which can be even bigger. So the first phase is infrastructure and LLMs, and if you look at that first phase, talking to all the big customers, I still see years of growth in compute and AI capability.

>> But what is even more encouraging to me is phase two, which I've always said for years is going to be physical AI. You know, AI is not going to be restricted to the cloud and to software applications. It will move to the physical world, which is cars, planes, drones, robots, and that could be trillions of dollars of monetization. And then phase three: even though we are already doing science now, science AI to me is a horizon-three application, with drug discovery and material science. I mean, these are trillions. So I think we are only in horizon one. And talking to customers and partners, horizon one still has years of legs. Then you add horizon two, which is physical AI, and horizon three, which is science AI. I mean, there is a long way to go.

>> It's a great framework.
>> And I think we're seeing the same thing in our field. You know, in phase one, the availability of the algorithms and the compute means that it makes sense for us to try to generate data at scale. Previously, we couldn't have made use of those data or interpreted them in the right way. So we're going to have that buildout. We're going to see the permeation of AI in all aspects of the laboratory. That's how people will analyze their data. It's going to become just routine and baked in. So I see steady growth, especially with automation of the kind that George is thinking about. But I also think it's important to try to set expectations right. It currently takes 13 years to go from starting a drug discovery program to having FDA approval. We're not going to get that whole 13-year process down to two years. Can we get it meaningfully down? I think we should give ourselves the ambition that over the next 10 years we cut it in half, and we cut the attrition in half. Those would be huge gains in terms of bringing life-saving therapies to patients. But we have to be realistic. Some of the AI companies that started before we did are already going very quickly from starting a project to getting into clinical trials. There's one that announced recently that within two years they went from starting a project to getting into clinical trials; that normally takes about 5 years. So I think we're going to see meaningful improvements, but you also have to be...
>> Well, we already have an example of seven months. So let's not say that it's that far off, necessarily. And on the hybrid strategy: energy is going to be a big thing. And if we look at biological intelligence, the energy consumption there is 12 orders of magnitude better, if you talk petaflops per watt. And so we need to look at that as an example. So we'll have hybrid systems. We've already made single-molecule transistors in a CMOS system; we published this three years ago. We're going to see more of that molecular electronics, hybridizing with quantum, which is quite different.

>> Well, looping back to Brad's initial question, which is the bubble bears: it sounds like, and I say this tongue-in-cheek because I'm an optimist, you could do some pretty useful things with 100x or 1,000x more compute. I mean, you could make lifesaving drugs out there, things that can literally cure diseases that have never been cured before.

>> I mean, I think my mental model is this: there's an enormous investment cycle going on here, and a return will need to be earned on these investments. So I think it all comes down to: do you believe a return is going to be earned on these investments? And I guess you could ask, are the investments leading to useful things? And it sure seems like they're going to be useful. So the question is what kind of return they're going to earn on these investments.
>> And these multiple phases reinforce the previous phase.

>> So, if there's infrastructure AI, physical AI, and science AI: in physical AI, when we deploy AI to robots or cars, of course the car itself has an inference chip, like in a Tesla or in a robot. But the model, the new world model, still has to be trained in the data center. So not only is infrastructure AI strong because of software applications; when physical AI happens, it reinforces the data center. Same thing with science AI: it will reinforce the data center. And also, in medicine there will be a lot of robotics that helps. So these layers are not only good by themselves; they reinforce the previous layer.

>> Yeah, gentlemen, this has been an amazing conversation: science and quantum. We've got a special guest, Jensen Huang, coming up here to literally add some serious value. So, thank you very much, and hopefully I'll see you on the floor.

>> Thanks. Great job. Thank you, guys.

Thank you. Thank you.

>> All right, folks. Let's check back in with CNBC's Christina Partsinevelos. Christina, we're talking here about quantum systems and a lot of science. What do you think is the smart investor play?

>> Wow. Way to put me on the spot. Remember, I'm a journalist, right? So I've got to remain unbiased. And I'm actually going to pivot and deflect your question, because I don't want to weigh in on quantum, even though I'm learning quite a bit, even about how to stabilize the atoms. One of the points, when Jensen was on the panel with you very quickly, he spoke about having GTC here in DC because he thought President Trump was going to be here, and then two days ago he moved on to Korea, and we know why. Could you guys both maybe weigh in on perhaps China reopening as a market? Because I know it was a financial conference not too long ago where Jensen Huang said that they're 100% out of China at the moment. They're still working on getting those licenses. So maybe Brad and Pat, you could both weigh in with your thoughts on the Chinese market and what we could expect out of the conversations later this week.

>> Well, Christina, I mean, it's an extraordinary week: the president starting in Malaysia, going to Japan today. And then we just heard from Jensen that he's going to meet the president in Korea. And if they get a deal done with China, as speculated, I think it's an extraordinary tailwind to the market at large. Remember, if you think about where we were in April of this year, the NASDAQ was down 20% on the year. Everybody was panicked about whether or not tariffs would upend the global economy. Now, it seems like we've landed the plane on global tariffs, we've gotten the big, beautiful bill passed out of Washington, and now we're into a rate-cutting cycle. I think that there's a very good chance, because logic is on the side of Nvidia. Nvidia had 95% market share in China. Today it has zero.

>> Right?

>> If we want the world to run on the American AI stack, we ought to be competing everywhere in the world with America's AI technologies. That includes in China. And so, to me, it'll be an interesting question: do Secretary Bessent and the president use rare earths and some of these trades that are going on to allow chips to be sold in China? I certainly know where Nvidia stands on this. And as Jensen and I talked about on my podcast, there are some who say all he cares about is selling more chips, but those of us who know Jensen know that he's an American patriot first, and he wants the US to win the arms race when it comes to AI. 40% of the researchers and developers in AI are in China. They would prefer to use the CUDA ecosystem and build on Nvidia chips. So let's hope that, if not this week, then in the weeks ahead, US chips are back in China. I think the markets are currently saying no Nvidia chips in China; Nvidia's told investors not to include it in their guidance. And so we'll see. But I think there is a chance, because I think logic is on the side of Nvidia being able to compete in China, and I think it's good for the US's race in global AI.

>> Okay.

>> Yeah, Brad really nailed it here, and I'm glad you hit both the import and the export portions of that.

>> But I actually have a special guest right next to me, so we can continue this China conversation, if you guys don't mind, and give you a little bit of a break. Anirudh, you just did the TV magic. You just came off of set. This is the first time I'm meeting you cold turkey like this. So thank you for doing that quick move. Congratulations on your quarter as well. I was going through the transcription last night, and there were a few things that stood out. You said EDA tools would become standard. More specifically, you said, quote, monetization improving over two contract cycles. So when could we start to see more meaningful revenue from this shift, if you're saying it's just within the next two contract cycles?

>> Yeah. Hi, Christina. Great to be here.

>> I know, it was like a hard question. Boom, right into it. Well, we announced our earnings yesterday, and we raised our outlook for the year, both from a revenue and an EPS standpoint. So we are very pleased with the performance. Our revenue this year is going to grow 14%. And we're always focused on revenue plus margin, and our operating margin will be close to 45%. So we are pretty pleased with how we are able to monetize chip design, and AI applications in chip design.

>> But so, maybe I'm not understanding. Does that mean that two cycles from now we should see a meaningful increase? Is that what you were implying in that statement?

>> No, my statement is that we are also infusing. So Cadence has two ways we are participating in AI. One is design for AI; the other is AI for design. When I say design for AI, that is the buildout of the AI infrastructure. We are working with most of the big AI companies as they build out chips and compute. And my comment on two contract cycles is on the second part, which is AI for design. Okay.

>> So not only are we helping build out the AI infrastructure, we are applying AI to our own products to make them more efficient. And with any new technology, we are very positive on the impact of the technology, but we are always prudent in terms of when it will translate to business, and normally we have seen over the years that it takes some time as customers try it and then deploy it. So the deployment is very impressive: most of our top customers are deploying our AI tools, and then we hope to monetize as we move along. Yeah.

>> Speaking of monetizing, maybe monetization in China could be a topic, since I promised Patrick we would continue that conversation. If I remember correctly, it's 18% of total revenue, correct? In the market there was a concern that maybe there was some pull forward, given the constant back and forth with what's going on with China. Can you just speak to that? Was there some pull forward? Was that what you were seeing from Chinese customers in particular?

>> China revenue, if you look at it on a yearly basis, is about 11 to 12% for Cadence. And what I would like to say is that most of our business is in the US; about half the business is in the US, and across the rest of the world we are very diversified. We work with all the major tech centers and countries, so we are glad to work with China, but we also work in India, Taiwan, Japan, Israel, and Europe, in all the major tech centers, since semiconductors are so essential to everyday life. But overall, our China business I think is about 10 to 11%. And it is higher in Europe, higher in the rest of Asia.

>> And the settlement is clear now. I think it was like $140 billion, right?

>> Um, so that's done, all done.

>> Lip-Bu Tan, the current CEO of Intel, used to be at Cadence. What do you think about what they're doing at Intel? What do you think of that process of getting investments from SoftBank, the US government stake, the Nvidia partnership, etc.? What are your thoughts on that?

>> I think Lip-Bu is doing great at Intel. You know, we're glad to partner with Lip-Bu. He's making all the right moves, and just as we're glad to partner with all the leading companies. You know, we've worked with Nvidia for about 20 years.

>> So our goal is to do innovation, provide products to our customers so they can build even more amazing products. So we are fully supportive of all the efforts with our top customers.

>> I know. And we'll end with just the relationship with Nvidia. Could you speak to... I know you were on your panel and you were chiming in on physical AI. You were talking about how your tools are going to be used, but could you just elaborate on how the relationship has changed with Nvidia, even within the last year or two?

>> Yes, yes. This is remarkable. First of all, Jensen is a true visionary, in what he has done and the impact he's having on multiple industries. We have worked very closely with Jensen and Nvidia over the years, and the relationship has really blossomed even further in the last few years. First of all, they use our products to build their products; that's the essence of how the relationship got started. But over the last few years, we are also using their products to turbocharge our products. And what is really impressive to me is how general-purpose GPUs have become. And now with Grace Hopper and Grace Blackwell, the combination of CPU and GPU together in one accelerated platform, along with AI, can give a lot of speedup to our products. So we jointly announced a product six months ago which can give up to an 80x improvement and up to 20x lower power using the Nvidia accelerated platform, also in software. So we have our partnership on Omniverse, of course, on CUDA, on Nemotron. It's a great partnership with Nvidia, and we really cherish that relationship.

>> 20 years, right? I've got to leave it at that, and I'm going to see you soon. We're going to keep this conversation going. Thank you so much. And to our audience here, I'd encourage everyone to head over to the next hall, because the keynote is very, very soon, which is why I need to wrap. So, back over to you guys.

>> Yeah.

>> Yeah, that's what I'm hearing, Christina. If we don't get over there and get our seats... I know everybody's grabbing free food and coffee. Patrick and I, we have reserved seats over there.

>> We do.

>> Well, I think if you don't have a reserved seat, you need to get over to the main hall. We're about ready to light it up with Jensen. And it's going to be an exciting day. But Patrick, before we do, we've saved the best for last. So why don't you introduce us to our last panel, on robotics?

>> That's right. So, robots aren't coming. They're already here. In fact, Amazon even announced that they have a million robots here in the US, and in other countries they have a lot more. And whether it's for factories or operating rooms, accelerated computing is training robots in simulation and deploying them into the real world faster than ever before. It's the bridge between simulation and reality, training robots to build, move, and even operate with precision once thought impossible. Here's how AI and robotics are powering a new industrial revolution.

The age of robotics has arrived, reshaping every industry, where everything that moves is robotic: in the operating room, at home, in our warehouses, and igniting modern manufacturing.

[Music] [Applause]

Physical AI and robotics are moving so fast. Everybody pay attention to the space. This could very well be the largest industry of all.

The physical world is becoming programmable, and this fusion of intelligence and industry is creating new types of factories, new jobs, and a more resilient manufacturing base. The leaders driving that transformation join us now.

First we have Peter Koerte, chief technology officer and chief strategy officer at Siemens AG. We have Young Liu, chairman and CEO at Foxconn. We have Brett Adcock, founder and CEO of Figure AI. And we have Akash Jain, president and CTO of Palantir US Government.

So great to see everybody here. So Young, I want to ask you the first question. First of all, my relationship with Foxconn goes back to 1995, and I don't know if they call that experienced or old. But you are one of the biggest manufacturers of all precision goods, from smartphones to hyperscaler data center equipment and pretty much everything in between. Talk to me about how AI and robotics are transforming what Foxconn does.

>> Okay, first of all, thank you for having me here. This is a very great event for technology companies these days. You know, Foxconn is the largest manufacturer in the ICT industry, and we used to be very much labor-intensive, and then we transformed to automation-intensive. With new generative AI technologies, we think AI-intensive manufacturing is coming. And with this new technology, which is new and disruptive, we'll have to work with industrial leaders and technology leaders like Nvidia, Siemens, and the friends at this table together, to be able to catch up and apply the new technologies to our manufacturing facilities. And currently we are building up factories here in the States, in Ohio, Texas, Wisconsin, and California. We think the AI era is coming, and I call this Industry 5.0.

>> That's excellent. Peter, your company is known for being in all of the connective tissue inside of manufacturing. And I'm curious: how is AI making manufacturing in the United States more competitive? Because that's a big topic, particularly here in Washington, DC, and quite frankly, as Brad pointed out in an earlier segment, it has a lot to do with even national security.

>> Right. So we believe that AI really will be a key driver for the establishment of manufacturing yet again in the United States, which is going to be smarter. So we think of it as more of an AI-native factory that we want to build. And it starts with the sensors, right? I mean, the sensors that we all carry. We love the data, and because we've got so many sensors now everywhere, humans cannot make sense of it all anymore. So therefore you need to automate that. And so as we build factories now, we always build them twice. We build them first in the digital world, and we optimize them: we see how you place the machines, how the material will flow, how humans will interact with machines, and then we optimize them over and over and over again until we finally think, this is great. Then we build the real thing. So with the real factory, you've still got the real one and the digital one, and they talk to each other. So whenever you have supply chain glitches like we used to have, remember the semiconductor shortages, you can actually go to your digital model, look at it, and say: what if this part is missing, what will the implication be on the shop floor? And that gives you unprecedented speed, it gives you unprecedented productivity, and of course sustainability and energy efficiency. And I think this is vital as we think about the next generation of manufacturing blueprints here in the United States, which is very labor-constrained, because we don't find enough people to fill all the factories. We have to automate that, and that is a key part of bringing manufacturing back to the United States.
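The "what if this part is missing" query described above can be sketched as a toy what-if analysis against a digital line model. This is a minimal illustration only: the station names, rates, and part lists are hypothetical, and a real digital twin models far more than a serial bottleneck.

```python
# Toy "what if" analysis on a hypothetical digital model of a serial
# production line. Station names, rates, and parts are made up.

def line_throughput(stations, missing_parts=frozenset()):
    """Units/hour of a serial line: the slowest station sets the pace,
    and a station missing any required part starves the whole line."""
    rates = []
    for name, rate, parts_needed in stations:
        if parts_needed & missing_parts:
            return 0.0  # station starved -> line stops
        rates.append(rate)
    return min(rates)

LINE = [
    # (station, units/hour, parts it consumes)
    ("stamping", 120.0, {"sheet_steel"}),
    ("assembly",  90.0, {"chip_a", "harness"}),
    ("testing",  100.0, set()),
]

baseline = line_throughput(LINE)
shortage = line_throughput(LINE, missing_parts=frozenset({"chip_a"}))
print(f"baseline: {baseline} units/h, chip_a missing: {shortage} units/h")
```

Running the same question against the model before touching the physical line is the speed advantage Koerte describes: the what-if costs a function call instead of a stopped factory.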

>> Makes sense. Hey Brett, a lot of people talk about Tesla and Optimus, and what they may not know is that Figure AI has emerged from a small startup to a real leader in humanoid robotics. We heard you earlier this morning jamming with Jensen about the things you're doing with Nvidia to put you at the front of that pack. Tell us a little bit about Figure, where you are, and how the relationship with Nvidia has been so potent in really launching you into a leadership position here.

>> Figure is trying to basically build humanoid robots that can do everything a human can. You can't code your way out of that problem. Mathematically, we have about 40 different joints in the robot, and every joint is a motor that can move roughly 360 degrees. So you basically have 360 to the power of 40 as the number of states, the basic body positions, the robot could possibly be in.

>> Yes.

>> It's like more than atoms in the universe.
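That back-of-the-envelope claim checks out in log space. A quick sketch, taking the speaker's rough figures of 40 joints and 360 coarse positions per joint, and the commonly cited estimate of about 10^80 atoms in the observable universe:

```python
import math

# Rough state count for a robot with 40 joints, each discretized to
# 360 positions (the speaker's ballpark figures from the panel).
joints = 40
positions_per_joint = 360

# Work in log10: the raw integer 360**40 has over 100 digits.
log10_states = joints * math.log10(positions_per_joint)  # ~102.3

# Commonly cited estimate for atoms in the observable universe.
log10_atoms = 80

print(f"robot states ~ 10^{log10_states:.1f}")
print(log10_states > log10_atoms)  # the 'more than atoms' claim holds
```

So even with a coarse one-degree discretization, the configuration space is around 10^102 states, comfortably past 10^80, which is why an explicit enumeration or hand-coded policy is hopeless and learned policies are the only viable route.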

>> So you have to solve this with neural nets. And so for us, we have this phrase we use: we're trying to give AI a body, and have it do everything a human can. So that means everything from pre-training, where we have to build large-scale data collection efforts of human-like data to do training, and we use NVIDIA there. And then at test time, when we're running policies on the robot, we're doing inference on NVIDIA GPUs on the robot without any network connection, so we can run robots in full end-to-end situations, doing work without any outside network. So for us, we think of ourselves as an AI business. We happen to be building these physical agents that are out in the world, similar to web agents; we're basically in the physical world touching things. And Nvidia has been a very significant partner and investor of ours, and I think they will be significant in the future as well.

>> And you guys today, just give people a sense as to where you are. You know, we heard a million robots out at Amazon. How should we be thinking about how many humanoid robots you guys are going to be producing two or three years out?

>> I think, to be candid, the problem that the entire space faces for humanoids is that we have to solve a general-purpose robot.

>> Yeah.

>> You have to be able to just talk to it and have it do anything you'd want it to do, in unseen locations. That problem is not solved. That problem is 10 times, 50 times, 100 times harder than making a humanoid robot.

>> Yeah.

>> In my view. So the hill we're trying to climb now is: how do we build a horizontal AI stack that can do everything a human can? And quite frankly, we have robots that are working in the commercial market right now, at this very moment, and it's been really good for us to get those lessons learned. But it also gave us the insight to build the right technology stack, and that technology stack is end-to-end deep learning. And so I think for us, we're trying to hill-climb that problem. In parallel, we've announced BotQ this year, which is basically our high-scale manufacturing facility out in California. We do all final assembly, testing, and robot shipping from there. And that problem is hard. It's more like a consumer electronics manufacturing problem, but it pales in comparison to solving a general-purpose humanoid robot, which on the surface is just an incredibly hard problem. It's tractable, it's an approachable problem, but that's the problem we need to solve first. And then from there, we need to figure out how to manufacture at scale. That's heavy automation; we need robots in the loop for end manufacturing. And we're doing it in-house now, so we figure out how to build MES systems, how we build lines, how we do inline testing, how we deliver to the customer, and make that better and validate it. So we're basically trying to get good at manufacturing while we're trying to solve the problem of how we basically solve general bodies.

>> Yes.

>> Yeah. Akash, you know, Palantir. I heard Alex Karp say the other day that you're solving the ontology problem that sits between silicon and getting useful outcomes, for companies, for businesses, for physical intelligence. Talk to us a little bit about how you think about the role that Palantir plays in bringing us from the promise to reality.

>> Yeah, absolutely. So if you go back to the core of what Palantir is all about: you have all these data systems. You have ERPs, MRPs, random software that people have built in the ecosystem, and all those kinds of systems solve a specific problem. They have different security frameworks and whatnot. How do you provide the ergonomics, whether it's for an autonomy stack, whether it's for a factory worker, whether it's just for a human, to actually answer questions about that business and optimize it? That's really what the ontology is. It's kind of that agility layer. Your different data stacks integrate with it. It provides the security, the privacy, and the other controls around it, and it provides the ergonomics. You know, when Brett's done, the AI will be able to walk down the hall, sit at the water cooler, and learn things about the organization, learn things about the IP, learn things about the processes. But right now it can't, right? The way it can is by actually interacting with APIs. And so the ontology ends up serving as an SDK for the AI, for the agents, for the models, whether it's the awesome Nvidia open models, Nemotron, Cosmos, or some of the proprietary models. And Foundry and AIP, on top of the ontology, provide the ability for humans to orchestrate those models to achieve goals. So at the end of the day, the core insight is: if you reframe the data into the ontology, which is what we do, you can reframe the operations of a factory, an organization, kind of any process at scale.
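The "ontology as an SDK" idea above can be sketched in miniature: an agent calls one typed interface instead of each raw system's API. All class names, record shapes, and the two fake backends below are hypothetical; Palantir's actual Foundry/AIP ontology is of course far richer.

```python
# Toy illustration of an "ontology" layer: two heterogeneous backends
# mapped onto one object type, so one question spans the whole business.
# Every name here is made up for illustration.
from dataclasses import dataclass

@dataclass
class WorkOrder:
    order_id: str
    status: str
    quantity: int

class ERPSystem:
    """Fake ERP with its own record shape."""
    def fetch(self):
        return [{"id": "WO-1", "state": "OPEN", "qty": 40}]

class MESSystem:
    """Fake MES with a different record shape."""
    def rows(self):
        return [("WO-2", "done", 15)]

class Ontology:
    """Normalizes both backends into WorkOrder objects, the single
    'SDK' surface an agent (or human) queries."""
    def __init__(self, erp, mes):
        self.erp, self.mes = erp, mes

    def work_orders(self):
        orders = [WorkOrder(r["id"], r["state"].lower(), r["qty"])
                  for r in self.erp.fetch()]
        orders += [WorkOrder(oid, st.lower(), q)
                   for oid, st, q in self.mes.rows()]
        return orders

    def open_quantity(self):
        return sum(o.quantity for o in self.work_orders()
                   if o.status == "open")

onto = Ontology(ERPSystem(), MESSystem())
print(onto.open_quantity())  # 40 -- only WO-1 is still open
```

The point of the sketch is the shape, not the code: the agent never learns either backend's record format, which is what makes the ontology layer usable as a stable API surface for models.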

>> Maybe I'll start with you, Akash, on this, and we can go across the group. We've talked a lot today about open models. I think Nvidia's contributed more to these open-source libraries than any other company on the planet. We'll hear more about that later today. What do you see really being employed in industry today? We obviously have these frontier labs that are producing these incredible models, but what do you see in terms of hybridization, maybe the ensemble-model approach, maybe some Chinese open-source models that you see being leveraged by US companies, and so on? I'd be curious what you see out there on the front lines.

>> Yeah. Well, I think right now, at least from Palantir's perspective, half of our business is commercial and half is global government. And what I would say is, look, we always start with the big, heavy cloud models from the frontier lab companies, the proprietary models. I've got to be honest, they're moving so fast and they're so effective. So we use them to solve any problem; that's the first thing we try. But we're finding more and more, especially as we move to the edge and think about inferencing at the edge, that we need to take those models, train them on bespoke data, and then move toward more of an edge inferencing architecture that solves specific problems, either with a small language model or even something that may just be a traditional machine learning model, for a lot of problems where data becomes critical. So we're seeing the trend of open models, and the ability to take those models, quickly refine them, and get to something that solves a specific problem at a low SWaP, as being really critical.

>> Yeah, I will concur. It usually starts horizontal; the horizontal ones are more open by nature. But once you get into the use-case-specific applications, then they actually become more closed, because you have to retrain.

>> So as an example, we are doing a model that helps with machine programming. In the US we're going to be missing two million people by 2030, and we don't find the people that are skilled to do that. So to program a machine, there's very specific code. And so we take a general language model, a large language model, and apply it, but then we do all the RAG, all the training and retraining of it, in order to make this work. And then we are held liable for this actually really working, so you need to close it to some extent, so that you get a little bit of control over what the quality and the results are.

>> Yeah. Another line of questioning I'm interested in. Again, back to this question about the global AI race and the re-industrialization of America. You all have a role you're playing in that re-industrialization. Foxconn has made some major announcements about investments it's making in the United States, and frankly about data centers that you're building in partnership with Nvidia around the world. Talk to us a little bit about how you see your role in the re-industrialization of America.

>> Yeah. Based on our experience with the level of AI intelligence for manufacturing, we found there are three levels of intelligence. Level one is for simple and fixed operations. Level two is for simple but flexible operations. Level three is for, not simple, but complicated and flexible operations. Now, these three levels of intelligence require large compute power to do the training and the inferencing. And that's why we're building a lot of AI-related facilities in the States, as I mentioned, from Ohio to Texas to Wisconsin and California, in order to support the demand for this compute power. But besides the compute power that is needed, we also need talented technicians and engineers to apply and use these technologies.

>> And that's another challenge that we're facing.

>> That's something that we will have to do together with the government, to have education that is steered toward this.

>> Okay. Great, great point. Brett, I know that you and Elon are leading the charge on robotics being built in the US, right? And obviously China famously has some leads in autonomy and in robotics. Handicap it for us, if you will. How are you feeling today? What's your level of confidence that we're going to be able to compete effectively on the global stage, both from a capability and a cost perspective, in robotics with what's coming out of China?

>> I get this question a lot, and I have a very strong opinion here. I think what really matters today is that we see the core horizontal technology stack, in hardware and software and AI, come together to build general-purpose, almost human-like intelligence in the physical world. That's the key unlock. I can't stress that enough. People are looking past that in terms of cost and manufacturability and all this different stuff. We're at a stage now where we've got to go build synthetic humans, and it's incredibly hard. And I think what we are proud about at Figure is that we've now been able to show these real pockets of long-horizon intelligence done with neural nets. I think the shortcomings are that we couldn't put them into a house that's unseen today and do hours and hours and hours of work. We want to get there. We think we will get there. We've now seen all the ingredients of the puzzle pieces to be able to show that that is going to happen. I think we've been leading that globally. If you're going to say, okay, here's a robot now out in the real world that needs to run all day with low human intervention rates and low faults, we've been able to show that both in the commercial market and in the work we've done in-house. I think we are in a lot of ways miles ahead of the other competitors we see in China in that aspect. From there, once that is solved or close to solved, it does become a manufacturing game. This is a consumer electronics manufacturing process, not an automotive manufacturing process, which means we have different piece parts and electronics that we need to fabricate. We need to do FATP; we do integration and EOL testing. We do all that extremely well. I think that is an extremely tractable and surmountable problem, even at scale. I mean, we produce what, a billion phones a year, almost by hand in some ways. We can definitely create millions and millions of robots with traditional consumer electronics manufacturing processes. But what has not been solved is being able to do real end-to-end general-purpose robotics, and I think the US today, at least from what we know at Figure, is leading that head and shoulders globally. And we hope to continue to pull away. I'm going to be jumping on a flight right after this to make the engineering stand-up today, to try to continue that and pull ahead.

>> Brett, there's been a lot of discussion and doubts around a general-purpose biped robot that can operate in the home and in the factory. I mean, you dropped a video of one of your robots folding laundry, and everybody went nuts, like, "Hey, I want one of those." But I'm curious: where does the technology have to be before you can confidently serve both of these markets? You know, maybe put one of them into Young's factory as well.

>> Yeah.

>> We can do a deal here on stage. Let's get a deal. You ready to do a deal?

>> Um, yeah, let's do it.

>> There we go. Um, listen, I think one thing we're proud about is we have a robot running right now at our first commercial customer. It's running a 10-hour shift as we speak. We've been doing it for almost six months now. We've gotten operational readiness there. It's running autonomously. We've been tracking fault rates and human intervention rates per shift. Those have all been dropping, and performance has been rising every single month. So we now have a better line of sight to what it takes to launch a vertical job in the commercial world. What we're also trying to tackle, and what we learned a lot about last year, is that we need to solve the horizontal stack. We need to be able to build true general-purpose work, to put robots anywhere in the world. So we've basically refactored our entire autonomy and AI stack from scratch. It's all done now end to end with neural nets. We think it's the only way to really scale, and it's done at the foundation level so we can scale into any work. So maybe to answer your question more directly: we will see robots in the commercial workforce now and over the next year or two. We will try to put more and more out. We'll try to get cost down and the reliability of the robots up. We'll try to make it really real.

>> In parallel, we're trying to solve a robot in the home. It's been in my home now for three or four months. We're doing small parts of this, small parts of laundry and folding and dishes. We now have to connect all of it together, like connective tissue. We have to make it language-conditioned so we can talk to it. We have to put it in pixel space. So there's a bunch of work in AI models and foundation work and pre-training that we have to go do. But we do see a path. It's like a little light in the tunnel that we want to go at. It'll definitely be solvable this decade, hopefully in the next few years, so we can have this work. So, we're excited about that. But that is more like car autonomy in a city versus car autonomy on a highway; they're very different. One's super unstructured, high variability, and a lot of the engineering challenge is proportional to that variability. So we're going head-on with that problem. I would say you're going to see them in the workforce over the coming year or two, and then we will only launch a product in the home, or in the workforce at scale, once we feel confident in the product.

>> We will not do it early. We will not tele-operate them in market; we will not do any of this silly stuff. And I think that maybe differs from other groups in the space, but we are trying to build end-to-end autonomy with low human intervention rates in this commercial market. So once we're able to really sell into your home, into Brad's home, we want to be really confident it's going to really work and it's going to be really safe.

>> Can I give you my credit card now?

>> I'll take it. Okay, we have a little machine in the back. We're happy to punch it.

>> Um Aki, you know, you guys have long been important partners to the US government.

>> You know, we obviously have had a major changing of the guard over the course of the last year. A lot of people, including Jensen, have said that this administration is far more open to Silicon Valley and technology companies coming in, sharing their concerns, sharing their complaints, and getting deals done.

>> Talk to us about how you're feeling about the state of industry-government partnership today versus what you've seen over the course of the past many years.

>> Yeah. I mean, this is year 21 at Palantir for me, so I've seen it all, just to be very clear. It's like night and day, right? And that's not a knock on the prior administrations. I just think you had a particular status quo or way of doing things, and industry was often held at bay. And Silicon Valley, frankly, part of why we moved our headquarters to Denver when we went public was that it was very much a monoculture. You couldn't have disagreement. You couldn't actually discuss national security and the importance of national security to Silicon Valley, and vice versa. And so I think what you see is at the intersection of the administration taking on some really hard, gnarly challenges at

the same time as generative AI is coming through. When you think about AI infrastructure as the way of the future, when we think about how we're going to build things, how we're going to make sure we re-industrialize and bring a lot more capability back to being built in America, you have this perfect storm in the moment and the people. And what I've been incredibly encouraged by is the openness, right? Look, every administration gets something wrong. But that ability to say, "Hey, that doesn't look right," and to give feedback, and frankly for people to take calls and hear that feedback from industry and think through, hey, how do we make this great for government, for every American, and for industry, I think is unparalleled.

And I think that's what we're seeing right now. I think it's also, on the back of that, driving a huge resurgence from Silicon Valley to say, "Hey, we want to be here." I mean, Jensen has said this so many times: if you're doing work in technology, you need to spend time in DC. You need to share what you're doing, and you need to talk about how it's going to help every American out there as they think through this. Otherwise, there's going to be a lot of fear, uncertainty, and doubt out there. So we also have this responsibility as technologists now to share that vision, share what we're doing, share how it's actually going to help, and bring the American people along. And this administration is doing that in spades. So, we're thrilled by it.

>> You know, I'm curious if you guys have shared the same experiences vis-à-vis Washington. I mean, I think one of the critical things this administration did was to name David Sacks the czar of AI, so that there was a technologist, a point person, who could frankly walk into the White House and help break through some of the inertia that is just institutionally part of Washington. Have you all had a similar experience, as we just heard, as it pertains to getting stuff done in Washington?

>> Absolutely. The willingness to listen, but also then to take action and to implement, is really, really key, and in particular to be pro-business and understanding about where regulation is required and where it isn't. In the industrial world, we have business-to-business relationships where we have professional terms that govern the quality of service, where you don't need a lot of regulation and where it's about speed and really making it accessible. And so we find that a very friendly way to bring that technology into the United States.

>> Yeah, we felt the same way. You know, I think this government is quite open to the technologies and very supportive of the new technologies, right?

>> Yeah.

>> So, that's good.

>> So, Peter, Young, Brett, Aki, I want to thank you for coming on here. We saved the best for last and you guys were awesome. Thank you so much.

>> Thanks, guys. Thank you. So let's go back down on the floor with Kristina Partsinevelos. You have one more interview, a very special guest.

>> Yeah, also the best for last, and that would be CJ Muse from Cantor Fitzgerald, who's been in the industry forever. But he looks really young; I shouldn't preface it like that. Could you speak to China right now, given that's a topic? President Trump was supposed to be here; we're in DC. Some people say the markets moved on Monday, with the NASDAQ really climbing, because there's maybe some positive outcome given Trump's comments over the weekend that something may be worked out. So how are you looking at it from a chip analyst perspective, and how do you read that for the market going forward?

>> Yeah, absolutely, and Kristina, thank you for having me first. But I would say China is a very important market, and we don't want to cede that to local competitors inside China. And so I do think that there is a deal as part of the grand bargain where Nvidia, as well as AMD, will be able to ship into China. And so I would assume the deal on Thursday between President Trump and President Xi will include the ability to ship GPUs. Jensen's hopping on a plane after GTC, and I would imagine that will be an announcement we hear on Thursday.

>> That's pretty big. So you're telling our audience that you're expecting a positive outcome. Do you think it's actually priced into a lot of these names, then?

>> I think China's a very large market. You're talking $15-plus billion, and I think that opens up. So no, not priced in at all, and that helps both Nvidia as well as AMD. Yes.

>> Okay. So that is good news for those who are watching and figuring out which stocks to buy. We're going to talk about Washington, since we're here, and this involves a lot of collaboration between policy leaders, really getting the president and the White House administration on board, which we know they already are given all the initiatives thus far. But when you have these headlines from OpenAI or Nvidia or AMD, Intel, etc., they're building, they're collaborating, and yet you still need to bypass the red tape, you need to break ground, you need to find the power. Are you worried that perhaps the market is reacting too positively to news about outcomes that won't happen for another 10 years or so?

>> Too positive? No. But I do think that we are now in the realm where, excuse me, AI is now part of infrastructure. And that requires long-term planning, and that requires DC to get heavily involved. And so I think there's a great reason why Jensen and team and Nvidia are here in Washington DC. AI is critical to compete globally, right? And we need Washington on board. And so I do hope that they understand that China is investing heavily, and if we don't want to fall behind, we need to remove the red tape. We need to enable investment. And that is the greatest thing that we could possibly do. So I would say building out data centers, building power, enabling these data centers to be built, those are the biggest bottlenecks right now, outside of TSMC manufacturing GPUs. So that's the most critical part right now, and we need Washington DC's help.

>> Yeah. But to build a nuclear reactor is not going to happen even in the next two years or so. Could you be more specific? What do we need to do here in the US to improve that power grid?

>> So, I'd say remove the red tape and start investing today on power. That would be mission-critical to get the job done.

>> And it sounds like that is the direction we're going, which would mean that for the power suppliers in the United States, again, for the investors watching, and the names, the stocks that trade on that, there's still ample opportunity for growth, right? So, I know you focus on chips, but this could be really bullish for them.

>> Possibly. I'm not sure what's priced into those stocks.

>> Then, let's go back to Nvidia really quickly. You have earnings coming up. Is there anything major that you're expecting from the earnings report? I know we're a few weeks out and you haven't written your look-ahead yet, your preview I should say, but what would you be looking for?

>> I think what's most important really is that Nvidia is sold out not only for 2025; they're sold out for 2026. So they have perfect line of sight to what they're going to do. And at a very high level, we think they can do $8 of earnings next year. The Street's at $6.50. If we're right, this stock is way too cheap and it's going to push higher.

>> Do you benefit if you're right? Something like that. I'm just curious how it works behind the scenes with the companies I cover. Do you get a big bonus if you're that much higher and you're right? But maybe that's an offline comment, not for the public. On that note, CJ Muse, thank you so much. Guys, back over to you.

>> Thanks for having me.

>> So, Patrick, what were some of your big takeaways from today?

>> Well, first of all, I appreciated us addressing the full stack, and who knows, maybe in a year there will be another element to the stack and we'll add another layer. And I think it was just the comprehensiveness. I mean, we're going all the way from the investment funnel up front to literally building robots for homes and for commercial and industrial applications.

>> I mean, for me, it was just an optimism for American AI.

>> Yeah. You know, there's a reason we're doing this in Washington DC. They say that part of the success of Silicon Valley was that it was 3,000 miles away from Washington.

>> Right.

>> Today everything is happening in partnership. I think this entire ecosystem is led incredibly by Nvidia and Jensen, who now understand the importance of coordinating with Washington and coordinating with each other. We're taking on something that is 10 times the size of the Manhattan Project, something of profound consequence to our national security and our national economic security. I think it was a great choice to do it here in Washington. And what we heard, I thought, was sober about the path ahead: it's not going to be a line straight up and to the right, but we have these phases that we're going to go through in terms of AI. And I leave here a lot more confident about the path ahead for the US and about the coordination between Washington and Silicon Valley.

>> Yeah, I'm with you there, Brad. I mean, the prosperity that we can provide, the diseases we can cure, and the amount of people we can put back to work. That's what I'm here for.

>> No doubt about it. So Christina, any final thoughts from you?

>> It seems, just after the last conversation I had with CJ Muse, that we're going to expect some positive outcomes from the conversations with China, especially with Jensen traveling overseas to Korea. So I think that's maybe a positive thing to read into this. The other thing is Nokia. You were on the panel, so perhaps you didn't hear: the news came out about Nvidia's $1 billion investment in Nokia. Shares were halted, and this could speak to the collaborations between telecom and Nvidia here in the United States, and how we evolve towards 6G and how that will update just through software. It's incredible. So I'm sure we're going to hear more about that news, but the stock was initially halted, and that's something that just happened within the last hour or so. But overall, we're making news here. That's what this event is about, guys.

>> No, no doubt about it. It's a big day. We woke up with the president in Japan making news. OpenAI and Microsoft made news, and we're going to hear a lot more out of Nvidia in just the next hour or two. So, we're out of time. It's been an incredible day. Thanks to you, Patrick. Thanks to the entire team here at NVIDIA. Jensen Huang is about to take the keynote stage.

>> Brad, thank you. And our thanks to the entire Nvidia team for bringing together these amazing guests and all the folks from around the world for coming to Nvidia GTC Washington. Thanks for watching. Until next time.

[Music]
