
The Most Important AI Stories This Week

By The AI Daily Brief: Artificial Intelligence News

Summary

Topics Covered

  • Flash Models Outperform Pro
  • OpenAI Eyes $100B Raise
  • Amazon Centralizes AI Leadership
  • Data Center Funding Falters
  • Moratorium Hands AI to China

Full Transcript

From Google Flash to Amazon moves to more OpenAI fundraising to Bernie Sanders' proposed moratorium on data center construction, we are talking about the most important stories in AI. This week we have got to dig into so many stories that we've missed. It's been a weird week. On the one hand, we are prepping for the end of the year, which means I'm producing a ton of shows for the next couple of weeks as everyone goes off for the holidays. But there's also been a couple of big releases which consumed entire shows. For example, we had ChatGPT Images 1.5, which sort of shoved things out of the way. So, what we're doing today is kind of the inverse.

Instead of a full episode on a single topic, we're going to be talking about a ton of topics. It's sort of an extended headlines, except for the fact that many of these stories could easily be mains in their own right. And we are going to kick off with another model release.

Now, this one is really interesting because on the one hand, it is not the premier model release. And yet, on the other hand, according to the company behind it, it kind of seems like it is.

Around the middle of the week, Google staffers started cryptically or not so cryptically posting lightning bolts all over Twitter/X. Now, given the pattern of how we got Gemini 2.5 Pro followed by 2.5 Flash, everyone assumed that we were getting Gemini 3 Flash, and that's what the lightning bolts referred to. And indeed, that is what we got. What was unexpected is just how good Gemini 3 Flash appears to be. Sundar Pichai tweets, "We're back in a flash. Gemini 3 Flash is our latest model with frontier intelligence built for lightning speed and pushing the Pareto frontier of performance and efficiency. It outperforms 2.5 Pro while being three times faster at a fraction of the cost."

And they really just kept coming back to the speed. Sundar tweeted later, "Watch below as 3 Flash generates complex graphics, 3D models, and a web app before the previous generation even finishes processing." Google AI Studio product lead Logan Kilpatrick writes, "Gemini 3 Flash punches way above its weight class, surpassing 2.5 Pro on many benchmarks while being much cheaper, faster, and more token efficient." And if you're wondering why all the comparisons are to 2.5 Pro instead of 3 Pro, this seems to be a prerogative from inside Google. Google chief scientist Jeff Dean writes, "One of the things we strive to do with each new Gemini release is to make the new Flash model as good or better than the previous generation's Pro model." So, this is explicitly a thing that they were attempting. Gemini 3 Flash exceeds 2.5 Pro on nearly every metric, often by very large margins, and almost matches Gemini 3 Pro on most benchmarks.

And honestly, as much as the marketing comparison is to 2.5 Pro, it's very clear that even relative to 3 Pro, inside Google, a lot of people have this as their favorite workhorse daily driver type model. Noam Shazeer writes, "We've packed Gemini 3's Pro-grade reasoning into a leaner model with Flash-level latency, efficiency, and cost. It's my favorite model to use. The latency feels like a real conversation with the deep intelligence intact." Demis Hassabis calls it the best pound-for-pound model out there. And I actually wonder if this idea of pound-for-pound models is going to become something that we think about more as we move into more complex workloads where companies actually are shifting between the state-of-the-art and these more token efficient models.

On the benchmarks, Logan Kilpatrick points out the massive jump from the middle bottom of the pack on Artificial Analysis for 2.5 Flash all the way up to the very top, behind only Gemini 3 Pro and GPT-5.2. 3 Flash also did really well on ARC. Simon Smith says on ARC-AGI-1, it scores just behind GPT-5.2 Extra High for about 5.5 times less cost per task. And it turns out that at least from a first impression basis, it's not just Google insiders who are really excited about this model. AI entrepreneur Bindu Reddy writes, "Gemini 3 Flash: intelligence too cheap to meter. The Gemini Flash series has been one of the best small models ever. We have 100 times more usage on Flash compared to the Pro version. Flash 3 seems to be Google's best model yet." Another commentator writes, "I had early access to Gemini 3 Flash and it shocked my vibe test, as I walked in with 2.5 Pro-to-Flash expectations. Looking at the evals now, it all makes sense. The latent story here from my point of view is Google has cannibalized a big chunk of 3.0 Pro use cases. The fact that Google pushed this out shortly after 3 makes me think they already know future 3.x Pro will have stellar performance. Flash is now the best agentic model hands down for its price point. The lower score on HLE and GPQA Diamond versus Pro means it is not as knowledgeable as Pro, which makes sense.

The choice is clear. Flash 3.0 should be the de facto agentic model unless you are in a knowledge-heavy domain, but I suspect even there, with sufficient context management, you can get good value out of Flash 3.0." Oh, also giving a little summing up to the year, he writes, "Gemini LLMs have been a black swan for a big chunk of 2025. I doubt any outsider could have predicted total Pareto frontier domination by the Google franchise by end of year."

Now, the one thing that people are pointing out just for the sake of some comparison is that the hallucination rate seems very high relative to other models. In the Artificial Analysis Omniscience hallucination rate test, which measures, quote, how often the model answers incorrectly when it should have refused or admitted to not knowing the answer, it was at the very top of the heap at 91%. So that is something certainly to keep an eye on. Still, overall, people are incredibly excited about this release. Google also announced that in addition, they are expanding access to the Pro models and providing paid subscribers with higher limits. And one practical operator's note: you might have noticed that the model selector choices have changed from Fast and Pro to now Fast, Thinking, and Pro. Google's Josh Woodward gives the key: Fast equals 3 Flash. Thinking equals 3 Flash but with thinking. And Pro equals 3 Pro with thinking. So if you want 3 Pro, you have to select Pro, not just Thinking. Anyways, I haven't had too much time to really dig in and test, but so far it seems like a great model and another good addition to our toolkit.

Next up, we move to some big breaking investment stories from the last couple of days. Bombshell OpenAI fundraising news as Amazon considers making a $10 billion investment. The Information reports that Amazon is in talks to invest $10 billion or more into OpenAI, citing three people familiar with the discussions. The valuation would be higher than $500 billion, which was the valuation struck for the tender offer completed in October. Talks commenced in October after OpenAI completed their corporate restructuring, which gave OpenAI the ability to sell normal common stock to investors as well as putting a final end to their exclusive compute arrangement with Microsoft Azure. The report highlighted several potential aspects of an OpenAI Amazon mega deal.

First, it would of course help OpenAI deal with immediate cost pressures.

They've committed to spending $38 billion renting servers from AWS over the next 7 years, so an equity-for-compute deal could be more cost effective. The Information's sources also said that OpenAI making use of Amazon's Trainium chips could be part of the deal. Amazon has been pushing hard to drive Trainium adoption, including making the use of Trainium a condition of their investment in Anthropic earlier this year. Now, crucially, Amazon won't be able to offer OpenAI models through their Bedrock platform. Microsoft still has the exclusive right to offer OpenAI models in cloud services. However, Amazon and OpenAI have apparently been discussing other partnership opportunities. One of the sources said that agentic e-commerce opportunities have been discussed, while OpenAI also wants to sell enterprise ChatGPT seats to Amazon staff.

Now, it's worth reinforcing that $10 billion is a very big number that could go a long way to bridging the gap to OpenAI's anticipated IPO. OpenAI's 2025 fundraising total was $40 billion, and they're projecting a cash burn of $100 billion over the next four years. The deal could also shake up allegiances within the AI space. At that point, OpenAI would be partnered with all three of the major cloud providers, with everyone but Google also on the cap table. More complicated are OpenAI's chip deals. They already have plans in motion to develop their own custom silicon with Broadcom, and Amazon's Trainium adds to their diversification away from Nvidia. But for the moment, Nvidia is still the only chip producer with the quality and capacity to fill OpenAI's new data centers. Nvidia is also an existing shareholder in OpenAI and announced a deal in September to invest up to $100 billion or more. It was later revealed that this fundraising was highly conditional and basically gave Nvidia the option to invest if they so choose. There is another whole dimension to this around whether this suggests that Amazon's partnership with Anthropic is drifting apart, or if this just reflects the fact that in AI today, everyone is going to partner with everyone.

Whatever the case, obviously markets liked it. Amazon was up 2.3% in overnight trading. And while some people immediately rushed to say how the news is bullish for some and bearish for others, I kind of think Amit Is Investing had it right here when he wrote, "Bullish Amazon, not really bearish Nvidia, AMD, or Broadcom. In my opinion, the market may interpret it as bearish, but the chip ecosystem is massive and only growing, even with custom ASICs. The bigger question is how OpenAI continues to get every single important tech company to invest in them and tie part of their success to OpenAI's ability to scale."

Now, that was not the only fundraising news from OpenAI this week. Later in the week, The Information again reported that OpenAI had even more fundraising in the works. Talks are reportedly underway to raise not $10 billion, but tens of billions of dollars, with sources saying the final number could be as much as $100 billion. The valuation is said to be at $750 billion. Now, sources noted that the talks are still early and nothing has been finalized. This would obviously be a 50% jump in valuation since OpenAI's tender offer in October, but it might not be as crazy as it seems. For example, if they can actually source $100 billion in new investment, then that makes up a lot of the difference for a post-money valuation.

Still, the new valuation wouldn't leave a lot of room for OpenAI's IPO, where they're reportedly targeting a trillion-dollar valuation. Now, there's no precedent for an IPO that size, and if they have to reach even further, the IPO could start to really stretch the limits of public market liquidity. Multi-trillion dollar companies exist, but they generally don't have a wave of venture fund managers who need to sell their stock all at the same time. The bankers working on next year's batch of IPOs from OpenAI, Anthropic, and SpaceX are apparently thinking about this problem already. The Information reports that bankers are considering staggered lockups for existing investors. The standard arrangement is a 90- or 180-day cliff, but allowing so much selling all at once could easily overwhelm market liquidity. We are really getting to the point where any news about OpenAI fundraising or dealmaking is kind of a Rorschach test on what you think of OpenAI.

Daniel Newman writes, "The smartest and most sophisticated investors are all piling into OpenAI at eye-watering valuations while a wildly bearish narrative spreads about its demise. I'll bet on the smartest and most sophisticated. AI will be more than fine."

Now, staying on the theme of OpenAI and Amazon, but shifting over to the Amazon side of that deal, Amazon also made further news with the announcement of a new AI-focused department. CEO Andy Jassy announced on Wednesday that veteran AWS executive Peter DeSantis would lead a new organization that pulls their disparate AI initiatives under one roof. The organization will bring together the AI models team that's in charge of training Nova, as well as their AGI Labs team that's currently working on computer use agents. Silicon development, including the Trainium chip and Amazon's quantum compute project, will also be folded into the new AI org. Jassy commented, "I believe we are at this inflection point with several of our new technologies that will power a significant amount of future customer experiences." Now, DeSantis is currently the senior VP of AWS and has been with Amazon for almost three decades. Jassy wrote that DeSantis led some of the most transformative technologies in computing history. For example, DeSantis led the launch of EC2, which is the core AWS service, as well as managing their custom silicon team and a dozen AWS infrastructure and product innovations. Jassy concluded, "The path ahead is full of opportunity. With the foundation that's been built, the traction we're seeing, and Peter's leadership bringing unified focus to these technologies, we're well positioned to lead and deliver meaningful capabilities for our customers."

Now, alongside the reorg, there's also a changing of the guard for AI at Amazon.

Jassy announced that Rohit Prasad, Amazon's head scientist of AGI, is leaving the company at the end of the year. Prasad had been with Amazon since 2013 and served as the head scientist for the Alexa project from its earliest days. While Jassy was magnanimous, most people are kind of assuming that Amazon's stagnant year in AI contributed to Prasad's departure. As another potentially important footnote, robotics specialist Pieter Abbeel has been placed in charge of the frontier models team. Pieter joined Amazon in 2024 after an acqui-hire deal with his robotics startup Covariant. They were the first to launch a commercial foundation model for embodied AI in March of last year, and first impressions are positive. Computer scientist Pedro Domingos writes, "Unlike Meta, Microsoft, or Apple, Amazon now has one of the best AI researchers in the world leading its AI efforts. Great move, and godspeed, Pieter Abbeel."

Now, on the one hand, we tend to take leadership shakeups as signs of trouble, but there is a very clear pattern here.

Just like Google went through a hard period where they had to reorganize everything under a single roof and get everyone aligned in a common focus, Meta has been going through that all year.

And it seems like Amazon is following suit. Now, my guess is that Amazon is poised for a much better '26 than they had a '25 because of all these moves. And so, I'm excited to see what the new moves produce.

Shifting now to a little bit of market news. Some think that a busted debt deal for Oracle could be the pin that pops the AI bubble. The Financial Times reports that Blue Owl Capital has declined to fund Oracle's next big project, a $10 billion data center in Saline Township, Michigan. Blue Owl is a private equity group that has been one of Oracle's primary funding partners over the past year. Sources said that negotiations have stalled out and the agreement to fund the project will not go forward. The FT reported that Blackstone is in talks to step up as a backup, but they're yet to sign a deal, putting the project in jeopardy.

Now, Blue Owl had been structuring these deals through special purpose vehicles.

Essentially, these are new companies that build and own the physical assets and then lease the facility to Oracle once complete. Other large investors, like pension funds, family offices, and sovereign wealth funds, then buy debt and equity issued through the special purpose vehicle. The data center and associated revenues are typically put up as collateral for the debt. Reportedly, investors pushed for stricter leasing and debt terms for this deal in light of shifting market sentiment and fears about Oracle's growing debt load. Now, Oracle for their part played down the issue, commenting, "Our development partner, Related Digital, selected the best equity partner from a competitive group of options, which in this instance was not Blue Owl. Final negotiations for their equity deal are moving forward on schedule and according to plan." A Related Digital spokesperson added, "The notion that Blue Owl walked away is unequivocally false. This is an exceptional project that drew significant interest from equity partners."

Still, the news has markets rattled, with Oracle stock plunging by 5.4% on Wednesday after the article was published. The stock is down 45% since its all-time high in September, which is of course when they first announced their $300 billion OpenAI deal. It is now trading below levels from before the deal. Andreas Steno Larsen writes, "This is how bull markets end. If debt markets dry up, i.e. the availability of money, then the AI trade is in trouble. This is the key question for 2026." Wherever we are, it's very clear that markets are on the edge heading into the end of the year. This is the first indication we've had that private equity firms don't have an infinite appetite for data center funding, but it's not obviously the end of the trend. This could all blow over if Oracle finds a new partner, and it might even just be a seasonal issue, with few investors looking to make new allocations in the final weeks of the year. Still, the jitters are evident and becoming much more frequent as we close the book on 2025.

Next up, another story that could easily be a main. ChatGPT is ramping up their third-party integrations with the release of their app store. There is now an app directory allowing users to browse available integrations, and OpenAI has also rebranded their connectors feature, now calling them apps as well. According to a support page, chat connectors are now "apps with chat," deep research connectors are now "apps with deep research," and synced connectors are now "apps with sync." ChatGPT is one step closer to becoming an everything app, or an AI operating system rather than just a chatbot. Earlier in the week, OpenAI hired former Shopify VP of product Glen Coates as their new head of app platform.

And his announcement made it very clear what the plan is, writing, "We're going to find out what happens if you architect an OS ground-up with a genius at its core that can use its apps just like you can. I can't think of anything more exciting to work on." And with the app store now live, we had a wave of third-party integration announcements.

Salesforce CEO Marc Benioff wrote, "Welcome Agentforce Sales and ChatGPT. Our Christmas gift: the world's number one sales cloud is now alive inside the world's number one LLM." DoorDash CTO Andy Fang posted, "DoorDash is now partnering with OpenAI to bring grocery shopping directly into ChatGPT. You can now turn a recipe idea instantly into a shoppable grocery list in the chat and seamlessly check out on DoorDash for on-demand delivery." In an announcement post by CEO of Applications Fiji Simo, a whole slew of other partners including Adobe, Airtable, Apple Music, Clay, Lovable, OpenTable, and Replit are all now also available.

I would say in general, the attitude among observers is interested observation. Developer Nick Dobos writes, "Intriguing but odd. Available on mobile, but buried deep. Photoshop is nice combined with image gen. Nice to see some common business ones like Linear and Notion. Some are super weird. Fairly uninteresting first batch. Who is connecting Peloton or Zillow? Hoping we see more weird and creative stuff now that you can finally submit apps. I need to sit down and find the time to actually jam on some app ideas." Some commentators, on the other hand, are a little bit more skeptical of the whole idea of an app store. James H asked Marc Benioff, "Wouldn't it make more sense to have GPT inside Salesforce instead of the other way around?" And this is kind of the thing to watch going into next year. Now, if you are interested in the whole chat store conversation, on Sunday's big think episode, I'm reviewing some big predictions that others have had, and I actually talk a little bit about a prediction that ChatGPT becomes the next great app platform.

One last OpenAI update: we got more details on OpenAI's deal with Disney.

Bloomberg reports that no cash changed hands as part of the deal, with Disney taking payment exclusively in stock and warrants. Disney received a billion dollars in stock upfront and also received an option to buy an undisclosed additional stake in the company at a later date. While Bloomberg's source didn't get into specifics about the terms of the warrants, they did say the deal was structured to align the companies' financial interests if Sora is a hit.

Alongside the financial terms, CEO Bob Iger disclosed that the deal only had a one-year exclusivity period, after which Disney could sign similar deals with rival AI labs. While the structure means Disney is waiving a cash licensing fee, Iger believes in the upside potential, stating, "We want to participate in what Sam is creating, what his team is creating. We think this is a good investment for the company." Iger also discussed the rationale for cutting a deal instead of fighting the AI industry in court. He said, "No human generation has ever stood in the way of technological advance, and we don't intend to try. We've always felt that if it's going to happen, including disruption of our current business models, then we should get on board."

Now, in our final section of this extended news episode, we shift our attention more to the government and public side of AI, starting with a new announcement from the Trump administration of a department focused on AI infrastructure that they're dubbing the US Tech Force. According to the government website, the Tech Force will recruit, in their words, an elite corps of engineers to build the next generation of government technology. The administration is aiming for around a thousand hires. Recruitment will focus on early career technologists from traditional recruiting channels.

Experienced engineering managers will also be seconded from private sector partners. These partners include AWS, Apple, Google, Dell, Microsoft, Nvidia, OpenAI, Oracle, Palantir, and Salesforce. The government is looking for people with expertise in software engineering, AI, cybersecurity, data analytics, and technical project management. Importantly, the Tech Force won't be a military division. However, it's not clear where in the government hierarchy it actually will be located. Teams of workers will be allocated across all government departments, reporting directly to individual agency heads. The idea seems kind of like embedding forward deployed engineering teams into government agencies to drive AI and tech projects. Still, the recruiting ad compared Tech Force to the storied US engineering corps of World War II. The website said engineering teams would tackle the most complex and large scale civic and defense challenges of our era, from administering critical financial infrastructure at the Treasury Department to advancing cutting edge programs at the Department of Defense and everything in between. Essentially, the initiative is an all-of-government effort to completely rebuild software and hardware infrastructure.

Scott Kupor tweeted, "Your government needs you to transform the federal government through modern software development. If you're up for a huge challenge, join the country's best and brightest technologists in the inaugural class of US Tech Force. We're partnering with the top US technology companies to take on this challenge. You'll learn a ton, network across the most important government agencies and private sector companies, ultimately creating powerful career opportunities, whether you want to continue in the public sector or join the private sector." And indeed, the initiative appears to be in part a jobs program for younger tech workers who've been hit particularly hard by current conditions in the labor market. Recruits will serve 2-year stints in government and then will be eligible for preferential hiring at those partners.

AJ Wald writes, "I think Tech Force is needed, but anyone that has worked in tech knows it takes more than 2 years. It should be 5 years and all student loans paid off if they finish."

Staying on White House-related policy, Nvidia is considering ramping up production of H200s to meet Chinese demand, reports Reuters. Nvidia has told Chinese clients that it's evaluating adding production capacity for its powerful H200 AI chips after orders exceeded its current output level. Now, that would be a very sharp departure from recent reporting that suggested that Beijing would heavily ration H200s. ByteDance and Alibaba have reportedly already placed large orders. And while Reuters stated that Beijing is yet to greenlight any imports, the Chinese tech companies clearly want to get their orders in as soon as possible.

Now, the news raises a few implications for US chip policy. First, it gives hope that the US strategy of flooding China with US chips isn't too late. Nori Chiou, the investment director at White Oak Capital Partners, also confirmed that H200s far exceed anything being produced by Huawei. He told Reuters, "Its compute performance is approximately two to three times that of the most advanced domestically produced accelerators. I'm already observing many cloud service providers and enterprise customers aggressively placing large orders and lobbying the government to relax restrictions on a conditional basis." The surge of orders also revisits concerns about Nvidia servicing Chinese demand over the needs of the domestic industry. TSMC is capacity constrained, so chip production is a bit of a zero-sum game at the moment. US lawmakers had proposed a bill last month to force chipmakers to give preferential treatment to domestic firms, but it was set aside after Jensen Huang visited Washington. Nvidia's decision could cause a backlash if we see a shortage of chips hold up data center construction.

And indeed, it is with data center construction and the politics around it that we will conclude this episode. I'm sure you have seen this, but Senator Bernie Sanders is calling for a moratorium on data center construction in order to slow down the AI race. In a new video announcing the policy proposal, he referenced multiple projected harms, including isolation of children through chatbot use and worker displacement. Sanders said, "There is a whole lot about AI and robotics that needs to be discussed and analyzed. But one thing is for sure, this process is moving very, very quickly, and we need to slow it down. I will be pushing for a moratorium on the construction of data centers that are powering the unregulated sprint to develop and deploy AI. That moratorium will give democracy a chance to catch up and ensure that the benefits of technology work for all of us, not just the wealthiest people on Earth."

Now, Sanders has, of course, been increasingly focused on AI labor disruption as a core policy discussion over recent months. In July, he attached the AI productivity boom to his calls for a 4-day work week. And more recently, he wrote multiple op-eds fleshing out his viewpoint. His central thesis was that democratic input is required on a technology of such transformative potential, and that the conversation around policies to protect workers needs to start now, before the bulk of the damage is done. Now, even if you don't agree with Bernie's labor-centric and democratic socialist political views, a lot of folks agree that elevating this discussion is a pretty reasonable place to start.

Previously, I even said that I liked that Bernie Sanders was focusing on this 4-day work week idea, because it on the one hand acknowledges the inevitability of AI while also trying to distribute the benefits more widely. In what will be unsurprising to many of you, I feel very differently about this type of moratorium, and lots of others seem to as well. AI media creator Matthew Berman wrote, "I hate everything about this video. Cherry-picking the doomer talking points to scare people is gross. A moratorium on data centers would hand the AI race to China. They are wishing we stopped building. I really don't want a 90-year-old to dictate tech policy."

And by the way, if you're thinking that the narrative of an AI pause sounds pretty familiar, then you're not far off. Sanders has been consulting on policy with Geoffrey Hinton, who was central to advocating for an AI pause in 2023 as part of a coordinated effort led by the Future of Life Institute. That push, believe it or not, is still ongoing, with the Future of Life Institute now conducting polling in an attempt to establish popular support for their cause.

It is very clear that AI development and data center policy are going to be key in next year's midterms. Sanders is now planting his flag at the extreme end of the political discussion, but he's far from alone, and this is not right now a left-right issue, or at least there are members of both the left and the right that are carving out this anti-AI political territory. Florida Governor Ron DeSantis hasn't gone so far as to lock arms with Bernie Sanders yet, but he did recently argue that data centers have zero economic benefit for locals. He said recently, "Once it's done, it employs like half a dozen people." And these tech companies will likely bring in foreigners on some visa.

They're not going to hire from your local community. That's just not what they do. Now, it is way beyond the scope of this particular show to dig into how much I think the people building data centers have failed to recruit communities to their cause and to actually make themselves valuable to those communities, but I'm sure that's something we'll talk about a lot more in 2026.

When it comes to the moratorium specifically, many economically minded folks on X pointed out that this would pretty much have the exact opposite effect of Bernie's stated goal of ensuring that the benefits of technology work for all of us. Austen Allred writes, "Making it so no one else can build data centers literally locks it into the richest companies owning AI." Nick Dobos again writes, "Constrain AI compute and free tiers will vanish and only the rich will have the best AI. Imagine going to school and only the rich have AI to help learn. This will do the exact opposite thing you're aiming for." Dystopia Breaker writes, "I agree with many of your concerns around risk and governance, but NIMBYing the data centers is going to make the problem worse, not better. If you halted compute buildout here, you're ensuring that only the relatively wealthy have access to this technology that you yourself describe as transformative, through the same price effects as NIMBYism in housing."

Some took this as an example of the work that the AI and tech industry need to do to actually align with people's perspectives. Nick Arner writes, "It's increasingly important that the tech industry effectively communicate and diffuse the benefits of AI faster, or this view will become more popular and widespread."

I'll close with a take from Perita Mistra, a computational biologist. She writes, "It is so sad truly to think that data centers are only helping Mark Zuckerberg and all the other billionaires when there are people waiting for a cure to their disease with 5 years to live. Discovery is the zeal of life. You want to slow down scientific discovery? You think only Mark Zuckerberg is affected by this?

Come on, we all know the importance of scientific discovery. It's why we have the World Wide Web. Thank god they did not cut that down. You know who needs data centers? We need data centers. The countless small businesses, scientists, and engineers that are spending all their time on Earth trying to cure disease. In lieu of optimized hardware, you need more data centers for state-of-the-art cancer discovery. It is a desperate need for many. Some of us are trying to use AI to fight diseases. Some can clown on us, but the smartest people in the world see the need for AI for biology as obvious. Some even see it as a moral imperative. Blanket deceleration is what you are proposing. The data center demands for biology are exploding, and there are consequences for not having a nuanced view. This is important for all American patients, all families, all people on Earth."

Like I said, there is so much to discuss around this. It will be an unavoidable part of the conversation in 2026. For now, we will leave it there and we will close this episode. Hopefully, this gives you a sense of all the other things that were happening this week. In any case, I appreciate you listening or watching as always.
