Elon Musk Just Gave a Shocking WARNING that Everyone Should Hear
By Glenn Beck
Summary
Topics Covered
- Retirement Savings Become Irrelevant
- Singularity Accelerates Unpredictably Now
- Tesla Robots Displace Factory Workers First
- Negative Income Tax Replaces Welfare
- AI Demands Market Safety Valve
Full Transcript
I want to play something that Elon Musk just said on a podcast just a couple of days ago that is shocking. Shocking. I want you to listen to this. Let me take you through it.
>> Yeah. Well, one side recommendation I have is: don't worry about squirreling money away for retirement in 10 or 20 years. It won't matter.
>> No.
>> Okay. Either we're not going to be here, or it just... it's not...
>> Stop. Only Elon Musk can just say something like that and then just move on.
>> Yeah. Don't save for retirement. You know, it's not going to matter. Okay. Now listen to the rest. Go ahead.
>> ...to save for retirement. If any of the things that we've said are true, saving for retirement will be irrelevant.
>> The services will be there to support you. You'll have the home. You'll have the healthcare. You'll have the entertainment.
>> The way this unfolds is fundamentally impossible to predict because of self-improvement of the AI and the accelerating timeline.
>> Yeah. It's called singularity for a reason.
>> Yeah. Exactly.
>> I don't know what happens after the event horizon.
>> Exactly. You can never see past the black hole, the event horizon. The light cone.
>> I mean, Ray [Kurzweil] has the singularity out way too far. I mean, what's your timeline?
>> We're in the singularity.
>> Well, we are in the singularity for sure. We're in the midst of it right now for sure.
>> And we just feel... we're in this beautiful sweet spot, which is, you know...
>> The roller coaster...
>> Yeah, exactly. That's a great analogy. It's like that feeling. You're at the top of the roller coaster and you're about to go.
>> Yeah. But you know, it's going to be a lot of G's when you hit it.
>> It's like, I don't just have courtside seats. I'm on the court.
>> Exactly.
>> And it still blows my mind, sometimes multiple times a week.
>> Yeah.
>> Just when I think, "Wow," then it's like two days later, more wow.
>> Yeah. Exponential wow.
>> Yeah. I think we'll hit AGI next year, in '27.
>> You got that? Do you hear that? He thinks we're going to hit AGI next year, in '27.
AGI is artificial general intelligence. That means the computer, the AI system, is smarter at everything than any human. It is better at, name the topic, than the best human you can find, and it can do everything a human can do, better than a human. It's not superintelligence; it's just better than humans at everything. He thinks we're going to hit that. Ten years ago, some people said there wasn't even a chance, that we'd never get there. He's saying we're going to hit it next year.
I believe we are going to hit it. I think there's a chance the world is going to be different by 2028. It's just going to be different by 2028, and we have got to prepare for this. So our last caller was saying, you know, Smoot-Hawley, you can use tariffs to protect.
There is a conversation that we have to have right now, because of artificial intelligence, automation, robotics: entire categories of work dissolving quickly. Not in a generation, maybe not even in a decade, maybe in the next few years. Factory workers first.
You know, I reached out to Elon Musk and said, for my museum, I want one of the first Tesla robots. I want to keep it in the box and keep it in the museum: the first mainstream robot. You know what he said?
All of the robots that Tesla will be making for the first, I think he said two years, he may have said one, will be for Tesla. Think of how many robots he's going to be making in a two-year period. And all of them are going to be for Tesla.
That is a workforce army. And that's coming quickly, really, really fast. It'll go: the factory worker, then the truck driver, then the coder, then the accountant, then the analyst who thought he was immune from everything because, you know, "I work with ideas." The ground is shifting quickly. So, what is it we should talk about?
Well, the world is going to talk about universal basic income, and I am dead set against that. But that's the only thing anybody is talking about. I went back and did my homework over the last couple of days on Milton Friedman. Friedman was kind of for UBI, but a different kind of UBI, and he may have the answer, at least as a bridge.
And I want to talk to you about that here in a second. When the world is in a panic, that's not the time to discuss things. You discuss things before you're in a panic. You make decisions before there's an emergency. And now is that time, because once there's an emergency, the loudest voice comes in, usually with the simplest answer, and everybody says, "Yeah, yeah, go with that." And that answer is going to be universal basic income. That is the modern version of bread and circuses. And make no mistake: the communists, the social planners, the Davos crowd are going to offer it not as a temporary bridge, but as a permanent arrangement with you.
A managed society: a population that is pacified, production centralized, dependency normalized. They're already talking about it. Read Yuval Noah Harari's work. He is one of the main thinkers on the left and among the elites, and he talks about a "useless class" of people. We cannot look at it that way, and people are going to go for this not because they love collectivism, but because nobody offered them another path.
I have been talking about this, and trying to get people to talk about this, for a while, and there's a guy we have to revisit: Milton Friedman. Honestly, he's a guy who surprised me by being in this debate, or at least by the side he's on. I mean, here's the free market economist, the man accused of being a defender of the coldest kind of capitalism, where they just don't care about babies and children and food. He's the godfather of deregulation.
He actually supported a version of basic income, but not the kind that's being sold today. He called it the negative income tax. Listen to what he actually proposed. His idea, the negative income tax, works like this.
You eliminate the sprawling welfare state. Okay? Really important that we do all of these things. Eliminate the welfare state: food stamps, housing subsidies, overlapping programs, bureaucracy. Replace all of that with a simple income floor that everybody gets. If you earn below a certain threshold, the government sends you supplemental income, but as you earn more, the support phases out very gradually. That way, you're not being punished for working. And that last part is really critical, because under the welfare system, if you earn a dollar, you lose a dollar in benefits. So the rational response is: why would I earn a dollar? I'm never going to make enough to really be happy, so I'll just live off the government.
Friedman's system preserves incentive. Okay? You always gain more by working more, because he knew that you can't make people comfortable in the absence of work. Okay.
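The arithmetic behind that incentive can be sketched in a few lines. This is a minimal illustrative model of a negative income tax, assuming a hypothetical $20,000 income floor and a 50% phase-out rate; these figures are for illustration only and are not Friedman's actual proposed numbers.

```python
def nit_supplement(earned, floor=20_000, phaseout=0.50):
    """Government supplement under a simple negative income tax.

    The supplement equals `floor` at zero earnings and shrinks by
    `phaseout` dollars for every dollar earned, reaching zero at
    floor / phaseout ($40,000 with these illustrative parameters).
    """
    return max(0.0, floor - phaseout * earned)

def total_income(earned, **kw):
    # The key property Friedman emphasized: total income always rises
    # with earnings, unlike a dollar-for-dollar welfare clawback where
    # earning a dollar costs a dollar of benefits.
    return earned + nit_supplement(earned, **kw)

for earned in (0, 10_000, 20_000, 40_000):
    print(f"earned ${earned:>6,} -> total ${total_income(earned):>9,.0f}")
```

With these assumed parameters, someone earning nothing receives $20,000, someone earning $10,000 keeps a $15,000 supplement for $25,000 total, and every extra dollar earned always increases total income, which is the work incentive the transcript describes.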
And the system that we have now is honestly designed to keep people destitute and to destroy productivity. This is the opposite. Okay? I'm shocked that Milton Friedman was thinking this way, but he understood something fundamental.
Markets require stability.
Free societies require order. So when we're talking AI: technological advancement is going to become so severe at some point that AI could create pockets of severe displacement. And with that, you'll either get violent populism, authoritarian redistribution of wealth, or a market-compatible safety valve. And that's what his negative income tax was: a pressure release without central planning. This is absolutely critical: keep the choice decentralized.
You, the individual, still get to choose how you spend the money. The federal government doesn't decide which apartment you can rent, what food you can buy, which training program you have to attend. None of that. You shrink the state even as you create a floor for everybody. This is very different from modern UBI, which is tied to digital IDs, programmable currency, behavioral compliance. I mean, all of that is the new world order and one world government. But it is coming. If AI eliminates 15 to 25% of current job categories over the next decade, you're going to see sudden income collapse in certain sectors, huge geographic economic deserts, and, most importantly, political radicalization.

Okay, there is a piece here. There are only two copies of this book. One that I have is in the National Archives, because it was all handwritten, the charts all made by hand, and it was made for President Roosevelt in the run-up to World War II. I think it was the election of 1936 or '38, and it's called "Radicalism, Revolution, or Recovery." And they're making the case to FDR that I'm about to make to you in just a second. Radicalism, revolution, or recovery. That's what really enhanced the New Deal. They chose incorrectly.
Will we choose the right thing?