Michael Keaton's Google AI Film + The Company Producing 5000 AI Podcasts a Week
By Joey /// VP Land
Summary
Topics Covered
- AI Short Films Legitimize Avatars
- LLM Scaling Hits Diminishing Returns
- AI Slop Floods Feeds with Synthetic Podcasts
- Meta Glasses Realize Google Glass Vision
Full Transcript
Welcome back to Denoised. We're back in the studio. We're back. Good to be here.
See you again, Addy. In person, not virtually. Yeah, it's been months. I know. Just
weeks. Yeah, it's been a few weeks, but it feels like months. Yeah. All right,
got a couple stories today. So, yeah, let's get into it.
First off, I got an important question for you. Have you been following the Amazon Prime saga of The Summer I Turned Pretty? No.
Please enlighten me. Wow. This is, this is already interesting. Well, the grand finale was, was last week and now, you know, you got to figure out if you were team Jeremiah or team Conrad who got the girl at the end. Okay. Oh, okay.
No, no. So you've not been following along for the past like three months. I'm
not the demographic for that show. Let's just say that I'm the, uh, the husband demographic where my wife starts watching it and then it's on in the background and then like I get sucked into it too. And then now we're, yeah, now it's over. It is finished. Okay. Yeah. So which team are you or your wife? Yeah,
we were team Conrad and you know, why not to spoil it? Well, you gotta watch the show, but you know, you're just watching. You're just like, it's like a 12 hour version of the notebook. Oh shit. I mean, look, congrats to Amazon for finally having a show that is in the pop culture sphere, right? Like this is huge for them. Yeah. I mean, I guess, you know, that's actually a good point
too, because like usually the shows that have done really well on Amazon are like big strong dude with a gun, like Reacher, The Terminal List. Yeah. I mean, Fallout does well, but that is also an action. Fallout. Yeah. Yeah. That's a good point. It's probably
the first one that's like just regular contemporary drama. Okay. I got a show recommendation for you. Yeah. I still haven't seen Alien Earth. Alien Earth is so good. Yeah.
I hear great things. It keeps getting better. Got to watch Black Rabbit on Netflix.
I tried to watch it last night. I fell asleep. Dude, Jason Bateman kills it on that show. All right. I'll give it another shot. Okay. I was really out of it yesterday. Did you say you fell asleep? I fell asleep. All right. Maybe
don't watch it. All right. Let's talk about AI stuff. First up, we got another update from Google's AI program to embed themselves with Hollywood filmmakers. They have another short film. I don't know, I guess they produced it, co-produced it, but it's part of their AI on the Screen program. It's called Sweetwater, directed by and starring Michael Keaton, and written by his son, Sean Douglas. Yeah. And it also stars Kyra Sedgwick. Yes. This is a really interesting story. Yeah. What is interesting about it for you? What I find interesting is that... You know, this is like, it's
how Fallout used the LED volume, like in the LED volume. There was no like fakery per se. Like it was an artificial screen and it's meant to be that.
In this example, AI is written into it as AI. So it's literally used. His late mother's holographic AI version has the connection with Michael Keaton's, you know, current form. And, you know, instead of, like...
shot replacement and replacing VFX and all of those things that we're concerned with. It
just kind of plays into what AI will eventually be in the future, which is some sort of avatar for either late people or existing people. You're talking about like AI as a whole, like in our cultural impact. Yeah. Like, I mean, nobody really knows what AI will be in five to 10 years. I mean, yeah, I mean, I have thoughts like, I don't know if that's, you know. Except for Sam Altman.
He seems to know everything. I don't know if that's the mentally healthiest thing for us to progress to, to be able to, yeah, speak with past loved ones. That
aside. Yeah, that aside, I think the only way to recreate someone pretty convincingly is with AI, right? Like if you can have enough, like if you can train a model on the likeness of a person's personality, character traits, all that stuff, then an AI can replicate it to some extent. Yeah. I mean, that was literally a
Black Mirror episode. Right. Yeah. So they play that use of AI into this AI-powered film. Right, because he's speaking with, like, an AI holograph of his late mother in the film. It's part of the storyline of the film. How many times have I said AI in the past...? Sorry, folks. I'm glad it's not a drinking game. So, right, that's interesting. The other part is, like, obviously, this is made with their AI film collaboration program, AI on the Screen. The details of the actual production of the film are light right now. Basically, they said they made the film, they premiered it, it's going to do a festival run. And that was really... I didn't really get any details of: is this all generated? Is this something like... Ancestra? Ancestra. The film that was, like, part live action and then part, you know, AI baby, AI effects. And it was, like, a very nice blend of AI. It's like a compositing of an AI baby onto a live plate, which both of us were really impressed by. Like, it was well done. Yeah, it looked great. Yeah.
And the... But, you know, the bulk of it was live-action actors. You know, this one I'm curious about. Like, it does... You know, in this one picture of the film Sweetwater, like, I mean, they kind of look AI-y. Like, they look a little...
synthetic-y. Like, you can tell it's Michael Keaton, but... Uh, nah, no, you think it's real? No, that's just heavy, uh, color grading. Yeah. Okay. So
then that aside, I'm curious what that, what it is. Yeah. What they used. I
mean, I'm going to guess they used the Google tools like Veo 3, or some crazy custom version of Veo 3 that we don't have. They have Veo 4. Yes, exactly.
No, it's, it's really, um, I think a proven pathway for technology adoption in film and TV. When a new tech comes along, you generally don't want to throw a lot of money into it. So a short film format with a few-million-dollar budget, great. And then it also goes into the film festival circuit, which in and of itself, if it's successful, has a pretty good runway into getting the film acquired and then eventually made into a feature. Yeah, sort of. I mean, I think a few million is a crazy number. Did you say for a short film? Yeah. I mean,
with Michael Keaton, right? I mean, I still think that's a crazy number. I feel
like an acquisition of a short film is a crazy number. This is like, I mean, like it's a marketing thing. Like you get a legit actor and name involved.
He makes, you know, he's involved in AI, makes an AI short film. Yeah. It
just, you know, helps legitimize the tool set that they're trying to you know, get embedded more into more Hollywood productions. Yeah. When we say, you know, AI short film, what sort of budgets are you thinking typically would make it successful? I mean, I for this thing, I would still picture it to be in the traditional realm of like if you were making a short film, like with a decent budget. So 100,000
to 250,000? For like, let's say, 30 minutes. 30 minutes. Yeah, I mean, if it's 30, yeah, maybe a bit more. It also depends. Obviously, this depends: is it one location? Is it a set? Is it a complicated thing? You know, the idea of AI is supposed to be, like, well, it's supposed to reduce budget or make things more possible. So, you know, were they using it for exteriors, B-roll, VFX? Yeah.
I don't know, I feel like 100,000 is nothing, especially if you take physical production into consideration. Yeah, I mean, if it was like a two-day shoot, then maybe.
Oh, okay. That's small. I mean, I don't imagine. I don't know. The details are light on this. We don't know how long it is. We don't know what it's about besides the AI thing. Just my gut instinct, just having produced a short film.
I'm not talking about making a film. I'm not talking about his fees for being...
No, no, no. Of course. Yeah, I'm just talking pure production costs. Yeah. I think
a couple million easy. You think so? Oh, yeah. Oh, yeah. I mean, just to be comfortable. These are the two different worlds that I come from. These are the...
A couple million bucks? You could make a feature for that. Well, just renting an Arri Alexa for a week is like, what, $20,000 or something? You don't think they shot this on the new Nikon RED $2,200 camera? Then you're playing in a... you're not going to win TIFF or, you know, Cannes with an iPhone shoot.
I get what you're saying. OK, agree to disagree. Yeah, I mean, I still think a couple million would be crazy just for them to produce the film. Not, you
know, excluding the marketing and the promotion they're going to be doing, you know, for pushing this film. Yeah, but like we're saying, we're speculating, because I would love more details about what this actually entails. It's very cryptic. It is
just a Google blog post about the Q&A at their premiere screening in New York, which also I thought, I mean, I guess he lives in New York, but I was like, interesting that's New York and not L.A. The other Darren Aronofsky film was also New York based. True, that's true. I guess they're all there. Yeah, Google does have a giant office in Manhattan. Yeah, on the water, right? Yeah, yeah, yeah. Another
converted old industrial space like the one they have here. I'm sure the lunch in there is great. So you never leave. You know, but speaking of the budget, so we didn't get into details because I think we were out of town. But there
was that other OpenAI-produced Critterz feature. Yes, yes. And that budget is like, $30 million. So I have some notes on that movie. Okay. Maybe we can cover it in another episode, but it sounds like a ton of CG stuff went into it. I'm sure. I mean, to still have the $30 million budget, and it seems that people are like, I thought I was supposed to make this stuff cheap. With 30 million bucks? It was like, well, 30 million compared to 200 million. Yeah.
It is a lot cheaper, but yes, 30 million is still high. With the exception of Pixar, Disney, and DreamWorks, nobody's making 100-million-plus animated movies anymore. The average animated film, as of 2020, which is when I last budgeted one... a $15 million budget will get you a really nice animated film, mostly made in India. What time frame? Over what time? One and a half to two years. Right. And they're trying to do it by Cannes next year. Oh, okay. Sure.
Yeah. I mean, because I think, you know, Flow is a great example. And it's like, that was... 4 million, but over like four or five years. So
the more you squeeze the timeline, the more the cost expands, because you just hire more people. Yeah. Yeah. 30 still does feel high. Yeah, it does. And I'm curious where the... Yeah, like how much is just going to be traditional, like CG, build it out, and where the AI lift is going to come into play? You think
they made the whole movie with AI, and that was like 10 million. Then they
had to delete the whole thing, make it in CG. That was another 20 million.
Or just when they made the first version, all of the models have improved so much, it's just like, we just got to redo it all, because everything we had before, it doesn't work anymore. Yeah, something's off about it. So I do want to investigate a little bit more. Yeah, I'm curious about the numbers breakdown. But I
mean, it could also just be because they're trying to do this very fast. Yeah,
and also, OpenAI is generally not a Hollywood player, so they're probably spinning up new infrastructure to do, yeah, production management. Yeah. Yeah. And I'm curious, because, like, Sora has kind of not been in the conversation as much since all the other models have come out. So, like, are they trying to get back in the game? Or is it also, like, they're producing it-ish, and maybe they'll use Sora, but they'll probably just use whatever tools are... I saw a lot of DALL-E stuff, uh, in the articles. Like, it's, I guess, a new version of DALL-E being used? Is
it? I mean, I'm still wondering too, because also, depending how the articles were written or researched, if you have AI doing the research, it will bring up DALL-E. I've noticed, from trying to research OpenAI stuff... I mean, they haven't called a product DALL-E since, I don't think, the beginning of the year. It just shifted to ChatGPT image. And if you call up the APIs, or if you go into the other tools, it's ChatGPT image, not DALL-E, as a model name. Maybe it's internally still referred to that, but I feel like publicly it has not been called DALL-E for, like, six months. I have an OpenAI comment I like to make. Okay. I watched a YouTuber, like an AI-oriented YouTuber, explain it way more elegantly than I will. Essentially,
large transformer models like ChatGPT and Claude and all that stuff, they've hit the upper ceiling of how intelligent they can be. GPT-5 was, I think, trained on, like, four trillion parameters. And the quote-unquote intelligence gain was, like, the law of diminishing returns. Like, it was barely... Yeah,
it came out and I was like, okay, cool. Yeah, right. Nice. And so they've fundamentally hit a limit where they bet big on scale, and now that bet is not paying off. So internally, within the AGI/ASI community, there's a bit of a scramble. And it's not just at OpenAI. Of like, what to do next, or how to train. What to do next. Because, I mean, I also imagine, like, 4 trillion parameters? That's like the Earth. Like, what else do you throw at this stuff?
That's all of the internet. Yeah, what else do you throw at these things? Exactly.
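The diminishing-returns point being made here can be illustrated with a toy power-law scaling curve. The constants below are invented purely for illustration; they are not real numbers from any lab's training runs:

```python
# Toy power-law scaling curve to illustrate diminishing returns.
# The constants (a, alpha) are made up for this sketch only.

def toy_loss(compute: float, a: float = 1.0, alpha: float = 0.1) -> float:
    """Assumed scaling law: loss falls as a * compute^(-alpha)."""
    return a * compute ** -alpha

# Print the loss improvement for each successive 10x jump in compute.
prev = toy_loss(1)
for exponent in range(1, 5):
    loss = toy_loss(10 ** exponent)
    print(f"10^{exponent}x compute: loss {loss:.3f}, gain {prev - loss:.3f}")
    prev = loss
```

Under any curve of this shape, each additional 10x of compute buys a smaller improvement than the last, which is the "it was barely anything" feeling being described.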
So now there are some Chinese models that... actually, I'm not sure if they're Chinese, but they use a distributed system architecture. And it's like a few hundred million parameters per node. So imagine, like, a hierarchy of models working together. And that's supposedly more intelligent than GPT-5.
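The hierarchy-of-models idea mentioned here can be sketched as a tiny router-plus-specialists toy. Everything below is a hypothetical illustration (the "models" are plain stand-in functions and the routing rule is made up), not the architecture of any real system:

```python
# Toy sketch of a "hierarchy of small models": a cheap router picks which
# small specialist handles each query, instead of one giant model doing
# everything. The "models" here are stand-in functions, not real LLMs.

def router(query: str) -> str:
    """Stand-in for a small routing model that classifies the query."""
    math_words = ("sum", "plus", "+")
    if any(word in query.lower() for word in math_words):
        return "math"
    return "general"

SPECIALISTS = {
    "math": lambda q: f"math specialist handled: {q}",
    "general": lambda q: f"general specialist handled: {q}",
}

def answer(query: str) -> str:
    """Route the query to one small specialist model."""
    return SPECIALISTS[router(query)](query)

print(answer("what is 2 plus 2"))
print(answer("tell me about Sweetwater"))
```

The appeal of this shape is that each node can stay small (the "few hundred million per node" point), with intelligence emerging from the composition rather than from one monolithic parameter count.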
So it's not about scale. And if it's not about scale, then that puts NVIDIA's big AI bet at risk, right? Because they were betting that they would build city-sized infrastructure for not 4 trillion, but quadrillions of parameters, or whatever the next number is. Well,
I don't think that's slowing down, because didn't I just see something this morning that they announced NVIDIA is investing in OpenAI? Yeah, I saw a post that it's a 10-gigawatt commitment. Like, they're just going off power. "NVIDIA to invest $100 billion in OpenAI as AI data center competition intensifies," Reuters. Well, that
doesn't stop them. Yeah. And that's essentially the theory. They need to keep building bigger and bigger centers and more chips. They're not going to stop because I think the shareholder price and all of the valuation depends on it, right? They're just going to have to keep making more chips. Yeah, exactly. But at the same time, the whole scramble, remember when we were laughing when Zuckerberg's poaching backfired and
all the folks were just leaving? A lot of that had to do with the fact that Llama just couldn't hit any level of proper AGI metrics, no matter how hard they tried. Yeah, no matter who they brought in or how much they trained. And we're hitting this very upper limit in the research community that... I mean, it doesn't affect us day to day. I'm very happy with ChatGPT
the way it is now. It's totally useless for certain things, but the companies are betting so big and it's such a long timeline that if they don't show progress along that slope, then it just dismantles the entire business model. And that is a big, like if... Well, I mean, that was also the fear, you know, with when DeepSeek came out, like beginning of the year. Yeah. And it was like, oh, you
could do, you know, you could do ChatGPT for a fraction of the cost. You
don't need all these crazy data centers. Exactly. Didn't really seem to have an effect on how things were going. The entire Silicon Valley... You're saying, like, where it's like, oh, you need the big models first to distill them? Yeah, I think where it will affect us is: the entire Bay Area, Silicon Valley economics is entirely reliant on this AI thing working out and not being a bubble. And if that
bubble comes crashing, that's going to have a huge impact not just on the US economy, but the world economy as a whole. Yeah, I mean, look, I've heard about the bubble thing. I mean, and also the comparisons to, like, crypto and stuff. Sure, sure, sure. You know, I mean, this feels different. This feels like, oh, you can very tangibly see benefits that come out of this, and ways that things can change. Crypto was always sort of like, how does this make things better? Where is this, besides pumping up a coin that you made up? I
got a neighbor that's a crypto bro. Got a brand new car. There are some things I get about crypto that do seem overall positive. But there was a big lift. But AI, I mean, yeah, there are immediate benefits. But does
it support the amount of data centers and chips that are being built? I don't
know. I think for the near term, we don't have enough data centers and chips. I
think the next few years, absolutely, the demand is higher than supply. So NVIDIA will continue to see a rise in stock. But then it'll plateau eventually, or something will happen. Like, underneath the actual transformer architecture, there will have to be a new architecture. And I'm guessing that new one will be far more efficient, to where you won't even need trillions of parameters. And at that point, instead of using 100 megawatts of data center power, you need one. And then it'll significantly reduce... like, the supply-demand curve will completely flip on its head. The same way the film industry has, right? It's like, in the past, we couldn't give the audiences enough film.
Mm-hmm. And so there was such a giant economy around this limited number of films that were released. Fast forward to today, we have more entertainment than we know what to do with. So it's flipped. And now each piece of entertainment has less value attached to it. Yeah. Yeah, I agree. So I think the same will apply for AI models. Or you have to build a big model and it gets distilled or shrunk into a smaller, faster model. Yeah. Like, looking at it... I mean, because I'm thinking, like, even the, you know, the training of large language models, that's one thing, but also running the video generation stuff takes a lot of compute. And if there's also this... going with Luma's vision where personalized videos for
everyone, and that partnership they have with Humain in Dubai or Saudi Arabia or wherever. Yes. They're building big data centers, and their vision was like, yeah, everyone's just generating their own personal videos, videos everywhere. And it's like, sure, that'll take a lot of compute and a lot of power, until the models get distilled and shrunk and reduced. You can even see with Veo 3 Fast, where that got so much smaller that Google's like, just generate it. Like, unlimited Veo 3 Fast, which came from Veo 3, the full model. Yeah. Just stepping outside of our M&E bubble, though: we think image and video generation is a big chunk of AI usage. It's actually really not, right? Like, I would say it's probably a few percent of the overall data center utilization, if you look at finance and law and all the stuff that ChatGPT is doing really well, insurance and real estate. You think that's a bigger compute portion? Like,
90% of what NVIDIA is spinning up. That kind of stuff. For those industries. Yeah.
Also, what about not even just M&E-specific, but advertising, like image campaigns for e-commerce and... It's big, but it pales in comparison. Like, just take real estate. There are a million transactions happening today, just around the US, probably. And each one of those million transactions is probably going to have some type of ChatGPT API call. So you're looking at millions of calls just for that industry alone. Add finance, add legal, add whatever, software development, all that stuff. Those, I think, are the big buckets of usage. And if you don't see general intelligence there, those markets will not use
it. Yeah, I mean, I'm also thinking that's forward-looking. Like, I'm wondering what law firms and insurance agencies have actually adopted AI at the edge of the workflow yet. I think as an enterprise, maybe not too many, but at an individual level, like, you and I use it all day long, and I have friends who are lawyers, they use it all the time. Yeah, I get... yeah, I get what you're saying. Yeah, I just feel like one push of the video generation is going to be, like, equivalent to my text ChatGPT use for the day. Oh,
gotcha. Yeah. But overall, I can already say, we're building a lot more data centers.
We'll probably need them in the future. Eventually, we won't need them as much. And
then what happens? It's like when China builds those empty cities. Yeah. We're just going to... We can just convert the data centers into affordable housing. It'll be warm.
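The "millions of API calls per industry" point from a moment ago can be roughed out in a few lines. Every figure below is a hypothetical placeholder for illustration, not real usage data:

```python
# Back-of-envelope for API-call volume across non-M&E industries.
# All numbers are assumptions made up for this sketch, not real data.

industries = {
    "real_estate": 1_000_000,  # transactions/day (the episode's guess)
    "finance":     5_000_000,  # hypothetical
    "legal":       2_000_000,  # hypothetical
}
calls_per_transaction = 3      # hypothetical: e.g. summarize, draft, review
tokens_per_call = 2_000        # hypothetical average prompt + response

total_calls = sum(industries.values()) * calls_per_transaction
total_tokens = total_calls * tokens_per_call
print(f"{total_calls:,} calls/day, roughly {total_tokens / 1e9:.0f}B tokens/day")
```

Even with these placeholder numbers, a handful of text-heavy industries dwarf image and video generation in raw request count, which is the point being made about data center utilization.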
Yeah. Big tangent. But let's come back to... well, I mean, this sort of ties into what we're talking about. So this company, Inception Point AI, built by former execs of Wondery, the podcast company. They're launching this new company, and their plan is to basically just churn out AI podcasts at scale. 5,000 podcasts, 3,000 episodes a week, $1 cost per episode. That was the Hollywood Reporter's headline of this. You know, when we spun up our podcast, this one, the first thing I was thinking is, does the world need another podcast? No, the world needs 5,000 AI-generated podcasts.
So, all right. I mean, so the business plan to this is they're going to spin up a bunch of these podcasts, push them out, and... Here: Inception Point AI already has more than 5,000 shows across its Quiet Please podcast network and produces more than 3,000 episodes a week. Collectively, the network has seen 10 million downloads since September 2023. It takes about an hour to create an episode, from coming up with the idea to getting it out in the world. And I think the play here is bust out a bunch of quantity, and then get hits with dynamic ad insertion, and make some money that way. The fact that the... The entire storyline of the
podcast and all of the content is AI-generated. That just really... it gives me the ickies. Yeah, it's like, this is the AI slop thing that we're trying to battle against and dissuade fears of. Just like, oh, you could just produce the equivalent of podcast brain rot en masse. Absolutely. And this is the podcast equivalent of that Meta AI post with the artificial family. Yeah. That's posting about their
vacation. Oh yeah. That, or the chatbot, like, talk to the stepmom or whatever, like that weird stuff. I think it's gimmicky. I think it'll probably have a short-term little spike, but in the long run, I don't see any of this working. Yeah. Oh, here: the company's able to produce each episode for a dollar or less, blah, blah, blah. This generally means that if about 20 people listened
to that episode, the company made a profit on that episode. I see. You know
what? So, right. It's a volume play. Oh, I mean, all it takes is 20 people. For us legit podcasters, launching a podcast, putting it out in the world, and, like, building up the audience is a pain in the ass. It's one of the hardest things to grow. Thank you
for watching. Yeah, thank you for watching and subscribing, and the reviews. But it's a pain. Like, excluding YouTube, because they have a much better model, but like Apple Podcasts and Spotify. Yeah, a pain to grow and develop there. And then if the market just gets flooded with all these other podcasts, with these AI automation podcasts, it just, like, drowns everything out. I think it just makes the noise floor bigger, but the noise floor is still a floor. Yeah, it was like, how do you... like, okay, maybe you got a couple hits on this, but it's like, are you building a brand?
Are you building, are you building... It's a click-through, right there. Yeah, it's like, what is the longevity here? If you're just like, let's just keep churning out stuff en masse, what's the long-term play? Also, I think the hyperscalers like Azure or AWS are probably funding a lot of this, because it's just helping them spin up their virtual machines. Yeah, I see that. Here: the company produces different levels of podcasts. The lowest level involves weather reports for various geographic areas. I could get how that, like... I mean, it's like an SEO play. Or simple biographies, and higher levels involving subject-area podcasts hosted by one of about 50 AI personalities they've created, including food expert Claire Delish, gardener and nature expert Nigel Thistledown, and Ollie Bennett, who covers offbeat sports.
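The "$1 per episode, profitable at about 20 listeners" claim mentioned above can be sanity-checked with standard podcast CPM math. The CPM and ad-slot numbers below are assumptions chosen to match the reported figures, not anything the company has disclosed:

```python
# Break-even sketch for a $1-per-episode AI podcast.
# Only the ~$1 cost and the "about 20 listeners to profit" claim come
# from the article; the CPM and ad-slot counts are assumptions.

cost_per_episode = 1.00   # reported production cost
cpm = 25.00               # assumed revenue per 1,000 ad impressions
ad_slots = 2              # assumed dynamically inserted ads per episode

revenue_per_listen = cpm / 1000 * ad_slots      # $0.05 per listen
break_even_listens = cost_per_episode / revenue_per_listen
print(f"break-even at about {break_even_listens:.0f} listens")
```

Under those assumed ad rates, the math lands right at the ~20-listener break-even the article describes, which is why a pure volume play can work even with tiny audiences per show.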
Did they prompt ChatGPT to have witty names? ChatGPT, like... why is there a whole company? There should just be one person just, like, spinning up AI agents to produce this stuff. I think their overhead is way too much. I want the conspiracy theorist guy, like Mr. Tinfoil. Yeah, Mr. Tinfoil. Yeah. So yeah,
like you said, this kind of gives me the ick. It's just, like, the AI model of this that's not the best. It also reminds me, you ever heard of dead internet theory? No. It was like, basically, after forums and other boards and everything started growing, the internet turned to the point where most of the interactions and comments and stuff on the internet is bots, bots commenting and bots interacting. And so the actual comment ecosystem on the internet is not real people, and it's just sort of this dead internet. Most of it is not. That
is statistically true. This also feeds into the dead internet theory where just the bulk of action happening on the internet is not like people to people, it's people to robots or just robots to robots. Like going back and forth with each other. Which
is not good because future AI models will train on all this data. I always
wondered that from the start. Like, WTF, man. Yeah, that was always my big question when they started training the LLMs and they were pulling stuff from the internet.
And it's like, okay, you pull stuff from the internet, and then you start writing blog posts. It's fake to begin with. And then you start putting the blog posts that were generated up there. And now we're just going to keep training on that. It's like,
oh, our knowledge of original stuff is just going to stop at, like, 2020.
And then everything after that, the new training, is going to be incorporating synthetic data. For sure. I mean, the lens that I... I see this every day. I work with image models, I work with AI. The minute you introduce synthetic AI images into a training set, it really messes things up. Yeah.
If you're not careful about it, you're throwing off the whole training. Yeah, I see the Reddit comments and stuff where people are like, I got a data set here.
It's like all synthetic stuff. And they're like, get out of here. That's crap. Yeah.
No, seriously. It's going to be crap. And you could tell which AI models use a lot of synthetic data. And the output is quite synthetic because of it. So
I would imagine the same thing would apply for LLMs and stuff if it's like, Reading through the comments of all the bots, it's going to sound like a bot.
Yeah. Oh, here we go. Okay, they are using agents here: the episodes themselves are built using AI, powered by 184 custom AI agents who work with several large language models, including OpenAI, Perplexity, Claude, Gemini, and more, to build out the content. That's what
I would have used. I mean, also, like, Google. Why not 185? 184 is it?
184. Yeah, it's very specific. Yeah, this also feels like something that Google would just, like, blow up in a second. Like, if they just turned NotebookLM into a product or a service, which I'm sure eventually will happen. They'll just turn it into an API, and then Riverside can grab that API and turn it into a whole platform. I mean, I think Google would just do it all in their own full stack. Yes. I mean, their NotebookLM podcast is really good for learning information.
Yeah. All right, last one. Meta Ray-Ban glasses. So the good and the bad. I
mean, overall, it sounds like pretty good. Overall, it sounds like they got everybody scared. Like, everybody's kind of, um... this is the internet reaction that I'm seeing: wow, those are great. Damn, Apple missed the boat on that one. I mean, or they're just like... I mean, well, like we said before, they're never first. They're just usually the best. So, like, this is... they've had the Meta Ray-Bans for a bit, you know, with the Ray-Bans and the camera inside. They're
really good. But these are the first ones that have a screen built in that you can see, uh, you can see a screen inside the glasses. I believe
it's a tiny projector that's projecting on the right eye only. OK, just one eye.
Yeah. Yeah, it's small. It's like 800 by 800 pixels, I think. Yeah. It reminds
me more of the Snap Spectacles that I tested at AWE that are way chunkier in the build. I think they're supposed to be a lot smaller next year. But similar where it's like a very kind of narrow-ish field of view. And
you can kind of see alerts and information and stuff. It's chunkier, but it's also super light. I think it's like, I mean, these Ray-Bans look good. I mean, it's got like a little chunkiness for the battery. The thing is all battery, like the bridges or whatever. I think it's only 130 grams or something like that, whereas like a real Ray-Ban Wayfarer is like 70 grams. Oh, okay. So it's
like just twice as heavy, but it's got compute right on it and batteries and all that stuff. Yeah. What did, I think it was The Verge or someone, they said that this is the first thing that felt like what Google Glass promised. Everybody's crapping on Google. It was like what they promised like 10, 15 years ago. This is like the first thing that actually delivers on that vision. Yeah. So yeah, I mean, what do you think about uses and stuff for this? I think a daily personal assistant thing, just notifications and managing calendars.
If you're walking around the street, you know, directions, like all those obvious things. Great.
Most people don't think of glasses as a listening device and a personal one.
The speakers are like right above your ear and they're really good. So I wore the first gen Meta Ray-Bans and then I was just listening to some music and then I asked, like the guy was literally this far away. I was like, can you hear any of it? Like, no. Oh, wow. Oh, that's cool. So like they could replace your AirPods. You know, whatever you wear your AirPods for, you could just
wear a pair of glasses and now you have ear and vision. It's got the built-in Meta AI chatbot and can show you pictures and text you answers to questions.
So it also feels like it has a bit of that smart AI element that they, I think you could do translations too, like that they showed with the AirPods.
That would be nice. But, you know, this one you can see things and sort of interact with the world. Yeah, I mean, it feels like the best first step for actually useful AR glasses. I have an interesting use case. You might want to run this by Lightcraft. OK. AI Glass Vcam. Oh, so you kind of see like a... Yeah, so like you're crafting the shot. Instead of holding a device, you're just kind of doing this with your head. Doing a chicken head. Like a director's viewfinder thing. Yeah, you're like doing this, right? With your glasses. You look like Spooling. I thought you were saying, oh, you could see the camera feed or something in your glasses. Sure. That could be.
Because I remember Strata, they did that test when the Apple Vision Pro came out.
And then they rigged it up so that the AC could have the viewfinder in their Apple Vision Pro and pull focus, like using that, the Apple Vision Pro. That's
super useful, especially if it's a large set and the guy's like way the hell out there. I mean, I don't think you would want to pull focus on an 800 by 800 pixel screen. No, you pinch to zoom, man. You do one of these. The other thing, I don't think it can, it doesn't have like spatial mapping. I don't think it can. Oh, no, no, right. Because that
was the other thing that the Snap Spectacles could do. Yeah. They could identify surfaces and stuff. Yeah, it's doing SLAM tracking, anchoring animation on a physical surface. I don't
think these can do that yet. I'm sure that's obviously planned. The other thing they announced, too, which didn't really get as much coverage, was they also have a partnership with Oakley, and they released smart Oakley sports glasses. Oh, I didn't know that. Okay.
Yeah. A display free... Oh, wait, I thought they had displays in them. Never mind.
I thought these were... The other thing I want to cover is... I thought these were like a HUD thing. So, they have this like smart wristband thing that is essentially reading your muscle contractions and then figuring out... Yeah. Control. Dude, that thing has been in research land for like 20 years. I remember in like the early 2010s, we were trying to... use a lot of that for motion capture because we could
just eliminate finger gloves and all that and just like capture. And it was not quite there yet. That's also why like when Apple finally does release something like this, it's probably just going to like blow everything else out because they're already, they have like the groundwork. Cause like Apple. Spoken like a true fan boy. They're going to blow everything out, man. Are you sure, Joey? I mean, I'm not 100% sure, but
I mean, they have a pretty good shot, because they have the Apple Watch. And that Apple Watch already can track pinch gesture control. They'll have the glasses or whatever. They'll have the, like, phone with the crazy processors built in connected to it. Cause it's also like, a lot of this stuff only works with your phone nearby. Cause the phone's doing the processing on the Meta glasses. So
look, Apple has the groundwork and they have the... What would make you want to get one of these right now? Like what would make Joey go spend $800 on this? Yeah, that's a good question. I think it would have to have some sort of like AR and like spatial mapping or something. Like it would have to be something, like I don't need a screen in the corner of my face to see like a text message. Like I don't care.
Yeah, like so I don't really care enough about that. It would need to have like something where it could, you know, plop displays or graphics on the wall or on the table, or from like walking around a city, like actually show arrows and like map directions and stuff. Would you use the cameras at all? Maybe.
I mean, the camera thing has been kind of appealing for just like, oh, it'd be cool to grab a quick shot of this or something of like a POV kind of shot. I can't think of any like professional use. Professional use, yeah. I
mean, I can't, like, aside from virtual location scouting or something like that. I'd imagine a police officer would have great use for this. Like instead of body cam, in addition to the body cam, they have this camera. And then whatever license plate information. Yeah,
they could pull up information. Like the stuff they would go back to their car for the computer on. But you need heavy AI integration and all that included with it. Yeah, and you would need like fast processing and data connections and stuff like that. And a battery that lasts all day. Or if you're working at like an Amazon warehouse or any kind of big industrial facility, then you have relevant information pulled up, with vision and everything. I don't know. Yeah, it's also a little tough for me too, is I don't wear glasses. So it would be, beyond just the tech decision, a personal style slash, you know, changing my day-to-day wear to wear glasses. Yeah. So yeah, that's a good question. That's
a good question. And they're, these are heavy. Yeah. The pros would have to significantly outweigh the nuisance of wearing glasses every day. Yeah. I mean, the Apple Vision Pro, I've only used it a few times, but that's such a different... like, that's something where you could theoretically be traveling or something, or bring up a bunch of screens, and change the way you work a little bit. Yeah. Smart glasses are a bit of a different story. Watches, I've been wearing smart watches for, I don't know, a long time now, since the Gen 1 Apple Watch, and now I can't live without them. Two main reasons. One is all of the watches that I used to have would drift over time. So even a minute off is too much.
These things are obviously all synced. Everything's synced. Yeah. The second is I kept missing meetings because I never would get the calendar invites and stuff. And now I get the meeting invites and all the numbers. So now I feel like I look like a professional person, but this thing's helping me do that. So I think if those two things were put on the glasses, then I could maybe not wear the watch.
Yeah. I mean, the fitness thing, I could see that being useful for glasses. Yeah. So I could see stuff without having to pull my phone out, or see, like, metrics and stuff. Yeah. That would be useful. Okay. It's such a big lift where it feels like a very dedicated, like... full-time athlete kind of purchase. It's
like a nice to have. I mean, the tech companies want us to wear the watch, the AirPods, the glasses. And then the last thing is they jack us in with Neuralink. And we're just toast. Just like, oh, awesome. Maybe that's the thing the Matrix got wrong: the robots forced everyone to plug into a battery. In reality, we're going to voluntarily plug ourselves in. 100%. That world is way better. You're just going to have more fun. Yeah, it's
going to be more like the Ready Player One Oasis. Exactly. We're voluntarily going to go there. Yep. All right. So yeah, that's the roundup for today. All right. We'll be back next week, with the AI roundup later this week. I have
some shout outs. So Spotify has been getting some cool comments. Those of you who are listening or viewing on Spotify, thank you for doing that. Paul Trapani, thank you for your comment. OlaLee92, thank you for your comment. And again, if you're just new to this, You know that little rating? If you give us a little five-star rating, that goes such a long way. Please consider doing that. And yeah, also shout out
to Robert Flowers TV for always being a good commenter on YouTube. And so, yeah, thanks for the interaction. Thank you. He commented about ComfyCloud. Oh, yeah. I'm
excited about seeing that coming too. Joey and I were just talking about having more ComfyUI episodes. We were. Yeah. So yeah, if you've got anything specific you want to see, we will try to break it down. Links for everything we're talking about at denoisedpodcast.com. Thanks for watching. We'll catch you at a roundup this Friday.