Developer Preview: Introducing Meta Wearables Device Access Toolkit
By Meta Developers
Summary
Topics Covered
- Meta's SDK unlocks AI glasses for third-party developers
- User control is paramount for AI glasses integrations
- AI glasses can revolutionize sports experiences
- Wearables expand independence for the visually impaired
- AI glasses lower the barrier for live content creation
Full Transcript
[Music]
Hey Meta, take a video.
>> All right, guys. Before I get started here, I have a question for the room. From all of you, by a show of hands: how many of you already own a pair of Ray-Ban Metas? Seems like most of the room, I think I can safely say. All right. Now, again, from the whole room, not just the RBM owners: how many of you have ever thought it would be really great if you could hack on the sensors that are in these AI glasses and build your own things with them, rather than just using what they come with? Okay, pretty good number again. Okay, this should be a lot of fun. I'm here to talk a little bit about the first step towards a wearables developer platform. It's a new SDK that we announced this morning in Boz's keynote, which we refer to as the Meta Wearables Device Access Toolkit. It is a way for you to build your vision and connect it with our AI glasses.
So, I'm Tom Carlo. I'm a product manager with the wearables developer platform team in London, and I'm going to take you through some of the features. Then some of my colleagues are going to come up later in the talk to discuss some of the integrations that have been built already.
All right. So, I have the glasses up here on my face, and a lot of you own them, so you're probably familiar with this, but I'm going to run through the sensors, since they're so central to the toolkit in general.
First and foremost, these glasses have a great high-definition camera that captures right from your point of view. They have advanced spatial microphones that can pick up your voice, as well as any voices and noises in your surroundings, and distinguish between the two. And there are great open-ear speakers that deliver great audio quality while keeping you connected to your environment.
Finally, the glasses offer a truly hands-free experience. You can control them using your voice, or you can use a few tap gestures, but you're able to stay in the moment and keep doing what you're doing. Already, I think so many of us here in this room and around the world are using these glasses to capture the moments of our lives and share them with our friends and families. But up until now, the apps that were able to access these sensors were limited to only a few that had specific integrations that came with the glasses. That's going to start changing.
What we're adding now is the Device Access Toolkit, which allows apps to access the sensors on the hardware. Before we get into the details of the SDK, we have a brief concept video that'll show you our vision of what one of these integrations looks like for a consumer. So, let me play the video.
>> Starting plant check-in. Glasses camera is active.
>> Hey, Gardens. How do you think the snake plant's doing?
>> It's looking great. Give the soil a feel, and if it's dry, give it some water.
>> All good.
[Music]
>> Should I fertilize these soon?
>> No, you won't have to until fall.
>> Cool. Are these ready to be picked yet?
>> The color looks good. If it has a slight give when pressed, go for it.
>> Great. Take a photo and post it to Tomato Tribe.
>> Got it. Posting to Tomato Tribe.
[Music]
Okay, so that's only a concept video. If somebody can build it, it would save a lot of plants in my backyard. I know there are also some other concept videos posted online this morning from some of our partners, showing a little bit more of what can be built using this toolkit.
But how exactly does all this work?
All right. So, at its core, the toolkit allows a mobile app on Android or iOS to access the camera sensor in the glasses. Eventually, it'll extend to the other sensors that I talked about, such as the microphones. That means a mobile app can offer an integration that's secure and user-led. And when I say user-led, how does that work? Users will manage the connection to their glasses via the Meta AI app, in a similar way to how they currently manage existing connections like Audible or Spotify. That allows users to be in control. When an integration wants to access the camera or another sensitive sensor, there is a permissions flow, and the user gives permission for the integration to get access, so users remain in control. That kind of user control ensures that users feel safe trying out the integrations that all of you build.
On the developer side, we're also adding a developer mode to the Meta AI app. This developer mode allows you to build, test, and iterate locally for rapid iteration. We understand, as a team and as a company, that iteration is really critical to building any great software product. So we've tried to ensure that you're able to do that locally and quickly as you build these integrations.
All right. So, I think the question folks probably have this morning is: okay, you've announced this; when are folks going to start being able to access it and work with it? This morning we posted videos online showing how to use the SDK and the basics of building an integration and testing it. As you might know, there's also a demo stand outside with folks showing off how an integration works, and you can talk to various members of our team about that. Over the next few weeks, we'll start to onboard the first developers to the platform. There is also an interest form up and available, so you can sign up to indicate interest, and that will continue to ramp over the next few months. Later this fall, we will start the first bit of limited publishing, enabling some of the integrations that align with early use cases to publish their integration in their app out to consumers around the world.
Our goal is that by next year this opens up wider and wider, enabling both open access and open publishing. We're taking this kind of phased approach to help us learn what resonates with our Ray-Ban Meta consumers, what developers need from us, and the best way to grow a scaled ecosystem around this platform. As the platform continues to mature, we'll continue to open up publishing access.
But really, this access to the camera sensor, what we're talking about this week and shipping in a few weeks, is just the beginning of this program. We think of the Device Access Toolkit as opening the door to a general wearables developer ecosystem. We're committed to opening up the hardware over time. That means eventual access to the mics, and to accelerometer, inclinometer, and IMU data, so you can get gesture recognition and contextual cues. And, interestingly, based on some of the announcements yesterday, we'll also be looking at how we can roll out support for on-glass displays and notifications going forward, for all of those great display glasses that you saw announced yesterday. Eventually, you'll be able to distribute your integrations to every AI glasses user around the world.
So, the Device Access Toolkit really isn't just an SDK. In our view, it's the start of a new ecosystem of development for AI glasses and wearables. At the core, it's about secure, direct access to the sensors on the glasses. It's about cross-platform support: we'll be supporting both iOS and Android right from the start, so you can build for all of your mobile users wherever they are. There are simple and straightforward permission flows to help users feel comfortable using new integrations from different developers. And on the developer side, there's dev mode and other developer-centric tools to enable faster iteration and exploration.
I think I speak for all of my team when I say we're really excited to see what everyone builds with this. We expect there will be a lot of video usage, and different kinds of contextual services and utilities, but we also expect a whole bunch of things will be built that we never anticipated. In a lot of ways, that's what's great about building developer platforms. We're excited. We really believe this is the time for all of us to come together and build the future of AI glasses as a community. And now I'm going to hand off to my colleague Chris Barber, who's going to be coming on stage with some of the partners we've been working with over the past few months to build out the Wearables Device Access Toolkit.
[Applause]
>> Good afternoon. Wow, this is incredible. We knew there was some interest, but wow, this is fantastic. I hope everyone has enjoyed the announcements and the demos over the last couple of days. As you can tell, we're incredibly excited about the opportunity that glasses will bring to the future of computing, and feeling all of your enthusiasm has been really infectious. My name is Chris Barber. I lead the wearables content team here at Meta, and we work with partners and developers to build experiences for our platform. For us, this is an incredibly exciting and momentous time, given all of the tools that Tom talked about a moment ago. I'm honored to be joined today in conversation by a few friends whose teams have been working with the technology in early access.
So, I'm here with Eddie Lui, CEO of 18Birdies; Louis-Philippe May, vice president of innovation and product for HumanWare; and Ashray Urs, founder of Streamlabs at Logitech. Thank you all for being here.
>> Thanks. Thank you.
>> Your teams have had early access to the toolkit that Tom talked about, and the work that you've been doing has been foundational and inspirational for us as we've evolved the technology. The insights have been invaluable, and we thank you. I'm excited to have you share a little bit about the work that you all are doing and the kinds of ideas you're exploring.
>> Right, Eddie, let's start with you. 18Birdies has really transformed the game of golf through technology. It's helped people use technology to better engage with the sport and with their community. We think AI glasses have the power and the potential to completely transform sport. How are you thinking about wearable technology as it relates to the kinds of experiences golfers want to have on the course?
>> Yeah. Well, thank you, Chris. Very excited to be on this panel. Over the last 10-plus years at 18Birdies, our mission and focus has been on one thing: to leverage the best technology available to us to build products that help golfers improve and enjoy the game more. So for me, this is a super exciting time. The potential of how we can leverage AI and glasses in golf is truly incredible. And honestly, I cannot think of a better fit than golf to showcase how AI glasses can impact the fundamental experience of a sport.
As I said, at 18Birdies, our mission is to leverage technology to help golfers improve and enjoy the game more. What we have learned is that most golfers, especially the younger and newer golfers, want the same thing: they are very open to trying and embracing technology when it comes to the game. So why do I think AI glasses are the perfect fit for golf? To start, we can look at how millions of golfers on the 18Birdies platform today are using technology in their game.
More than anything else, golfers are using 18Birdies to navigate the golf courses they play. How far am I from the tree, or from the middle of the green? With the wind blowing in my face, standing on an uphill slope, what club should I use for this shot? These are the types of questions every golfer needs to think about before each shot. With AI glasses, golfers can now ask these questions as they're preparing for their shot, as if they're speaking with the best caddy in the world, and they can expect a consistently reliable response right away. And pretty soon, I think they will be able to get this information right on the glasses, showing up just at the right time without even having to ask. So I think that's really exciting.
Besides navigation, golfers are also using 18Birdies to track their rounds. How many pars do I usually get when I play a round? How far do I hit my driver? How accurate am I with my driver? How far am I hitting a club from the short grass versus the long grass? Golfers want to know these things because it gives them direction on how to improve, and golf is all about improvement. Everybody's trying to improve, and with AI glasses, whether by speaking with the AI or seeing with the camera, all this information can now be captured hands-free, without any additional effort or distraction for the golfers. So now golfers can get that while staying in the moment and focused on their game.
There is one particular feature I would love to build, and would probably use a lot. I tend to hit a big slice with my driver and lose the ball way to the right, and I think that's probably the number one problem most golfers are struggling with. That's why, if you go to a golf course, you will see golfers spending probably half the time just looking for their balls everywhere. With AI glasses, I think they will still hit that slice here and there, but now they will know exactly where to go to find their golf balls. And I think that will be a big quality-of-life win for golfers.
In addition to navigation and tracking, golfers are also using 18Birdies to connect and share their golf with their friends. With golf courses very often built around beautiful natural settings, and with golfers usually playing with their friends or their families, golfers actually have a lot of motivation to capture their golf and the fun memories with their friends by taking a lot of pictures and videos. Now, with AI glasses, they can easily capture anything their eyes can see. It could be a friend who's new to golf making their first birdie. It could be my brother in agony after missing a short putt to win a match. So with AI glasses, I think we'll see a lot more of these fun and memorable golf moments being captured and shared. Absolutely. And I think that will really enrich the golfing experience for everybody.
>> It's hard to think of an activity where hands-free access to technology and insights could be more valuable than golf. So it's really exciting to have had a chance to see the work your team is doing to explore this space.
>> Yeah, very exciting.
>> Louis-Philippe.
>> Hey.
>> Hi. You know, one of the areas that we are truly inspired by is the excitement for AI wearables and AI glasses within the blind and low-vision community. We see enormous potential for these devices to expand independence and access; you and I have talked quite a bit about that. What are some of the ways you all are thinking about how wearable technology can bring real everyday value to people in this community?
>> Well, in a lot of ways it's a game changer. Of course, we've said that many times over the last few hours, but for the assistive technology community it truly is, because at HumanWare, we've been developing solutions for people who have visual impairments for years. Of course, a lot of these products are for people who need to read braille or have to magnify some text, things like that. But all these devices are so different from what people are used to seeing, so there's always this kind of barrier between blind people, the tools they use, and the mainstream population in general. The AI glasses are opening up a lot of possibilities by removing these barriers and providing solutions for people who have visual impairments in a form factor that is elegant, light, easy to wear, and unobtrusive, and that's making a huge difference in everyday life. There are some use cases we can easily imagine, and there are tons of others; we could spend an hour just talking about use cases.
>> We have a little bit less than that, but...
>> Yeah, well, okay, I'll do the short version.
For example, take the simplest situation: a blind person comes into a big space that he or she doesn't know and wants to find the restroom. A blind person will typically either explore by himself or herself, or ask some people for guidance. That's all right, but you want to have independence. It's in human nature to be independent and self-sufficient, because it brings dignity, it brings inclusion, and that's so important. So all the tools we're building need to solve that problem: to give independence to people regardless of their handicap or culture or language. The AI glasses are a big step in that direction.
And also, honestly, I need to be very transparent about this: traditionally, the products we make are expensive. They are expensive products because they are very niche, made in very low volume, and of course the cost of electronics at low volume is high. But the price point of the Ray-Ban Metas and the other glasses that were announced yesterday is really reasonable, relatively speaking of course, and that opens up the possibility of giving tools to more people who would probably not have the means to purchase more traditional solutions for themselves. So of course, that's why I'm saying it's a game changer.
>> Thank you so much. You know, we talk a lot about AI glasses being the ideal form factor for AI because they can see what you see and hear what you hear. As I've gotten to know the team at HumanWare, I've been inspired by the build on that, which is that they also enable the AI to see what you can't, and there's real opportunity in that. Thank you so much again. Ashray.
>> Hi, how are you?
>> Live streaming has transformed the way that creators make content and engage with their audiences, bringing them more directly into their world. As wearables, and specifically AI glasses, evolve, they'll open the door to even more immersive and spontaneous kinds of content, where the boundary between your real life and the content you're making continues to blur. How is Streamlabs thinking about this opportunity for creators and wearables, and where could this all go for your creators?
>> Yeah, thank you, Chris. At Streamlabs, our mission has always been to enable creators to do more. We've always been focused on making it easier and easier to go live and to share your content, and AI glasses are a huge step toward making that even more accessible and leveling up live streams, making it possible to have better and better production quality. A few of the things I'm specifically excited about: if you think about what it took to live stream just a decade ago, the big streamers that came up in that first generation were deeply technical, right? It wasn't an easy thing to do. Over time, we've been hard at work building tools to make that easier and easier. And now with AI glasses, it's possible, with just a phone and a pair of awesome-looking Ray-Bans, to create super-high-production-quality live streams where, as you're just walking around, your viewers are seeing alerts pop up as they're tipping or following or subscribing to you. You have overlays that are personalized, maybe in line with your branding, maybe in line with the vibe you want to go for that day. You have chat popping up. And you're able to multistream, simultaneously broadcasting to more than one platform at a time. The reality that you're now able to do this with just a pair of glasses and a phone... I'm so excited to see what this is going to do for the accessibility of live streaming. If you look at younger generations, and how many want to live stream and create content versus how many are actually doing it, there's a big gap there. I think as this makes its way out there and we ship this integration, I'm just so excited to see the kinds of content that new, emerging creators kick off and start their journey with. The other key piece, I think, is that for a lot of existing content creators, there's so much excitement around this concept because of what's enabled by the first-person view you can start adding to things. Just as our team has been testing, we've had so much fun with, you know, Lego. Think about cooking streams: when you're dealing with steam and things like that, it's not easy, right? What better place to have a camera positioned to navigate things? One thing you can always count on is that creators will be creative. So we are just so excited to see the kinds of content that emerge from all of this and how streamers make use of this technology.
>> You know, we've talked a little bit about the creator audience and what they need, the kinds of tools that enable them to do what they do. I shared with you that I'm particularly excited because many creators are at the leading edge of culture, and one of the things about the portfolio of glasses we have, of course in partnership with our partners at EssilorLuxottica, is that there's a range of styles, and we saw some of those announced yesterday. So I'm really excited about that aspect of how creators are going to adopt the technology as well, to be even cooler for everyone. If we succeed at bringing superintelligence and AI to everyone through glasses, it'll inevitably reshape how we interact with the world and with each other, which is central to how we think about this opportunity. From your vantage point, what do you hope this shift will feel like for people, and what role do you see your work playing in that?
>> Well, I think the main reason people wear glasses is likely to change from primarily a medical solution designed to correct our vision, to help us see better, to where possibly everybody is wearing AI glasses as a preference or a necessary tool, just like the smartphone today. With AI glasses, and with the ability to easily capture, analyze, and share whatever our eyes can see, I think we can see very significant behavior change that really changes the way we are connected to the world around us. For us at 18Birdies, we're very excited to have already started, through the partnership with Meta Ray-Ban, exploring and building really promising features for the glasses for golf, and we're super committed and really excited about making AI glasses a must-have piece of gear for all golfers. That will be pretty awesome for golf.
>> That's great.
>> Guys, any thoughts?
>> Yeah, of course. Seeing the keynotes here yesterday and today kind of gave me a new perspective, so I'll go a little off script. Chris, don't worry.
>> There are no scripts, you know that.
>> You know, our parent company at HumanWare is EssilorLuxottica, and they cater to 97% of the population with sunglasses and ophthalmic lenses, like you were mentioning. So having a technology that is accessible to people who just wear sunglasses but benefit from the AI, that also integrates well with regular prescription lenses for people who need them, like me, and works for people who are blind or have a visual handicap of some sort: it reaches everybody. How it's going to be done, I think, is in big part about being accessible, intuitive, and easy to use, and it needs to look good as well. Obviously, everybody wants to look their best, right? So having all these boxes checked, I think, is the success of the future. And coming back to my buzzword, the game changer: it really is going to do that, because it has the potential of replacing the smartphone. It's checking all the boxes that no other product could check so far. So that's the future, obviously.
>> Any thoughts?
>> Yeah, from my perspective, really thinking about this from the creator angle: first, I think it's critical that it maintains that feeling of being natural and seamless, and I think we're there. You feel it as soon as you try it out and start using it. What I'm excited to see next is really leveraging superintelligence and solving for safety in a really meaningful way. I think this is a huge opportunity, and I'm so excited about the work we're starting to do on this. As you can imagine, we're not far from a point of being able to blur sensitive information, street names, things like that, as people are streaming. Think about what this could mean from a safety standpoint, especially as we see the IRL category on Twitch growing, almost doubling year over year in the number of creators making content. As this space grows, and there's so much excitement around this kind of content, I see this being a massive enabler on the safety front, and I couldn't be more excited about that.
>> That's fantastic.
Well, Ashray and the team at Streamlabs and Logitech, Louis-Philippe and the team at HumanWare, and Eddie and the team at 18Birdies: we thank you very, very much for being early contributors to driving our roadmap for a wearables platform. Thank you. We're going to take a couple of questions, if folks have some questions for our guests.
>> Yeah, Keith, go ahead.
>> Question for you. On the golf course, at any given moment there are a hundred points of data that could be helpful to you, that you could take in, that you might want to prompt for or initiate. How do you decide what is most relevant?
>> You're right, there's a lot of data out there. To make choices about what's important, I think it really comes down to the individual: what is really important for this person, for this golfer? A lot of times it comes down to deciding what club to use for this shot. That's constantly what golfers think about, whether it's purely about distance, or about the surface the ball is landing on. Are you on an upslope? Is it deep grass? So I think it's about having AI that can analyze, capture, and just give you what you need the most, instead of us putting together a fixed list of what's important. I would go the AI route for sure.
>> Please.
>> Also for you.
>> Lots of golfers here, huh?
>> From a sports perspective especially, I was curious about what you mentioned from your tests so far. Tracking something so small: was that hopeful, or genuinely working?
>> I think it's both, because there have been a lot of attempts out there with different technologies, whether it's using a phone to trace the trajectory of a ball. So I think applying that to the glasses would definitely make sense, and I think everybody could use it if they're playing golf. So I would be betting on that.
>> We're going to use the microphone so the folks online can hear.
>> Sure.
>> Thanks so much for a great panel. I have a question for any third-party indie dev that wants to develop, say, another sports app. Are you able to have custom overlays for a data visualization dashboard, say for a game of golf or any other sport? Data visualization within XR has been my focus for many years.
>> Awesome. Well, we'd love to hear more. Do you want to take that one?
>> Yeah. So, this is something that we're definitely looking at. I assume you're referring specifically to the new display glasses. We are trying to figure out the right ways for developers to be able to provide real-time feedback to users, depending on what their service or experience is. We recognize that that's going to be one of the things folks really, really want.
>> Hi, guys. Great platform, love it, and great examples. My question is this: it seems really well suited for consumer use cases. I'm curious if you're also pursuing a parallel path for more enterprise-specific use cases.
>> You know, we're opening a platform for developer experimentation and exploration right now, and we're curious to see all the kinds of use cases developers want to build. I think that's what I'm able to say on that topic now. And truly, in the spirit of wanting the tools in your hands: we want you to tell us the kinds of things you want to build with the platform, and that'll help us tailor the platform in those directions.
>> Sure.
>> So, this is more geared towards the Meta team. Say a developer is building a tool, and they allow consumers to enable access to their data: their camera feed, their microphone feed. I assume the standard is accessing that in real time and using it to give recommendations or features. Is there also a world where we're able to digest that data, take all of the camera feed or microphone inputs, and then use that for other things, if the user allows it?
>> Take that one.
>> Yeah. So, it's difficult to answer that question without being more specific, I guess, is what I would say. We expect that the camera will be used not just for video capture but for things like context recognition, computer vision, guidance, navigation, all kinds of different things. There will be guidelines for the usage of the sensors and the data, where we try to make sure the usage is reasonable and respectful and so forth. But it's hard to say without looking at the very specific usage; I can't really comment more than that.
>> Can we pass up a mic?
>> Where are we with the mics?
>> Sure.
>> All right. Down front.
>> Thank you so much for this. We've been waiting for this for so long. With the Meta Wearables Device Access Toolkit, just for some clarification: it clearly supports non-display glasses, which is great. Does it also support display glasses at this time, or are you going sequentially?
>> Yeah, I'm happy to speak to that.
>> Yes, please. Thank you.
>> It will support display glasses in the exact same manner that it supports the non-display glasses initially, which is, as we discussed, access to the camera and things like access to the microphones. We do plan to bring out specific support for the display glasses down the road. There will be some slight details in exactly when the support ships on the consumer side, but our plan is to support all AI glasses that are shipping, essentially, by the end of the year.
>> Excited. Thank you so much.
>> Hi. I know a few of you up there. Can you explain a little bit about how invocation works? The intents, how that setup works: not necessarily the programmatic side, but for the user, how would you invoke something like the golf experience?
>> You want to talk a little bit about the session model and how it works?
>> So, the way the model works: there are kind of two key events that happen. One is when the user elects to make a connection between a particular mobile app and the glasses, and that routes through the Meta AI app. To simplify it very much, the connection is initiated from the app, there's a confirmation dialog that occurs within the Meta AI app, and once that happens, the Meta AI app basically tells the glasses: okay, you're good to go to talk to this app and start communicating with it. There are also permissions flows that follow that in terms of starting a session. The model is often a longer session with the app (it's unfortunate the audio wasn't on for that concept video). For example, with 18Birdies, they think of it as: you are in the 18Birdies app, you initiate a session with the glasses from there, and then you go into all the interactions with the integration from that spot. That is, I think, where we're starting from with this toolkit. We want to hear back how people view that option, and whether there are other options they would prefer. That's part of what we're going to be trying to understand and explore as we go forward.
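As a rough sketch of that session model, again using hypothetical names rather than the toolkit's real API, the two key events (connecting through the Meta AI app, then starting a permissioned session) might look like this:

```kotlin
// Hypothetical sketch of the session model described above; illustrative only.
import kotlinx.coroutines.flow.*

enum class Sensor { CAMERA, MICROPHONE }
class Frame(val bytes: ByteArray, val timestampMs: Long)

interface WearablesToolkit {
    // Key event 1: the app initiates a connection, which routes through the
    // Meta AI app, where the user sees a confirmation dialog.
    suspend fun connect(): GlassesConnection
}

interface GlassesConnection {
    // Key event 2: starting a session, gated by its own permissions flow
    // for sensitive sensors; returns null if the user declines.
    suspend fun startSession(sensors: Set<Sensor>): Session?
}

interface Session {
    val frames: Flow<Frame> // live sensor data for the lifetime of the session
    suspend fun close()     // sessions are long-lived; the app ends them explicitly
}

// Mirrors the 18Birdies example: start a session from inside the app,
// then run all interactions within it.
suspend fun playRound(toolkit: WearablesToolkit) {
    val connection = toolkit.connect() // confirmed by the user in the Meta AI app
    val session = connection.startSession(setOf(Sensor.CAMERA)) ?: return
    try {
        session.frames.collect { frame -> trackShot(frame) }
    } finally {
        session.close()
    }
}

fun trackShot(frame: Frame) { /* app-specific */ }
```

The design choice of long-lived, app-initiated sessions matches the 18Birdies description above: the user starts a round in the app once, and subsequent interactions flow through that session rather than repeated per-request prompts.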
>> Yeah, I'd underline that last note. As you well know, as somebody who's explored our platforms historically, we're very much in the mindset of evolving the technology toward the needs of the customers of the platform. So your ideas and pushes are helpful.
>> Another one. So, I was curious whether there's consideration for using the neural band on its own as a separate wearable, so to speak, so I could connect it to a Quest headset, to a phone, or directly to my PC. There's a lot of really great input there.
>> What are your ideas?
>> I mean, the idea...
>> You don't have to answer that.
>> I won't go into that, but yeah, are there thoughts towards that?
>> In time, maybe. Yeah. Right now, we're very excited about the neural interface with the display glasses and all of the opportunity that's going to unlock to have more confident and frictionless ease of interaction. So that's where our focus is right now. But again, ideas are welcome.
>> I could add...
>> Oh, please. Yeah.
>> ...a tiny bit. The reason, mainly, was because, as someone who doesn't wear glasses (I got laser surgery, luckily, because I always lost my glasses), I do love the idea of what the neural band can do. I feel like that as a separate accessory is an amazing idea. I'll pass it on.
>> That's great. I think we can take probably two or three more. One more? Okay, I'm being told one more. So...
>> First of all, it's amazing that this is becoming something real and pushing the industry. One question, and obviously this is early in the roadmap: what about a single-shot experience? Maybe an event, or maybe I'm traveling and going to a museum, something I don't already have installed on my phone, something that I discover and experience one time. Is that on the roadmap? Is there a plan, or maybe even a branded experience that happens only one time?
>> I can speak to that.
>> Yeah, go ahead.
>> Explicitly among the use cases we contemplated when building this were things like museum tours and other types of things, which I think fit with what you're talking about. At this point, they would still be driven by downloading and installing an app, which I recognize is not always optimal for a single-shot experience; people would love to just shoot a QR code and have it work. I think those are the kinds of questions we're going to keep looking at as we go through the next year.
>> Well, thank you all again for... oh, we're good. Thank you all again for joining us. This has been a really wonderful conversation, and I want to thank our guests again. Tom, myself, and the team are incredibly grateful for your interest and enthusiasm. So, hit the website (we seem to have lost the QR code), and please jump in. All right. Thank you.
[Music]