2025 AI+Education Summit: AI’s Impact on Education – A Visionary Conversation
By Stanford HAI
Summary
## Key takeaways
- **Tech Initially Widens, Later Closes Gaps**: With technology adoption like Khan Academy and YouTube, people with more means adopt it first, but over time it raises the floor and levels the playing field; in one survey, 93% of YouTube users reported using it for some form of learning, even in remote regions like villages in India. [03:10], [04:06]
- **GPT Tutor Increased Dropout Rates**: In a Code in Place experiment with 10,000 students, those with early access to a GPT-4 tutor dropped out at a 3 percentage point higher rate than the control group, while human help improved completion by 10 points, showing humans inspire persistence. [09:42], [10:14]
- **Training Wheels Hide the Skill That Matters**: Like training wheels that teach pedaling but not balancing, AI helps students get answers but skips practice in critical thinking, communication, and teamwork; educators must keep the focus on the skills that matter. [11:00], [12:08]
- **AI Speeds Up Educators' Feedback**: With AI-suggested feedback reviewed by teachers in a DPO process, grading becomes orders of magnitude faster, freeing each TA for 12 extra hours of one-on-one student time and more personalized support. [16:11], [16:33]
- **Shift from Generating to Reviewing**: AI handles generation like writing or coding, so education must emphasize revising, editing, and judgment; where English classes once spent 90% of time on generating, more now goes to review. [22:25], [23:20]
- **Loneliness When AI Replaces Humans**: Human teachers provide deep connection essential for long-term learning; replacing them with AI tutors risks making learning lonelier, leaving students unsure of their place in human society over 10-20 years. [34:06], [35:03]
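The Code in Place result in the takeaways above is, at its core, an A/B comparison of completion rates across study arms. A minimal sketch of how such a comparison might be computed, where the arm names and enrollment/completion counts are illustrative placeholders chosen to reproduce the quoted +10 and -3 percentage-point deltas, not the study's actual data:

```python
# Illustrative A/B analysis of completion rates across study arms.
# Arm labels and counts are hypothetical placeholders, not real study data.

def completion_rate(completed, enrolled):
    """Fraction of enrolled students who finished the course."""
    return completed / enrolled

arms = {
    "control":    {"enrolled": 3000, "completed": 1500},
    "human_help": {"enrolled": 3000, "completed": 1800},
    "gpt_tutor":  {"enrolled": 3000, "completed": 1410},
}

baseline = completion_rate(**arms["control"])
for name, counts in arms.items():
    rate = completion_rate(**counts)
    # Difference from the control arm, in percentage points
    delta_pp = (rate - baseline) * 100
    print(f"{name}: {rate:.1%} completion ({delta_pp:+.1f} pp vs control)")
```

With these placeholder numbers, the human-help arm comes out 10 points above control and the GPT-tutor arm 3 points below, mirroring the panel's description.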
Topics Covered
- Tech Levels Equity Gaps Long-Term
- AI Tutors Increase Dropout Rates
- Training Wheels Block Core Skills
- AI Shifts Focus to Editing Judgment
- Loneliness When AI Replaces Humans
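One topic below describes teachers reviewing model-suggested feedback before it reaches students, with edits feeding a DPO step. A hypothetical sketch of that human-in-the-loop flow, where every function and field name is an illustrative assumption rather than any real tool's API:

```python
# Hypothetical sketch of a human-in-the-loop grading flow: a language model
# drafts feedback, a teacher approves or edits it, and edited pairs are logged
# as preference data (the draft "rejected", the edit "chosen") for later
# DPO-style tuning. All names here are illustrative assumptions.

def review_feedback(model_draft, teacher_edit=None):
    """Return (feedback_to_send, preference_record_or_None)."""
    if teacher_edit is None or teacher_edit == model_draft:
        # Accepted as-is: send the draft, no preference signal to log
        return model_draft, None
    # Teacher rewrote it: send the edit and log the pair for tuning
    return teacher_edit, {"chosen": teacher_edit, "rejected": model_draft}

# Example: the teacher tightens the model's draft before it goes out
final, pref = review_feedback(
    "Good job, but check your loop bounds.",
    "Nice solution! One bug: your loop stops one element early.",
)
print(final)
print(pref["rejected"])
```

The point of the design is that the teacher stays the author of record; the model only accelerates the first draft, and disagreements become training signal.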
Full Transcript
how's everyone doing today all right excited for this closing conversation hopefully all right thank you I like the enthusiasm well thank you so much for the invitation um thank you all for
continuing to stick around for the last session of the day hopefully it'll be inspiring I'm really thrilled to learn from this group of wonderful experts and scholars in the space um so I wanted
to quickly do a round of introductions um who we have joining us here today um to my right we have Shantanu Sinha VP of Google for Education he also helped start Khan Academy and has been in education for over 15 years welcome thank you great to be here um next up we have Chris Piech did I get that correctly okay um assistant professor here at Stanford in both computer science and education um he helped to start Code in Place which he'll um talk about with you all um which has helped to teach CS virtually for free to tens of thousands across the globe really exciting opportunity to learn more there and last but not least we have Drew Bent who's a Stanford GSE graduate um started Schoolhouse as part of Khan Academy and is now leading higher education initiatives at Anthropic so round of applause to welcome our panelists thank you um so to get us started um and just a little bit about me I'm Allison Scott I'm the CEO of the Kapor Foundation we're located um across the water in Oakland um and our overarching goal is to build a more equitable technology ecosystem um that can address societal and
Humanity's uh biggest challenges so we're very interested in and we believe in the potential of AI um similar to many of you who are in this room today um especially in the fields that have experienced um long-standing challenges
and inequities and so in education we really want to know how can AI be best harnessed to improve education um how can we improve learning how can we close Equity gaps for marginalized students um how do we prepare young people for a
very different future than what education traditionally has been created for um and we know that this can't happen without intentional focus so um hopefully this conversation today will illuminate opportunities and implications for um AI's uh equitable integration into education and learning um so to start us off um as AI systems are increasingly integrated into education how do we ensure that these technologies close and not widen uh existing opportunity gaps and maybe um each of you can share a little bit about
what you've seen what you've learned and maybe something that gives you hope who would like to start us off all right um yeah it's great to be here and really looking forward to the conversation here um and by the way I've known Drew for a really long time he was actually a high school intern when I was at Khan Academy so we were actually longtime um colleagues over
over 10 15 years or so um in terms of the impact on closing gaps it's actually something I've thought a lot about over the years and one of the things that you get with technology adoption um is initially you do have this risk that the people who are going to adopt it are the people with more means they're the people that have access to the devices have access to the connections have the motivation um and even in the early days of Khan Academy I remember being asked the exact same question which is the people who will come are people who have a device they have high bandwidth and can stream YouTube videos and they have motivation how do you make sure you're reducing the gap versus expanding the gap and I think one of the things that you definitely see with technology adoption is that over time you know there might be some near-term impacts but over time you can really level the playing field and raise the floor and if I think about you know like let's fast
forward 15 years from those early days when we were putting up YouTube videos really only a segment of the population was using it to today where YouTube has you know billions of users around the world I think recently we saw um in a survey 93% of people who use YouTube use it for some form of learning or informational intent uh it's you know in the most remote regions of the world and you know if you go to a village in India on their Android phone they're watching educational videos on YouTube and it's remarkable so over time it gets there and I'm pretty optimistic about the role that AI will have here because one we've already seen the last few years the cost curve of what's happening is dramatic I mean it's bigger than Moore's law you're seeing like 10x reduction in the cost of delivery you have these open source models that are coming out which really open it up for access to a lot of people you have on-device models where you can now do pretty impressive things just on a cheap Android phone um and I think that trend is just going to continue over time so as technology adoption gets there I think it really does start to make it far more accessible and more equitable awesome
thank you Chris you have some um learnings and ideas coming from your experiences yeah you know I'd like to plus-one that you know this is a very exciting time and you can imagine for folks around the world this is a great opportunity um for all of us you can imagine the things that you don't know now we now have ways to get past those barriers um so truly exciting but maybe just to be playful I could push back on a couple of things one thing that makes me concerned is the job part of it there's like what we're able to do and what we know I think this is going to be a wonderful moment but I spent the summer in Kenya which is where I grew up and I would say the level of concern I
saw amongst people who are learning entry-level data science was much higher there than I would say I experienced in the Bay Area so there's maybe this growing fear of winner-takes-all whether or not that's true we'll find out and then there's this other part of me that wonders if this might be different than the early days of YouTube which is access to the highest-end models is very valuable you know like being able to program with the most recent version of large language models feels pretty different and so the students who are from more marginalized communities are they going to be getting access to the same ones yeah so I suppose my first concern is that actually maybe the job market could be hit in a different way and the other concern uh would be who gets access to these models because they are sometimes expensive like the cutting-edge ones are a really different price point uh and so will that drive inequalities in a different way than say YouTube access yeah I think
on the access point too it's not just theoretical when we talk to students um across college campuses they often put themselves into two categories the students that have the pro versions of these AI models and the students who don't and they have dramatically different types of learning experiences especially we've talked to a lot of business school students about this and so I think that is worrying um to Shantanu's point the costs are coming down but we need to do more there both from a cost perspective but also from uh just an awareness perspective I think the other thing is how we design the models when we think about access that needs to be baked into the models from probably the earliest levels um right now so much of the conversation in AI has been around increasing the IQ of these AI models but I think especially for education the EQ part is really important and so if we are on the you know precipice of having an AI tutor or an AI coach or an AI assistant in everyone's hands we want that to be able to work with learners of all different backgrounds um and be able to accommodate all of their needs and that requires very high EQ and so I know at least at Anthropic that's something we focused a lot on is the character part of it and I think that's also going to be a key part to make sure that this is broadly available and helpful for um you know all types of learners super helpful hello there we go super helpful
and if I if I could add one uh one additional thought that kind of keeps me up at night is how do we think about both the utilization of tools and also preparing young people to be critical consumers of those tools and how are we
integrating both together um and are we focusing more heavily on one or the other uh but I think there's a ton of opportunity to do both um so um we know
that many of the AI-driven tools that we've seen um have great potential to scale uh learning experiences globally and Chris you've seen this firsthand um and Shantanu as well uh however how do we ensure that the scale does not come at the expense of uh pedagogical depth um cultural relevance and also relational aspects that we know are so critical to education and you all
started to touch on it so would love to hear um additional reflections yeah I'll just jump in okay um it's so interesting right I actually really love the presentation before about this learning through creation because that's maybe the place I'm most excited about like the ability of students to make things and that turning school from maybe a drudgery to something they're very excited about I think there's so much potential there uh and maybe I had this one interesting moment of caution so we run this Code in Place class 10,000 students every year a thousand teachers uh and it's also this great test bed for different technologies so when we have new ideas in artificial intelligence we can A/B test them and see how they impact uh different learning outcomes and I think that's a little bit special I think there's a lot of people throwing AI tools out there without the ability to
find out is this actually benefiting our students last year we did this interesting experiment that had two parts in one part people from 150 countries who were in the experimental arm got early access to a GPT tutor and in another arm the students were able to be helped by humans so if a human says I want to help somebody they could drop into their sessions and the results were quite interesting right the control group call them neutral um did pretty well the folks who got access to humans so they would be something like near-peer teachers people with a little bit more experience coming in and spending 15 minutes with them 10 percentage point improvement in completion it was very exciting and then on the other end the students who had early access to a GPT-4 based tutor they might have done better in the class but they were dropping out at a 3 percentage point larger rate and I don't know what that told me but maybe one of the takeaways was I'm excited for generation but I feel like when we get to these sorts of basics of education I suppose we shouldn't throw out the basics like humans can be really inspiring great content great pedagogy can matter uh so I suppose there is like excitement and room for caution yeah and that's a really interesting study you know one of the things it reminded me of is um you
know when people are using technology and particularly students they're really bad judges of you know they think they know more than they do and it actually reminds me of you know when my oldest son was learning to ride a bicycle and I made a mistake a lot of parents did which is I put training wheels on and he's all excited he's going from point A to point B thinking oh I'm really good at riding my bicycle and I just could not get the training wheels off um you know take them off and he throws a fit and okay I've got to put the training wheels back on um until one of my friends came and said you know you really made a big mistake you should not put training wheels on a bicycle because what that does is if you think the skill of riding a bicycle is learning how to pedal that's great but it takes you like 30 seconds to learn how to pedal the skill of riding a bicycle is learning how to balance and by putting the training wheels on you're taking away your ability to actually focus on the skill that matters and I actually think you know I mean there are really interesting studies about how people are adapting to this but you do have this impact where students will often think they know or think that they're learning really well with AI they're getting help like oh I get it and it helped me through that process but they're not necessarily practicing the skill that actually matters and I think that is where the role of the educator to me is like the most critical part of this equation because the educator can come in there and say well actually it wasn't about getting the answer right on a piece of paper that's like pedaling to the end goal it's not even about writing a five-paragraph essay that's just getting from point A to point B it's about the critical thinking it's about the communication it's about the teamwork like those are the skills that you're really trying to reinforce and I do think as we think about how we
bring this to um at scale it is going to just be so critical to think about the human systems around it so that people are actually embracing the technology the right
way I think there's also a shortage of creativity with what these AI and education interfaces can look like we're so stuck in the sort of chatbot format that it's hard to think about alternatives here and you know I think chatbots generally work very well for productivity and work use cases
um but you don't interact with the teacher through a chatbot and for a good reason and so I think we've seen a little over the past year with things like Artifacts or I think Google's NotebookLM of just showing what different media can look like uh either on the creation side like Chris was talking about or um just different types of interactions where you can't always find a human analog of what a student and an AI are able to do and so I think we're just at the early stages of trying to figure out what these new interfaces
look like and I think many of you here are trying to build those and I think that's really important can I ask a follow-up question um related to some pieces that you all touched on um what do you see as the evolving role of educators so um if we were to think about you know not just the technology but the ways in which um educators who have been so critical to the outcomes of students um are able to adapt and utilize these technologies in the most effective ways like what do you all see as some promising examples of like the new potential role
of educators and you all have highlighted the importance of educators continuing to be critical in this conversation yeah I think educators are really kind of the linchpin of education systems and I think there's a few things that we're seeing so one is I've been pleasantly surprised at how quickly educators are embracing AI to help themselves in this way I think it's one of the things like there was kind of the knee-jerk reaction when these chatbots first came out like oh maybe it's used for cheating et cetera um but now you do see a lot of educators saying well this actually helps me write these emails all these tasks because any task an educator does you have to multiply by 30 to 100 so anything you do it's like a lot of work so you know AI can have a huge impact here we've even seen like a few months ago um on search on Google if you typed AI for the first autocomplete was AI for teachers which surprised me and then I thought maybe that's because I'm at Google for Education but I checked and it still worked it was actually the first autocomplete it's actually one of the first areas you see really strong product-market fit so I think the first step is how do you actually help educators scale themselves get time back um and we're spending a lot of time thinking about that and focusing on that then I think the second part is the role of the
classroom is going to evolve like I was saying I think the reason the educator is so important is they're the ones who can look at that and say don't focus on pedaling focus on balance let's make sure that we're actually developing the skills um the right way and I think that's um you know I think there's a lot of exciting stuff there but there's also a lot of change right in terms of like how do you bring this stuff into the classroom in a reasonable way and
and that is definitely going to be something that um it's super important and we spent a lot of time thinking about how do we support educators through that I think a lot of teachers could probably tell a story similar to what I'll tell but so I teach this class for computer scientists um and over the last year our ability to give feedback to students using AI has dramatically improved so we still have humans in the loop but our teachers they're first doing a DPO process where there are suggestions coming from a large language model so we're able to grade orders of magnitude faster and of course we're doing the obvious thing with that we're spending all of our extra time on one-on-one face time with our students so now uh each of my TAs is spending 12 extra hours one-on-one with students and I feel like that's made all the difference uh so maybe the AI is buying us time on things like grading and then we're able to help people the other thing that I'd add to that is in Code in Place there are two contentions you know we have one teacher for every 10
students and it's obviously because we think the teacher is important so like I'm harping on the teacher is important here but you know there's this flip side to that I think teaching is important in this moment where we might see a split between entry-level jobs and those who are really advanced we are going to need more people and more places where you can do something on your career path teaching is what a wonderful thing to do you get to share your knowledge with others so I think there'll be more demand for people who want to teach I mean there's an argument to make that in an AGI world the majority of people will be teachers so much of education is about or should be about social development turning people into great citizens of the world thinking about agency and a part of it has become about learning skills that will lead into jobs but I think that's actually distracted education from its like core goal which is to bring together communities socially and why we have a public education system and that's the founding of the US public education system and in the last 100 years it's become very much about social mobility and leading into jobs but that starts to break down in a world where if you just think about it by the time that AI can teach a student how to do skill X and skill Y that AI can also do skill X and skill Y itself and so if we're basing all of our education right now on teaching people how to do these skills that they'll then use in jobs that the AIs are going to be able to do that's a losing battle right there and so we just have to remember that education and our education system is not just about learning skills it's much more than that and so I think at least from the K-12 perspective I think this is very clarifying actually for the education
system perfect transition into the next question which is um even beyond things like automation and efficiency how do we think about using AI to unlock human potential in learning um we've often heard and talked about learning looking very different from what it's looked like in the past um how do we enhance curiosity creativity lifelong learning if we're going to see massive changes in um the job markets um how do we think about learning in a totally different way um so curious your perspectives both from industry and from higher education as
well yeah I mean you know as Chris was mentioning about the creativity in the prior presentation I'm super excited at the potential there I do think it is so important that we actually really spend a lot of energy experimenting and defining uh new types of pedagogical experiences that unlock that um and I do think you know as you start to see with AI right now the content transformation piece is huge right like NotebookLM was mentioned before like you could now take a text of something and turn it into a podcast and that's just the beginning right it's not that far away before you can imagine you know video on the fly that's personalized to me or very different types of experiences um and this is going to explode over the next 5 10 years so um what type of new experiences can happen in the classroom um I'm really excited about things like you know when I was a kid if you wanted to make a video I mean we'd take some camcorder out and it would be like some really sad thing now they're making like Star Wars quality special effects and like really amazing the tools that they have are incredible right so I do think that opens up all kinds of project-based learning all kinds of new experiences for students um I also think AI can be used in such different ways you know moving away from those traditional assessment models of the five-paragraph essay or the multiple-choice exam and more into okay let's have a debate with an AI on this topic and really test how well we understood the material like the Harkness table that Phillips Exeter teaches with like maybe you can have those types of experiences now with um AI agents and I think that again it'll take some time to explore and figure out how to bring that um to users but I think all of that speaks to where the future of these educational experiences is going to go and how it's going to move away from lecture homework test yeah for
example we've seen students at Stanford use Claude where to be aligned with the Stanford honor code they'll upload their syllabus they'll upload the uh the course's AI policy to tell the AI to follow that AI policy and then they have an oral exam in one of these classes where like every week they have to meet with a TA or professor and they get quizzed on how well they know um the material as a side note those professors could probably also use AI to start to scale that up but what these students have started to do is they are doing these mock oral exams where they ask the AI to quiz them to sort of stress-test their thinking around these concepts so that they can come in really prepared to these uh you know real oral exams so I think there's going to be lots of use cases like that I think the other shift that I see happening with learning across a bunch
of different fields is there was always some distribution of work spent uh learning how to generate content or a type of work and then reviewing and evaluating so for example if we just look at you know English classes at least I remember in my classes we'd spend 90% of the time on the generating part and then 10% on the revising part it was kind of like a last-minute you know thing you just add it on um now with AI at least in terms of how people are using it in their work so much more of the generative part is something you're doing in conjunction with the AI and now you have to spend way more of your time revising and editing same with code generation programmers in the past spent all of their time writing the code and then the code reviews were just like a formality almost um now you see at all of these AI labs and everyone using these you know latest AI models a lot of time is spent reviewing code not necessarily from other humans but maybe generated by AI and so our education system spends a lot of time teaching the generating part not always the reviewing and the editing part and I think there's a key question here which is how much do you need to go through the generation part in order to be good at the revising and the editing part um because you clearly can't just go straight to the editing part and the having good judgment part and having good taste part and so I think across all of the education field we just have to know where it's headed and then sort of work backwards to figure out what do we need to teach for students to get
there thank you um I have a few more questions I think one on the positive side um what do you think is the most exciting or unconventional way that you envision AI shaping the future of Education if you haven't already
mentioned it and then I'm going to ask the um the converse as well maybe I can start with that one I think you guys might know that like I wrote a book with my then 2-year-old daughter I feel like this was eye-opening for me I just saw the generation and like the thrill it gave her and then certainly there's an element of like we have people writing code by hand that they can take photos of I'm very excited about that but I think I actually want to focus on just programming for a moment I think a lot of people in this room are programmers and can we just note that this is the most exciting time ever to program like we think that these large language models are good at English no they are amazing at Python they are so good at JavaScript and like if you know that foundation of central programming knowledge like I spent 5 days programming and I did in five days what I used to do in like half a year and I felt so empowered there was a moment when I was worried about these large language models like taking over all of our jobs and then like the thrill of those five days and I'm like oh I want this for everyone I want it for my daughter I want it for all of you I mean I think that actually programming and being able to take visions of what a computer could do from your mind and turn them into reality getting that into everybody's hands you know teachers' hands you learn that nugget that underlies you know the core principles of Python and you can go on a road trip that will allow you to create things that you could have just imagined before isn't that an exciting [Laughter] moment you know I agree with that I
mean but it does speak to like where the future is headed are we going to be programmers are we all going to be product managers who are just like envisioning these things and you know if you asked me last year I would have said product manager and if you asked me today I'm like I'm a programmer and in fact actually like a mathematician like I've done proofs that have pushed me farther than ever and I used to think we would just be programming and then one day LLMs would be so good we would stop and now I'm convinced it's like the transition from machine code to Python it didn't stop people from programming it's just programmers could make orders of magnitude more there's a Cambrian explosion I think we're not going to be product managers I think we will be creators and just the things we can create maybe in an afternoon my dreams will come true yeah I think just to build on
that I think if I think about the future it is the creation aspect that I think is just mind-blowing it's creating programs it's creating podcasts creating videos it's creating anything like that is where we're headed and I think that is also where the power of these AI models comes in in some ways that's where hallucination is no longer a flaw but a feature because it helps us think in new and different ways so I think it is one of those um areas where I think we're still in the very early phases of understanding the implications of that when creation becomes that democratized you
know across all these different fields I agree uh at Schoolhouse my previous role that's listed up there uh we have a programmer on our team who um was a chemistry teacher and uh she learned how to program late in life because she wanted to teach her students about how to balance charges and so she Googled how to program and found Scratch from MIT and uh took a year to figure out how to build these interactive uh activities and practice problems she could share with her students and then they could you know use her creations um and that's really inspiring and actually what I love about it is once she learned how to program she switched into software engineering and we were able to bring her on uh to work at Schoolhouse but unfortunately she's in the 1% of folks who are going to be able to make that transition and so to Shantanu and Chris's point uh I was just talking to a teacher about this yesterday who just in the course of a couple minutes just you know uh spun up an interactive uh flashcard app for her students using Claude and that's incredible if now every teacher can do that every student can do that uh and so yes I think if we take this seriously whether we call it a product manager or programmer I don't really care uh it's some hybrid we see product managers becoming more like programmers and programmers becoming more like product managers everyone is going to be like that teachers are going to be like that students are going to be like that and so um it is just incredible when you have that much code being generated every day and anytime you have a thought some code gets generated to help create that thing for
you.
Thank you. Just one more question. I think this has been a really exciting and inspiring conversation, and I'm not intending to end on a negative note at all, but I think it's important to also think equally clearly about potential risks. If we think about the broader AI space—we've talked a lot about the importance of harnessing AI's great potential—how do we think about harnessing that potential in education? What are some of the risks that you all are seeing that we as a community should be very intentional about addressing? And also, acknowledging the current state of K-12 education: as I'm hearing all of these really exciting ideas, and understanding that many of you work day-to-day in education spaces, there are so many structural challenges that we're still trying to deal with. How do we balance the exciting opportunities with the realities of the day-to-day context that we're discussing?
So, potential risks. I mean, I think the first one is just that we've spent so much time talking about the technology—and we're all
programmers, so we've converged on that. It's a very important thing, and I think it is. But, as has been talked about at this summit, we really just can't do technology for technology's sake. I think the reason why all of us got interested in education was for equity reasons, and the reason why I joined an AI company is because I think AI has the potential to really close these equity gaps—but, just as importantly, has the potential to actually widen equity gaps if we're not careful. And so that is exactly the type of challenging area I like to work in: where we can help steer things in the right direction. But there are risks. When we talk to students and professors, the professors are still worried about cheating. I know a lot of people say that's over, but I think every single institution and every single individual is at a different point on the path from AI being used for cheating to AI being used to transform learning and teaching, and many, many people—including many professors at a lot of these top universities—are still worried about it. And then the students, just as importantly, are worried about brain rot. This is something really interesting as we talk to students: there's more and more of a concern of "I'm offloading my thinking to an AI. It's the short-term solution to completing my assignment, but it's going to hurt me in the long run." And I think that's a failure of our tools. Students have the right intention, but if we make it so easy for them to go down one path, they will go down that path. So I think there's a lot more evolution we have to do there. Those are the things I'm thinking about, and there are many others you'll hear about.
Yeah, I would echo that. One of the things we say at Google is that we're going to be bold and
responsible, and the responsible part is a really important part of that statement. It's this misapplication of AI that I think is one of the biggest risks, and it's important to recognize that the forces pushing toward it are actually really, really strong, which means we have to be very cognizant of it. So, one, as Drew was just mentioning: students are bad judges of this stuff. They'll offload things. Learning is struggle—you have to have productive struggle—and everybody wants to make it easier, but making it easier doesn't make it better learning. It's really hard to get that balance right, and it's really hard if it's just the technology and the user trying to get it right, because the technology is just going to support what the user wants. You need products that get used, and all of that. But I think even on the educator side: one of the first reactions to the cheating risk was people saying, "I need an AI detector," and you had people saying, "Okay, we can figure out these AI detectors." We looked at that at Google and said: that doesn't make any sense—you can't really build an AI detector with any real level of confidence. But the demand was really, really high. People really wanted it. People went out there and said, "Here's an AI detector, it does these things," and people adopted it. And the risk of that misapplication is huge, right? If you have an AI detector that 10% of the time tells the teacher a student cheated when they didn't cheat, what does that mean from an education-system perspective—the biases, all the different things that might be happening there? And when you see all the new stuff that's coming out—these models are amazing, they're getting something like 80% on the AIME exam, which is a really hard math exam—what about the 20% they're getting wrong, and what does that mean when that's just being thrown out there? And, as we were saying before, people will trust this stuff more than they should in many cases. You're in a self-driving car and after a few minutes you're like, "Okay, it seems like it can drive"—and it can, now, very well, I've tested it—but it is one of those things where that last five or ten percent is where all the work is. It's so important for us to be cognizant of that, and to recognize the forces where people will try to use this thing in ways it's not really ready for. You have to be really focused on the responsibility part of
that.
Listening to these responses, I'm going to go with one word, and then I'll explain myself: loneliness. I think this is the biggest risk we've got. I came from AI—I spent my PhD coming up with algorithms that folks like Duolingo use to trace your knowledge—so I came from that perspective. And then I had this experience of seeing how much human teachers mattered to students, and I've spent the last few years reflecting on why that was getting such a lift over the AI. I had different theories. My first theory was some sort of AI identity threat—do you know what identity threat is? It's when having to confront your identity can distract you from your learning—and I was thinking that identifying as humans being confronted with a very powerful AI was preventing learning. I actually don't think that's it. I think it's just that human connection is so deep in our minds, and if we start thinking we can have an AI tutor replace a human tutor, then learning could become lonelier. Those students might do well for a little bit, but are they going to do well over ten or twenty years' time if they're starting to feel more unsure of their place in the world, unsure of their connection to human society? So I think we should all be thinking about this a little bit: how can we work toward a vision of a future for education that's not just making us all more skillful, but also making us feel more connected to our fellow humans? So there—I'll go with loneliness.
That was excellent—thank you for ending us on that note. This has been a really inspiring and engaging panel, and, Isabelle, if I'm correct, I think we might have time for some questions. Oh, yeah. So maybe first a quick round of applause for our panelists. That was awesome.
Shantanu, on YouTube, can we one day watch a lesson of Chris's and interrupt him and ask him a question, or have him ask us a question?
I do think so, by the way. On the podcast-generation front, we've now actually rolled out the ability to call into the podcast on NotebookLM, interrupt it, and ask a question in the podcast, and it's kind of a surreal experience. So it's not very far off. I can't speak to YouTube's product roadmap plans or any timeline—video generation is quite a bit more intensive than audio generation—but that kind of interactive experience is not that far away.
But that's an interesting example of the symbiosis between what the AI does really well and what the humans do well, because Chris is a personality, and all of his students love him, and they love him because he's a human. But it would be cool if they could watch your recordings afterwards and also engage with you, maybe at scale, with thousands of people around the
world.
As someone who doesn't write a lot of code—though I can write code if needed—I'm excited about the prospect of LLMs writing code for me. But I wonder what happens when things go wrong. Do we end up with a generation of people who get used to this and aren't able to find the mistakes? Maybe this is an instance of the brain rot you discussed, but how do you think we're going to deal with that?
Yeah, I think it's a great question. It reminds me of when we all got navigation systems in our cars, and now, all of a sudden, in a new city you can't drive anymore—you can't figure out how to get from point A to point B—because you've completely offloaded your sense of direction to the computer. Or you're lost without your phone and all of a sudden you feel, "Oh my god, what do I do?" So I do think there's a real, big risk there. And to me the fundamental thing comes back to: what are the skills we're going to keep needing, and what do these professions move into? It depends on how good AI gets and all of that, but I would imagine the important skill for a lot of these professions ends up being this: anybody can draft a lot of code really quickly, but how do you bring it together, how do you make it work? That becomes a really important skill. Maybe AI gets really good at that, and then you can offload
that thing too. But I think there's always another skill that becomes important, and that's really where we need to focus.
I love that question. I'll just ask, really quickly: is this another layer of abstraction? There was a moment when we wrote machine code, and then we invented this layer of abstraction—Python and human-readable programming languages—and that was fantastic. Is this just another layer of abstraction, or is it deeper? Maybe we'll just leave that as a question.
I'll try to answer the rhetorical question. I think AI, if done well, can also help you go down into lower levels of abstraction. If you're working with the AI and there's an issue, you can pretty quickly, if you want, get to the machine-level code and have it explain to you, in a targeted sense, what you need to know. I think the other part of this is that with all the latest reasoning models having good chain of thought—where you can see it, giving you some auditability of these models—and then thinking through the learning side, you can learn how to start to understand the machine's version of thinking and go back and forth. I think that's important, because if you just make a huge change to your code base and you don't know what changes it's making, and you can't audit it, then that's
a problem.
You can use my voice. He's right after—okay, you go first.
So, thinking back to the dawn of social media, and the current situation and fallout from that—where we're learning about mental health issues, misinformation, and impacts on democracy and elections—what do you think are the potential future doomsday impacts, and how can we safeguard against them? I feel like this is the next thing, and it's a little scary that we're the guinea pigs in this.
Yeah, I was going to say—I've heard that at Anthropic a lot of people have a really high p(doom), the ratio every meeting starts with, if you believe it. But I do think there are real risks here, and I think it's just what it's like to be at the precipice of the unknown. I do think things like loneliness, like dependence on technology—it will change things. Obviously, I'm sure even in my generation people were watching TV, and I remember my teacher calling the TV the idiot box because we weren't reading anymore. Everything changes; ultimately, did that make us worse or not? I don't know. And now we look at the mobile phone. I think things will keep evolving. It is hard—it is change, though, right? And I think we are definitely going to miss aspects of what society was like, and there are going to be aspects that are a million times better in different ways. The best thing we can really do is keep an eye on the risks. And I think with the long-term risks—dependence on AI, getting attached, embodiment, people personifying AI—there is a real risk to a lot of that stuff, and I personally think you have to be very, very careful with it, particularly when you're dealing with people under 21, under 18 years old, and really be very cautious with a lot of
that.
Can I add one quick thing to that? As someone whose background is in education and not in computer science or engineering, I think a lot about the more than one hundred years of scholarship that we have on child development. Something Shantanu said really resonates with me: what are we trying to solve for? Technology can't solve it all, and technology is not the silver bullet, but if harnessed correctly it can be very powerful and do some really amazing things. So I think we still need educators, we still need education researchers, and we need the integration of that knowledge into everything we're doing—the technology development, investment, policy, etc.—or else we're going to over-index on technology and miss all the things that were already mentioned here.
Okay. Thank you very much. I just wanted to speak about something there
doesn't seem to be a consensus on right now, which is ownership, in the context of education. When you were talking, you were saying: when you do this with AI, you generate this thing, you modify it, you fine-tune it, and technically it becomes your own, and so on. But recently—maybe two days ago—I think I saw in the news a student from Minnesota: "I used AI," "I didn't use AI," just accusations back and forth; "No, I did my thing, it's my thought, I put it into AI, and it becomes my work." And I feel that out there in the industry it's like the ends justify the means, but in education it's, "Okay, this is not your work." So how do you see that moving in the next couple of years? What would ownership look like?
Yeah, I think it's an open and interesting question, and something we've been thinking about a little bit too, because I think you're exactly right. If my daughter writes a book with ChatGPT, and it does all the illustration, is it her book? And actually, some of her ideas, to be honest, are a bit derivative of another book we were reading—but that's not that important right now. I think there's an interesting intellectual problem we could pose, which is: how could we quantify how much you own of a process? How many decision points did you make? I feel like there's a way we could start to talk about what it means to own a product, and what percentage of it you feel you contributed. So I think this is a cool question; we'll come up with some neat answers, but I don't have them yet.
Awesome. Hey, my name is Tyler Wright. Shantanu, you talked about
some cool stuff around being creatives and accelerating ourselves toward that. Chris, you mentioned something similar—whether we're product managers or coders, we're probably a lot of things, and that's probably part of the answer. And Drew, you talked about not training ourselves for the things that are actually getting solved, which are a lot of today's jobs. Where that leads me is that there's a tension starting to arise around the way we look at curriculum today and its ability to evolve and advance. If we can earn two PhDs in the time it takes to get one, then we start to challenge time—how we grade and mark progress—as well as what we're learning, what's in the curriculum. I think there's this interesting tension in questioning curriculum, and I want to put it to you: it's such a foundation for what teachers teach today, but it's also the foundation that needs to evolve. So I wanted to get your opinions on that tension—on curriculum's ability to keep up, or to lead, toward where we want people to get next, to these great multi-creator states
as you described.
I 100% agree. So much of the conversation around AI and education is about using AI to change how we learn, but more importantly, if we look a little ahead, it's about changing what we learn. And there are some analogies I've heard people share that are interesting until they break down. For example, even after the Industrial Revolution, we still do physical labor—in the context of gyms. What does that look like for learning? Even when we don't have to learn a skill for an economic reason, what is the equivalent of a learning gym? What's interesting is that if you take that idea seriously, then the concerns around AI and cheating, even ownership, all start to look a little different, because we worry a little less about cheating in gyms. It happens, but the real problem in gyms is an agency problem—the things we feel we should be doing versus what we're actually doing—and I think there could be similar analogies in education. The other anecdote I'll pull out: before calculators, people would learn how to do logarithms by hand and long division by hand. After calculators, we don't really learn how to do logarithms by hand, but we still learn long division by hand. So even if everything can be done by the AI, there's still some thought that has to go into which things are still worth learning versus not, and it would be nice if we could just bring together all the minds in this room and backwards-plan into figuring out what
those things are.
I also think there's a difference in that question between higher ed and K-12. A lot of K-12 education—I mean, balancing chemical equations and the stuff I did in K-12—those weren't really economically valuable skills. The reason for the learning was the thought process, the problem solving, seeing if my mind could learn new concepts. That's what it was actually about, and I do think that's what a lot of education is. But I definitely agree on the question of which skills you're going to need: if an AI can balance chemical equations perfectly, and has become this amazing calculator for every task, how much do I still need to learn that? But I think it still goes back to the fact that a lot of education is those other skills—the collaboration, the critical thinking, all of that. That's what you actually do in your jobs and, for the most part, in real life, and I don't think that ever goes away. And I think part of the reason long division is still taught is probably that it's important to learn how to follow rules—there are aspects of it that are still valuable—and I think that's always going to be the case.
One small dissent. I don't have to teach K-12—and by the way, I have a lot of appreciation for all these meta-skills: how to choose the right problem, how to learn, all incredibly important—but I'm excited that I get to be in this slightly freer place where I get to change my curriculum. I get to think: okay, you've learned to code the same way for the last 20 years, but now we can rethink what that looks like, and as a teacher that's exciting when I'm given the space to do it. I still want shared experiences for my students, but I feel like learning to code must be different in the next few years.
Hi, thank you so much for this inspiring panel. My question isn't
about AI's impact on education, but about AI education—AI in computer science education—and it's twofold, and a bit controversial, I'm sorry. I run a pre-accelerator training program for aspiring AI startup founders, including founders who do not come from technical backgrounds: people who did not study computer science, for whom it's too late to go back to undergrad, and who want to build AI products. In my program we chose to teach AI product development using no-code and low-code tools, something I think the Stanford computer science department is not yet fully willing to adopt—sorry, Chris. So my question is twofold. One: if you were in our shoes, teaching non-technical aspiring founders who come from social-impact fields like education, climate, and sustainability—people who want to build AI products to drive impact—how would you teach them, given they don't come from technical backgrounds? And if we use no-code/low-code tools, what are the basics we have to cover? And second: in your opinion, what is the future of computer science education? I know Stanford might be the last place on Earth to adopt teaching computer science with no-code/low-code tools, but when do you think that might be possible—ten years, five years? Thank you.
Can I take this one? I love the question, and I'm
going to qualify everything: I could be wrong, I could absolutely be wrong. First of all, I'm really excited to tell you about teaching these humanities professors. I'm going to be teaching a group of ten humanities professors who came to me and said, "We want to learn how to program"—not because they want to be computer scientists; they're very successful professors in their own right—and I'm so excited to teach them; it's going to be so much fun. Now, I have an opinion—I could be wrong—that English is not precise enough for describing the world, and I believe that programming is a slightly more precise language. I also think there's such a small fundamental core to programming that you could learn all of it in about a two-week intensive experience, and beyond that, very soon you will not need to learn all the extra APIs. So for those two reasons—because I think English might never be precise enough, and because I feel the core of what you really need to know to have mastery over this is shrinking—I continue to be in the camp of: I'm going to teach you the fundamental skills, and then I'm going to watch you flourish, and you'll be able to go toward low-code experiences very quickly. So maybe I'll be low-code with you—or maybe no-code is the right answer and I'm just stuck in my ways; that could be the case. And maybe, tactically, for low-code/no-code: a few years ago there were different tools that would do that, but I would encourage these students, and everyone, to just go straight into the coding editors themselves, whether it's Cursor or Windsurf or GitHub Copilot,
because the secret there is that the programmers are doing what some people call vibe coding—but it is low-code. Programmers are doing low-code now, and the tools are pretty powerful. But, to the earlier point, they don't abstract away all the code, and I think that's important. And, importantly, they won't just write code for you; they'll also answer questions about the code. When I'm programming in a new code base, nine out of my ten requests won't be asking it to write code—they'll be asking it to explain where in the code base a particular feature is implemented, or why some code works the way it does. It's actually the perfect learning experience if you just know you can do that, and that's how you take someone who's never written code before and they learn how to program—they can do it in the process of building a project.
Thank you so much. I wish we had more time to take all of the questions. This has been such a great conversation, both with the panelists and with your questions, which have been really thought-provoking—I have a lot to think about when I go home tonight. But we want to make sure we get you all out in time for the reception, so thank you all again, and I appreciate your contributions to this panel.
Thank you. That was very fun.