
GopherCon 2025: Go's Next Frontier - Cameron Balahan

By Gopher Academy

Summary

Topics Covered

  • AI Amplifies Developers, Doesn't Replace Them
  • Software Engineering Outlives Programming
  • AI Code Generation Creates Review Bottlenecks
  • Go's Design Makes AI Code Legible
  • Go Ecosystem Flywheel Accelerates with AI

Full Transcript

[music] I'm Cameron. I'm the product lead for the Go programming language at Google. I

don't have all the answers, but I try to find them. In this presentation, though, I'm going to tell you a bit about the Go team's vision for the future of software engineering and what

we're building to get us there.

Before we do that, I'll tell you a little bit about myself for a second.

I've been on the Go team for a little over 5 years now, and I've been product lead since 2022. As I mentioned, this GopherCon is extra special, not least because New York

is where most of the Go team sits, and so you can find us here this week at our booth or in the hallways. We'd all love to chat, so don't hesitate to approach us.

I myself have lived in New York for 19 years, which is like a badge of honor. New York can be a hard place to live, and sometimes I think I've had enough. But whenever I go anywhere else, I'm reminded why I stay. New York is this place of pragmatic energy, authenticity, and this unique brand of kindness.

In the next 20 minutes, I'm going to talk about what the Go team sees happening in software engineering and how that impacts the work we do. It's

probably no surprise that AI is going to be a big part of that. First, we'll

touch on what we see happening to software engineering generally, how AI impacts how we build. Second, we'll talk about Go, you know, our founding principles, our legacy of leadership,

and how the evolution of software engineering intersects with our principles.

Finally, we'll talk about what we're doing about it, a broad description of what's ahead.

Now, before we get started, I want to acknowledge that many of you are probably really tired of AI. There's a

lot of hype and snake oil, a lot of misconceptions around what it can do.

And I appreciate that that's really frustrating. I think this story is true.

There are some big voices in the industry that have these sorts of bombastic things to say. Jensen Huang

says that soon nobody will have to program, that the miracle of AI is that the programming language is human. Elon

Musk says we won't need to write code and we won't need software engineers.

Then there are other voices, like Google's own Sundar Pichai. They've got a different take. They say that AI will empower developers. Sundar said something more recently about envisioning that we'll need to hire many more software engineers because the opportunity space is expanding. So you know, yes, developers are more productive, but that just makes their jobs more important.

Satya Nadella says something similar. He says that AI amplifies what you do. The developer is the pilot. The AI is the co-pilot.

And then there are some practical opinions that I think we can all resonate with. This is the one I relate to the most: I want AI to do my laundry and dishes so that I can do art and writing. In many of our internal tools and processes on the Go team, we've sought to identify the laundry and dishes in software engineering so that we can delegate it to AI.

And then aside from what people say, there's also the data. You know, most engineers say they're using AI a lot.

Nearly half told the Stack Overflow developer survey this year that they use AI tools daily. So there's probably something there, even if the hype can

sometimes overshadow the virtue.

But I don't want to ignore this cohort either. You know, I'm sure some portion of these respondents actually don't have a use for AI, and that's what they mean when they say they don't plan to use it.

But I'm also pretty sure that a significant portion of these respondents think that AI is bad. And I just want to say that I see you. You know, the Go team sees you. And we're committed to

doing our level best to make sure that what we build is grounded in advancing the same core principles that we've always sought to advance, not the hype or the bombastic rhetoric.

So, all of that said, here's what we've observed, and what is a core theme for the rest of this presentation: when it comes to developer tools and programming language platforms like Go, AI and humans have surprisingly similar needs. Clearly, engineering will evolve as it always has. And as it evolves, developer tools like Go will also evolve. For our part, Go will continue to provide the tools and capabilities that make both humans and AI productive and all workloads production ready.

So, let's dig into it. First, let's talk about software engineering.

And the first thing we should do with that is define what software engineering means. Russ, our former TL who's in the audience, writes that software engineering is what happens to programming when you add time and other programmers, which is a riff on an

original idea by our former Google colleague Titus Winters, whose phrasing was software engineering is programming integrated over time. The significance

of this definition is in how software engineering is distinguished from programming.

Where programming is about solving a problem by writing code and then running it, software engineering is the act of collaborating with others to design and implement a durable system that evolves over time. So programming is a part of that, but it's just a part. As we'll discuss in the sections that follow, Go's principal objective is and always has been to serve the whole of software engineering, not just the programming part.

So how does AI change any of this? Well,

let's walk through how we see AI show up in software engineering today. So we

started with predictive text, you know, just an extension of traditional IDE autocomplete with code suggestions. We

get this with tools like Copilot or Cursor.

The next step was conversational interfaces in the IDE, tools to chat with your code. This sort of functionality came in standalone chat interfaces like ChatGPT and Claude, but also in the same tools and plugins we use for predictive text, again like Copilot or Cursor.

And more recently, we've seen what I'm loosely calling coding agents. And these

are often CLI interfaces with a REPL, where the context is the entire project. Some examples are Claude Code and Gemini CLI.

And we're also seeing agents all across the whole software development life cycle. These are often autonomous services that run asynchronously, just fire and forget. Some examples are Google's Jules and the GitHub Copilot coding agent.

And this timeline of evolution tracks how developers today say they use AI. So

from the Stack Overflow survey, we see that the number one use case for AI is writing code. The next highest-ranked use cases are a lot of tasks related to learning, like when you're chatting with your code. And finally, we see the stuff in the rest of the SDLC:

code review, creating and maintaining documentation, testing code. Only 10% of these respondents said they used AI for deployment and monitoring compared to

nearly 60% who said they use it to write code.

Which has led to this interesting phase where AI is used disproportionately over the software development life cycle, which creates a bottleneck in existing processes.

Because AI reduces the cost of producing code, there's a lot more code being produced. With a lot more code being produced, there's a lot more code to review. But we don't use AI to review code as much as we do to write code, so developers are spending more time reviewing code than they used to. That's the current imbalance in software engineering in the AI era: AI is doing

the art and writing and leaving us to do the laundry and dishes.

This all leads to the takeaway of this section, which is that the current state of AI-powered development has changed the roles of engineers and programming languages in at least three ways.

First, AI and agents are doing more code generation and editing, whether through code suggestions, in REPLs, or in autonomous agents. This makes rapid iteration a component of agentic productivity. Agents are more successful when they have a fast dev loop.

Second, the growing imbalance between the costs of code generation and code validation means all the stuff that makes code easy to understand and reason about is even more important: readability, typing, stylistic consistency, easy concurrency.

And third, ecosystem. So, as we'll discuss later, the ecosystem is what makes a language. As AI chooses more dependencies, including tools, libraries, and services, we need tools and signals that ensure the AI's choices

are secure, reliable, efficient, trustworthy, and well-maintained.

Next, let's talk Go.

As Rob wrote back in 2012, Go was designed by and for people who write, read, debug, and maintain large software systems. Go is about language design in

the service of software engineering.

And language design in service of software engineering actually requires not just a language but a platform built for software engineering. That meant

being opinionated so whole teams could structure, format, and test their code in the same way. That meant promising that the code you wrote today would not

only still work in 10 years, but would still be good code in 10 years.

And over time, it became clear that the service of software engineering required not just a language and a platform, but also an ecosystem built for software engineering. That meant building a global system for software dependencies that could scale with your teams. And as security became increasingly foundational to software engineering,

software engineering required not just a language and a platform and an ecosystem, but one with security woven through it.

Go is a platform in the service of software engineering. Software engineering is what happens to programming when you add time and other programmers.

And here's the thing: increasingly, AI is a teammate and benefits from many of the same tools and platform features as you and me. The author Thomas Ptacek alludes to this. He says, "I work mostly in Go. I'm confident the designers of the Go programming language didn't set out to produce the most LLM-legible language in the industry. They succeeded nonetheless. Go has just enough type safety, an extensive standard library, and a culture that prizes often repetitive idioms. LLMs kick ass generating it."

The things Thomas called out together make for code that is easier to generate and validate. You know, type safety catches errors early. An extensive standard library promotes consistency and security. Repetitive idioms create clarity and consistency.

This all makes it so that developers can iterate on AI-generated Go code with greater speed and confidence.
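
To make that concrete, here's a small illustrative sketch of my own (not from the talk) showing the repetitive, explicitly typed error-handling idiom Thomas is describing; the Config type, file name, and function are invented for the example.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Config is a tiny example type; the JSON tags give the compiler and a
// reviewer the same explicit picture of the data.
type Config struct {
	Addr string `json:"addr"`
}

// loadConfig follows the standard Go idiom: every fallible step returns an
// explicit error that the caller must handle. The repetition is exactly what
// makes generated code easy to scan, review, and type-check.
func loadConfig(path string) (*Config, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, fmt.Errorf("read config: %w", err)
	}
	var cfg Config
	if err := json.Unmarshal(data, &cfg); err != nil {
		return nil, fmt.Errorf("parse config: %w", err)
	}
	return &cfg, nil
}

func main() {
	cfg, err := loadConfig("config.json")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("listening on", cfg.Addr)
}
```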

These things along with other things like a fast dev loop, easy concurrency, secure defaults, and simple readable code all stem from our founding design principles in service of software engineering. Because AI benefits from the same things as you and me, these things make the AI better and make for better generated code that is easier to validate.
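
As one illustration of the "easy concurrency" point, here's a short sketch of my own (the function and URLs are invented for the example) showing the kind of fan-out that both humans and AI assistants tend to get right in Go, because the primitives are all in the standard library.

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
)

// checkAll fetches each URL concurrently and records its HTTP status code.
// Goroutines, a WaitGroup, and a mutex are standard-library primitives, so
// the same few lines appear in countless Go codebases.
func checkAll(urls []string) map[string]int {
	var (
		mu      sync.Mutex
		wg      sync.WaitGroup
		results = make(map[string]int)
	)
	for _, u := range urls {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			resp, err := http.Get(u)
			if err != nil {
				return // real code would record the error as well
			}
			defer resp.Body.Close()
			mu.Lock()
			results[u] = resp.StatusCode
			mu.Unlock()
		}(u)
	}
	wg.Wait()
	return results
}

func main() {
	for url, code := range checkAll([]string{"https://go.dev", "https://pkg.go.dev"}) {
		fmt.Println(url, code)
	}
}
```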

And then there's the ecosystem. You

know, as I said earlier, the ecosystem is what makes it all. Because Go is a language platform, we have two more opportunities to improve the quality and validation of AI-generated code.

First, Go's end-to-end approach. The fact that Go is a platform in service of software engineering gives the AI tools and resources throughout the software development life cycle.

This includes things like our static and dynamic checkers, our vulnerability management system, our build tools, our libraries, our language server, and all the other things that we provide out of

the box as part of the Go language platform. AI can use these tools just as humans do to generate more reliable, more efficient, and more secure code.

And humans can validate that code faster and with more confidence.
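
As a sketch of what "AI using the same tools as humans" can look like in practice, here's a small illustrative wrapper of my own that runs the platform's checks over a module after each edit. go build, go vet, go test, and govulncheck are real Go tools; the wrapper itself is hypothetical, not something the Go team ships.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

// run executes one check in the current module and streams its output.
func run(args []string) error {
	cmd := exec.Command(args[0], args[1:]...)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	return cmd.Run()
}

func main() {
	// The same validation loop a human reviewer relies on, runnable by an
	// agent after every change: compile, static analysis, tests, and a scan
	// for known vulnerabilities reachable from the code.
	checks := [][]string{
		{"go", "build", "./..."},
		{"go", "vet", "./..."},
		{"go", "test", "./..."},
		{"govulncheck", "./..."}, // requires golang.org/x/vuln/cmd/govulncheck
	}
	for _, check := range checks {
		fmt.Println("running:", check)
		if err := run(check); err != nil {
			fmt.Fprintf(os.Stderr, "check failed: %v\n", err)
			os.Exit(1)
		}
	}
	fmt.Println("all checks passed")
}
```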

Second, the Go platform can also derive all sorts of quality signals from the ecosystem. So this includes things like vulnerability data, usage and trust signals, and combinations that can help both AI and developers assess the quality of libraries so that they choose

dependencies that are better maintained, more reliable, more secure, and more trustworthy.

And that brings us to the title of the talk. What's next?

Go is famously productive and production ready. And when we think ahead, we usually consider how any proposed work that we're doing furthers one or the other of these objectives. Go is

productive because it's easy to learn and maintain and scale across teams. And Go is production ready because it's reliable, efficient, stable, and secure.

So let's see how we plan to hit each of those objectives next.

As AI redefines productivity, we see a few big initiatives.

First, there's what we call the stuck-in-the-past problem. Because we train LLMs on data that exists at a specific point in time, LLMs are more likely to generate code that existed before that

point in time. And this can leave out new products, new features, new APIs.

To help address this problem at both an individual-developer and ecosystem-wide level, we're building modernizers that identify old idioms, method signatures, APIs, and more, and replace them with their modern equivalents. This benefits

individual code bases by making it easy to rewrite old code to take advantage of new features. But more importantly, it pulls the whole ecosystem forward, which increases the rate at which new features can propagate in libraries and in the open-source code that trains LLMs. Alan from the Go tools team has a talk about this at 2:25 this afternoon.
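
For a sense of what a modernizer-style rewrite looks like, here's an illustrative before-and-after of my own; the actual set of fixes the Go tools apply is broader and evolving, so treat this pair as an example of the category rather than a specific modernizer.

```go
package main

import (
	"fmt"
	"slices"
)

// Before: an idiom common in the older code that much LLM training data
// comes from.
func containsOld(names []string, target string) bool {
	for i := 0; i < len(names); i++ {
		if names[i] == target {
			return true
		}
	}
	return false
}

// After: the modern equivalent a modernizer could rewrite it to, using
// slices.Contains from the standard library (Go 1.21+).
func containsNew(names []string, target string) bool {
	return slices.Contains(names, target)
}

func main() {
	names := []string{"alice", "bob"}
	fmt.Println(containsOld(names, "bob"), containsNew(names, "bob")) // true true
}
```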

Second, as we've established, AI can benefit from the same tools as you and me. In order to make it possible for AI to effectively use tools written in Go, including the entirety of the Go platform and its toolchain, we're building an official MCP SDK. You may

have seen the experimental MCP server in gopls, which allows gopls to expose some of its functionality to AI assistants as MCP tools. Katie Hockman, who used to be on the Go tools team, has a talk on MCP tomorrow at 11:35.
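
To make the MCP SDK mention concrete, here is a minimal sketch of a Go program exposing one tool over stdio. It is loosely modeled on the examples published with the github.com/modelcontextprotocol/go-sdk module; that SDK is still pre-1.0, so the exact type names and handler signature below are assumptions that may have drifted from the released API.

```go
// Hypothetical sketch: exposing one Go function as an MCP tool over stdio.
// Names and signatures are assumptions based on the SDK's examples and may
// differ in the version you use.
package main

import (
	"context"
	"log"

	"github.com/modelcontextprotocol/go-sdk/mcp"
)

// GreetArgs is the tool's input; the SDK derives a JSON schema from it.
type GreetArgs struct {
	Name string `json:"name" jsonschema:"name of the person to greet"`
}

// Greet is the tool implementation an MCP client (for example, an AI
// assistant) can call.
func Greet(ctx context.Context, req *mcp.CallToolRequest, args GreetArgs) (*mcp.CallToolResult, any, error) {
	return &mcp.CallToolResult{
		Content: []mcp.Content{&mcp.TextContent{Text: "Hello, " + args.Name}},
	}, nil, nil
}

func main() {
	server := mcp.NewServer(&mcp.Implementation{Name: "greeter", Version: "v0.1.0"}, nil)
	mcp.AddTool(server, &mcp.Tool{Name: "greet", Description: "Say hello"}, Greet)

	// Serve over stdin/stdout, the transport most local AI assistants use.
	if err := server.Run(context.Background(), &mcp.StdioTransport{}); err != nil {
		log.Fatal(err)
	}
}
```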

And third, we want to shift left as much as we can with our quality and security signals. As LLMs choose more of our dependencies, the sooner we can get things like vulnerability info in front of the LLM, the better.

Which leads to another observation about AI and software engineering. Go's

ecosystem has always been this flywheel that powers developer success. The more

capable and higher quality the ecosystem, the more developers leverage Go and invest in it themselves, which draws more developers and more developer

success and so on. Now with AI, this flywheel spins faster. So let's imagine that we get better AI assistants and agents because they leverage the best tools and resources from the Go

ecosystem. This leads to more successful developers because they're able to use these better AI assistants and agents and have an overall better developer experience.

More successful developers create more Go workloads, which draws more Go developers to work on those workloads.

More Go developers mean more developers who invest in the Go ecosystem, either through their jobs or through personal contributions, including open-source code, documentation, samples, and a

variety of ecosystem signals.

This improved ecosystem, in turn, is available to AI, for example through MCP, which gives us better AI assistants and agents, and so on.

The second big area we're thinking about is production readiness, where we want to be sure that Go developers can build reliable, secure, production-ready AI applications and services the same way we've enabled non-AI applications and services in the past.

Before anything else, the biggest takeaway here is that it turns out most AI applications are cloud applications that make high performance API calls.

And this is exactly what Go has always been extremely good at. Again, our focus on software engineering, which includes a focus on enabling modern software engineering architectures and

environments, pays dividends. But there

are AI-specific things we need to complete the circuit, like the SDKs that enable AI applications, including MCP, but also things like the Agent Development Kit, or ADK, and the agent-to-agent protocol, or A2A.

There are also some things that are not AI-specific but which benefit AI infrastructure. SIMD, which we're working on now and targeting for Go 1.26 in February, benefits all kinds of high-performance computing, and it also happens to enable writing high-performance vector databases in pure Go.

And more generally, we want to keep doing the same things we've always done: make Go faster and better as hardware evolves. We're evolving the runtime with it, so this includes initiatives like the Green Tea garbage collector, introduced behind an experimental flag in Go 1.25, which scales better across NUMA architectures and the many-CPU systems that are becoming more prevalent. Michael from the Go core team has a talk on Green Tea tomorrow at 3:45.

So that's how we're thinking of the next frontier: the same principles, adapted to evolving engineering needs. And we're pretty confident in this approach.

Serving software engineering has always been Go's primary objective, and it's worked well. But we also look to our users, and so far, so good. We

consistently see really high customer satisfaction in our surveys, levels practically unheard of in the industry.

We also see remarkable growth in adoption. In the last Stack Overflow survey, 17.5% of professional developers said they used Go, up from 14% last year and 9% 5 years ago.

And this proportional growth is happening at a time when the sheer number of developers is also rapidly increasing. But while the developer population at large grows, Go is one of only two languages that grows faster.

Meaning we're targeting the right use cases, providing the right value, and growing for reasons that go beyond that larger growth in software engineering.

Finally, we're seeing Go used all over the internet. Late last year, Cloudflare reported that Go is now the most popular choice for API clients. This is a huge jump from the previous year where Node was number one and had nearly double the

number of API calls as Go. A really

remarkable increase for Go and an important milestone in the kind of services for which we know Go is best-in-class.

Which leads me to you, the Go community.

You know, as I said earlier, Go's ecosystem is what makes Go what it is.

The language design, the platform, the focus on software engineering, and the problems we solve. They're all

built on top of Go's ecosystem and the community behind it.

There are hundreds of thousands of Gophers around the world who meet up.

They come to conferences like this because they love Go. And there are thousands of Gophers who contribute to open source, write tutorials and samples, or contribute to the Go project

itself. These contributions are what make Go succeed now and in the future.

More open-source code helps train models, making Go work better with AI in a world where more developers choose languages that work well with AI.

More quality libraries help humans and AI choose better dependencies.

Samples, documentation, and how-to guides help Go developers productively experiment and build AI applications.

Contributions to the Go project keep Go productive and performant as hardware evolves.

So, you know, I started this talk by joking about my complicated relationship with New York City. I said the reason I stay, the reason it feels like home is because of the people, the unique

combination of kindness, authenticity, and pragmatic energy that makes the city work. Looking out at all of you, I really feel the exact same way.

Kindness, authenticity, pragmatic energy. The future of software is complex, and it won't always be easy. But

building it with this community is what makes it feel like home. The reality is that the future of Go in the AI era will be built not just by the Go team, but by all of us.

Let's have a fantastic conference and thank you.

[applause]
