Snowflake Summit 2025 Platform Keynote
By Snowflake Inc.
Summary
## Key takeaways

- **AI Enables Unstructured Data Processing**: AI enables the processing of unstructured data like documents, files, and images, dramatically expanding platform capabilities to deliver deeper insights in a fraction of the time. [02:15], [02:22]
- **Natural Language Data Interaction**: AI allows anyone across your organization to interact with data in natural language, simplifying data utilization for business at scale. [02:33], [02:45]
- **Adaptive Compute for Resource Management**: Adaptive compute allows customers to provide policies and intent, enabling Snowflake to automatically manage compute resources, types, sizes, and scaling for better performance and utilization at scale. [22:03], [22:17]
- **Simplified Ingest Pricing Model**: The new ingest pricing model charges based on the volume of data ingested, simplifying costs and providing roughly 50% better economics for bringing data into Snowflake, whether Snowflake or Iceberg data. [24:00], [24:25]
- **Cortex AISQL for Multimodal Analytics**: Cortex AISQL enables natural language in SQL queries for multimodal data, including entity extraction, aggregation, filtering, and transcription directly in SQL, handling text, images, and audio with AI functions. [01:03:54], [01:04:13]
- **Snowflake Intelligence Agentic Platform**: Snowflake Intelligence provides a first-party agentic experience powered by state-of-the-art models, enabling business users to access verified insights from structured and unstructured data across sources without coding, with citations for trust. [01:23:23], [01:24:03]
Topics Covered
- AI unlocks unstructured data processing?
- Adaptive compute revolutionizes resource management?
- Cortex AI SQL transforms analytics with multimodality?
- Semantic views make all data AI-ready instantly?
- Snowflake Intelligence accelerates agentic growth?
Full Transcript
[Applause] [Music]
[Music] Please welcome Snowflake co-founder, president of products, and today's conductor, Benoit Dageville. [Applause] [Music]
Hello everyone. Hello and welcome to day two of Summit. I am Benoit, co-founder of Snowflake, and really so happy to be here with all of you.
As you heard yesterday from our CEO Sridhar, AI is creating new ways of working, from streamlining existing workflows to developing new applications.
And just last night, Thierry and I were at the Data Drivers Awards to celebrate our customers who are really pushing their business forward with data and AI.
And the winners are here today.
Congratulations to you all.
So the rate of AI innovation is really exciting.
It enables really two revolutionary shifts in the data world.
First, AI enables the processing of unstructured data like documents, files, and images.
And that dramatically expands the platform's capabilities to deliver deeper insights in a fraction of the time.
Second, AI allows anyone across your organization to interact with data in natural language.
But how do you take advantage of all this AI innovation and put AI to work for your business at scale? This can be full of challenges, and it often raises really important questions, like: what are the opportunity costs of investing in AI?
Do we have the right data governance in place to see AI initiatives reach production? Or how will we integrate AI without complicating our infrastructure?
Here at Snowflake, we have been working on delivering solutions for all these challenges.
We have been on a journey to innovate with you and for you since day one.
Simplicity has been a core product value for Snowflake.
And when Thierry and I founded Snowflake, analyzing data at scale was so complex that only a few companies in the world could do it.
Instead, we wanted to make this process easy so that any organization could leverage all of its data and more.
Simplicity represents just one of our values.
Equally important is our commitment to make Snowflake both connected and governed.
And now we want to do the same with AI.
Like data, AI should be easy, connected and trusted.
Let's start with easy.
We provide you with a fully managed platform that removes all friction, allowing you to innovate way, way faster.
And the next value is connected.
With Snowflake, the data and business are no longer siloed.
You can work together, share data, and run AI powered applications openly and easily across your entire organization.
And finally, Snowflake is trusted.
The Snowflake platform lets you run LLMs and applications directly within our governance perimeter, and this means that you can safely experiment with AI and stay ahead of the latest technology trends.
Our product values help define Snowflake's unique market differentiation, and through these values we deliver a unified platform that creates a connected ecosystem where organizations can build, use, and share data, applications, and of course AI. The possibilities are limitless.
I really look forward to seeing how all of you can leverage the Snowflake AI data cloud.
And let me now introduce our EVP of product, Christian Kleinerman, who will share the latest platform innovation so you can do more with your data.
Thank you all and enjoy the rest of summit.
[Music] Good morning, Snowflake Summit.
We need more energy, you know.
I like it rowdy. I'll take it crazy if you want.
Hello, Snowflake Summit.
Are we excited?
Huh? That's much much better. I want to give a shout out to the orchestra.
Isn't it beautiful?
That first piece that you just heard was composed by Adam, who is running the music show here.
He did it inspired by Summit and our topics here, with the help of AI. At some point we'll share more about that story, but I'm super excited. Once I thought we couldn't make this venue and this setting any better.
It has gotten better with the music.
So, I hope we all get to enjoy it. Summit is my favorite.
Yeah, summit is my favorite week of the year.
Two reasons. One, we get to share a lot of the innovation that we create and build for all of you.
But probably the most important thing is because we get to hang out. We get to see each other. We get to learn from each other.
We get to compare practices.
So we do have some announcements for you.
We have some demos.
No demos. Okay.
Cancel the demos. We've heard from you.
You also like hearing from what customers are doing with Snowflake. You will have a few of our customers on stage.
In some instances, they were so excited about our announcements that they will be making the announcements.
They're more excited, or they convey it better than I do.
I want to acknowledge we have thousands of people right now watching online.
Can we get the lights up, and everyone wave to the people watching online? And we have every Snowflake office out there.
They're watching us. So, hello.
We miss you. And with that, let's get going.
I want to pick up where Benoit left off, which is we're building the AI Data Cloud.
Part of it is a technology platform.
Our goal is to help you simplify what you're trying to get done through the entire life cycle of data, from the moment data is created all the way until it's archived, and everything in between. But the other part of the AI Data Cloud is all of you in the room: this ecosystem, this network. All of you here have the opportunity to collaborate more easily, more seamlessly with one another.
And that is what we are building together.
In structuring our conversation today, we could have done what we did in prior years and organized it functionally.
Here's AI and here's the foundation and all of this.
But we decided to pivot.
We're going to organize our talk today based on your needs. The world is not linear.
You never sit down and say, I'm going to do some AI today.
There are demands of all sorts on your time and we want to look at it from your perspective.
So I have a series of I want or I need statements that hopefully resonate with you and hopefully put into context some of the innovations that we're doing.
With that, let's get started with the first of those I want statements.
And I'm sure every one of you here wants a data architecture that is future-proof.
What does that mean?
That when the business comes and says, "I have an additional need, I have a new project," it doesn't turn into "this is going to be tricky."
"That data is in a different system," or "I want to interoperate with something else and we don't have a path to do that."
And Snowflake has been designed from the beginning and we continue to innovate with the mindset of a single unified platform.
We want to give you choice and flexibility in terms of data and compute.
And we think of AI as a foundational element of the platform that we provide. Both for you to build on top of AI and leverage AI, but also for us to enhance every aspect of the data life cycle with AI. And of course, we do so cross region and crosscloud.
We want to give you flexibility on the topology or the architecture that you implement.
If you started with a data warehouse and you say, "I want to bring transformation and upstream data capture into Snowflake," you can do that.
That's a lakehouse. But if you're a lakehouse, at some point you may say, "I need to empower the business units," or maybe you have a lot of data residency and locality requirements.
No problem. You can also do a data mesh. The goal for us is to give you flexibility on all these three dimensions.
Looking at data in more detail: from the very beginning of Snowflake, we had structured and semi-structured data, and we've since added unstructured data.
We did it in 2021 to simplify the management of all data with a consistent and comprehensive approach.
And we are all living in an era of AI that unlocks unstructured data in an unprecedented way.
It makes it easier to get value out of unstructured data.
We also want to give you choice in compute but not overwhelm you with choice.
We don't want you to have to know about these 500 instance types and which one to pick and which one's the best.
So we'll give you flexibility but we will abstract a lot of the complexity on your behalf. We have warehouses.
We have Snowpark container services.
We of course have SQL runtime.
And through Snowpark, we host the Java and the Python runtimes.
And of course, CPU and GPUs are hardware choices. From an AI perspective, we want you to be able to leverage the state-of-the-art on what AI is bringing or making available to all of you.
Key is we bring the AI to run within the security boundary of Snowflake.
We are committed to bringing you the latest and greatest models into Snowflake.
We are now making available GPT-4.1 and o4-mini, the OpenAI models, in Snowflake.
We continue, where there is an opportunity for us to deliver value, to bring our own models, like what we did with Arctic Extract, which powers Document AI and other parts of our platform. As for Anthropic's models, when Opus 4 and Sonnet 4 were announced, they were available in Snowflake that same day.
Same thing with Meta and the Llama models.
Anyone know what the next icon is?
DeepSeek. Okay, someone knows it.
DeepSeek. Some of you have said, "I really want it." Some people have said, "Keep it as far away as you can."
We want to give you choice. And the last one: we also bring in the models from Mistral.
All in all, we want to give you flexibility, but also not overwhelm you with the burden of making lots of these choices. When we think of a future-proof data architecture, we also want to make sure that you can bring business logic and computation to run closer to the data without having to make copies of data, because making copies of data is difficult.
And last but not least, I know nobody here ever wants to be locked in to a technology platform, and we are committed to making sure that does not happen.
We're committed to making Snowflake open and interoperable through the adoption and support of Apache Iceberg, and through the original incubation and now support and evolution of Apache Polaris, the catalog. And we continue to innovate, both helping advance the open-source projects and advancing our implementation of those data types.
I think many of you have looked at Iceberg support in Snowflake and said, "But Snowflake had VARIANT a long time ago; I need VARIANT." We're working with the Iceberg community and bringing it to Iceberg, and the same thing for geospatial data types. "I want to be able to write to tables managed by other catalogs," which from Snowflake's perspective are unmanaged tables: that is also in preview. For those using Open Catalog, which is our hosting of Polaris: "I want PrivateLink, I want all the enterprise capabilities." That is now going into general availability. And we continue to bring many of the capabilities that we built for Snowflake over the years to Iceberg, to open data, so that you have choice.
You can interoperate but you get all the things that you have come to appreciate and love about Snowflake.
So hopefully it's clear that at Snowflake we want to give you the ability to choose a data architecture that grows and evolves with your business needs, not one that holds you back. All of you should be looking at the rest of your organization and saying: our choice of Snowflake is a strategic asset.
We are turbocharged. We are more empowered and more enabled because we have chosen snowflake.
That makes us super, super happy.
Now let's go to I want statement number two which is I want to get better economics from my data platform.
Does anyone not want better economics?
It's better than saying "quiet, everyone."
"Quiet, everyone" wouldn't have had that effect.
We understand it.
We are a part of it: we help you get your work done with the best economics in place.
And it starts with helping all of you understand and manage your resource consumption and your cost utilization with Snowflake.
For years we've been adding resource monitors and budgets and alerts, but here at Summit, we're announcing additional new capabilities.
A simple one: organizational usage views.
You can see all of your spend and consumption with Snowflake, across regions and across clouds, in a single pane of glass.
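For a sense of the query pattern this enables, here is a minimal sketch against Snowflake's long-standing ORGANIZATION_USAGE schema; the new consolidated views announced here may expose more than this:

```sql
-- Sketch: daily spend per account across the organization,
-- using the existing ORGANIZATION_USAGE schema.
SELECT account_name,
       usage_date,
       SUM(usage_in_currency) AS spend
FROM snowflake.organization_usage.usage_in_currency_daily
WHERE usage_date >= DATEADD(month, -1, CURRENT_DATE())
GROUP BY account_name, usage_date
ORDER BY usage_date, spend DESC;
```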
We're also introducing here at the conference spend anomalies, which is what you see here in the screenshot.
We compute a range of expected consumption, and we let you view it, or we notify you, if the spend is out of whack, whether it's higher or lower.
At least you get to know.
Nobody wants to wait until the end of the month to hear that. Also, we want to help you classify consumption.
We're introducing query tags and object tags so you can say: this workload is consuming this many resources.
That enables you to do showback and chargeback.
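Session-level query tags have existed in Snowflake for some time; a minimal sketch of tag-based showback along those lines (the tag value and time window are invented for illustration):

```sql
-- Tag a session so its queries are attributable to a workload.
ALTER SESSION SET QUERY_TAG = 'team:marketing;job:nightly_etl';

-- Later, roll up elapsed time by tag for showback/chargeback.
SELECT query_tag,
       COUNT(*)                         AS query_count,
       SUM(total_elapsed_time) / 1000.0 AS total_seconds
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
GROUP BY query_tag
ORDER BY total_seconds DESC;
```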
We're committed to helping you manage your spend.
Now, I want you to hear from an amazing Snowflake customer. We have Julia Morrison.
She's from Marriott International.
Please join me in welcoming Julia. [Applause] [Music]
Thank you, Christian.
Thank you, Julia. Thank you for being here.
I trust your accommodations here at Snowflake Summit are meeting your expectations.
Absolutely.
Obviously, working for the leading hospitality company.
I love to travel, and I love staying in our amazing Marriott properties, like the beautiful Marriott Marquis next door. Staying there feels like checking into a tech conference and a luxury hotel at the same time.
I wasn't sure whether I should unpack or pitch a startup.
Seriously though, it's a pleasure to stay there, as I get to see how all our data and AI initiatives come together to create great experiences for our customers.
Maybe tell us a little bit more about your team and the mission that you have as a leader in hospitality.
It's our vision to enhance the lives of our customers by creating and enabling unsurpassed experiences for business and leisure travelers.
Now, doing that at scale is a unique challenge for Marriott, as we have over 9,500 hotels, more than 30 brands, and we are in 144 countries and territories and counting.
We are currently in a multi-year digital and technological transformation to help us continue to lead the digital evolution and provide unique customized experiences for our guests and data is a large part of that transformation.
So what does that transformation look like? Well, we are democratizing data and analytics across the enterprise by building a series of data products for a wide range of consumers, from analysts and marketers to franchisees, owners, and automated systems. For example, when you check into a Marriott hotel, we deliver key data points to enhance your experience.
We operate on billions of data records flowing in real time through our platforms. We build and run data products on top of them, then analytical models, and then visualizations.
All of that to connect then to many internal and external consumers.
Our data operations run at massive scale, all focused on delivering enterprise value from data.
And Snowflake makes all of this easy.
Well, yes and no.
What do you mean, no?
Well, managing the scale of data is not a problem with Snowflake.
However, where it gets complicated is managing all the compute resources.
We have to rightsize our compute nodes; we have to, you know, decrease or increase our warehouses and really develop our own infrastructure and rules in order to manage our compute effectively. Yeah.
So, in reality, Snowflake did pioneer the serverless model a long time ago.
I wanted that screen to say 2015 because it's 10 years. Benoit shows up on Sunday and says, "No, it's 2012, since the beginning of Snowflake." So, we changed the slide.
So for 13 years we've been serverless, where you use t-shirt sizes to choose compute size. But you think there's some room for improvement, right?
Yes. One of the reasons I'm here is that our teams have been working together to address this exact problem.
And now I'm excited to announce that Snowflake is introducing adaptive compute. [Applause] [Music]
With adaptive compute, we're able to easily manage compute resources at scale.
Christian, please tell us more about it. Yeah, we're very excited about this innovation.
It's truly next-generation compute, where you as customers give us policies and intent, and Snowflake figures out what resources you need, what types, what sizes, what scaling properties. And at the end of the day it's easier to manage.
You'll get better performance and, very important, you'll get better utilization.
So we're extremely excited.
This is in private preview. It's a complex system, so it's going to take some time to make it out to all of you, but we're incredibly excited.
Julia, thank you so much for being here.
Thank you. Thank you.
Give it up for Julia.
I was looking up when we introduced Snowpipe as the way to ingest data, and it's somewhere between 2017 and 2018, I think, for preview and general availability. When we did it, the way we charged for Snowpipe was a per-file cost plus a compute cost, and, okay, I don't know how we got there. That story I can tell separately.
But in reality, it lacks that element of simplicity that Snowflake is known for, which is: how do you make it predictable? If I have two files with the same data, I get charged differently than if it's only one file with the same data. So we said, leading up to Summit, let's cross out that model.
We want to simplify, and today we're sharing with all of you for the first time that we will be introducing a much simpler ingest pricing model. [Applause] [Music]
What does that mean? We want to make it as correlated to the value that you get as possible, which is to charge based on the volume of data ingested.
We're also using the opportunity to say, you know what, in some instances, we've heard from many of you that the cost of ingest is keeping you from leveraging snowflake for ingestion of data.
Whether it's snowflake data or iceberg data, it applies to both. And the way we've calibrated the pricing, you all should be getting roughly 50% better economics.
Mileage may vary. Some of you are going to get way better numbers. Some of you are going to be around this 50%.
But we're incredibly excited to simplify the model and also improve the economics for all of you, to bring more data to Snowflake and make it accessible.
So I said we want to help all of you manage your spend and get better economics.
We do this through governance controls and monitoring.
We do this through a new compute model.
Adaptive compute is going to be revolutionary.
At some point, you're going to hear some of our competitors saying, "Oh, we invented this adaptive thing."
And you're going to say, "Yeah, it was Snowflake Summit 2025.
I heard about it like five years ago."
" And of course, we want to simplify the pricing of everything of what we do and we're as much as possible give you great economics.
But the other thing that we hear from all of you is I want to govern all my data.
Governance is probably one of the areas where we collectively spend a ton of time.
And the answer, the collection of technologies that helps you manage and govern your data, is the Snowflake Horizon Catalog.
It has capabilities to both help you understand and know your data as well as have policies and govern your data.
It is the most capable enterprise platform for Snowflake data, and it is fully interoperable with Iceberg-based REST catalogs.
But we keep innovating.
We are not standing still and we work with many of you on a regular basis to advance the state of Horizon catalog.
Let's look at some of the enhancements.
There's an hour-plus session here at Summit on what is new in Horizon.
But let me highlight a few things.
Sensitive data. We help you with automatic classification and automatic tagging.
We're introducing sensitive data insights.
We're doing automatic tag propagation.
So if someone copies data that you've said is sensitive, the tag will follow, and we know that data continues to be sensitive.
Lineage is one of the fastest-growing, most adopted capabilities of Snowflake Horizon.
And on the topic of data quality, we're incredibly excited to introduce expectations, where you can say, "Here's what we expect in terms of freshness or volume or other characteristics of the data," and get anomalies or alerts when that is not true.
That is in private preview.
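The private-preview expectations syntax isn't shown in the keynote; as context, here is a sketch of the existing data metric functions that freshness and volume checks build on (table and column names are invented):

```sql
-- Schedule metric evaluation on a table, then attach built-in metrics.
ALTER TABLE orders SET DATA_METRIC_SCHEDULE = '60 MINUTE';
ALTER TABLE orders
  ADD DATA METRIC FUNCTION snowflake.core.freshness ON (updated_at);
ALTER TABLE orders
  ADD DATA METRIC FUNCTION snowflake.core.row_count ON ();

-- Review measurements, e.g., to alert when freshness degrades.
SELECT metric_name, value, measurement_time
FROM snowflake.local.data_quality_monitoring_results
WHERE table_name = 'ORDERS'
ORDER BY measurement_time DESC;
```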
Similarly, we're introducing AI model RBAC, where you can choose, based on role-based access controls, which models are allowed to be used, and by whom, in your organization.
Security is another important aspect of Horizon Catalog and we will not stop until we together make sure all of our Snowflake environments are secure. Show of hands.
Does everyone here know that we're deprecating password-only sign-ins?
Yeah.
If you have not heard this, get going.
Get going. We will force you to be secure.
We will encourage you by every means we have to make sure that your Snowflake environment is secure.
We've introduced also programmatic access tokens, pass keys, authenticator apps.
We're helping with the monitoring.
If we see credentials from your account that are out in the dark web, we'll let you know and potentially take action and disable accounts.
If we see access to your account from well-known bad IPs, we will block that. We are here to serve you.
We are here to help you be secure.
Big priority for us.
And the other pillar is compliance.
Sridhar mentioned yesterday the IL5 certification from the Department of Defense.
And many of you may say, "I am not the Department of Defense," but when you talk to your security teams, say: if it's good enough for the three-letter agencies, it's probably good enough for us. That's how we keep advancing the state of the art.
We have dozens of new certifications on a regular basis and revalidations because we want to make sure that we comply with your requirements but also we help you comply.
Today, we're also excited to introduce the Horizon Copilot.
If you've seen a campaign out there that says you don't have to give a click, that's what this refers to.
You can ask questions on which objects are not protected or are there any tables or columns that are missing tags?
And this is now in private preview.
All in all, Snowflake Horizon is very capable. But I mentioned something.
I said, "Oh, it's the most capable catalog for Snowflake data."
But also many of you said, "I have some data that is not in Snowflake, and I would love to manage it all in a single pane of glass." Today, we're introducing Horizon's capability to discover, expose, and manage assets outside of Snowflake.
Yeah.
Yeah. You see on the screen the initial set of sources that we will support and we will continue adding to that list.
But it's not just data sources.
We're also adding support for dashboards and reports from Power BI and from Tableau.
Someone is woo yay.
And also, because lineage matters to all of us when we're managing data, we enable importing lineage.
Bring your own lineage, which also helps us bring lineage from dbt or from Airflow.
Cool.
Another aspect of what we've been doing that creates a lot of excitement with many of you is our internal marketplace where you can curate and publish data products for the rest of your organization to discover and leverage.
Whether it's a notebook, a machine learning model, just a data set, you can do either one of that.
We're introducing request-approval flows for when someone doesn't have access to a data product and says, "I found it. Can I contact the owner and get permission?"
We're simplifying data product management inside of Snowflake.
And we're launching at the conference a beautiful refresh of the UI for the internal marketplace.
This is now in public preview.
[Applause] So when we say we want to help you with governance, we mean it. Across every aspect of governance, from security, compliance, and discovery to external metadata, the copilot, and the internal marketplace, we will continue to invest and innovate to simplify governance for all of you.
Now let's talk about integration of data.
The reality is we all still have too many silos out there, and when data is in a silo, it is difficult to get better decisions and better insights.
We are on a mission to eliminate silos as well as eliminate copies of data, and from that perspective, today we are incredibly excited to announce Snowflake OpenFlow.
[Applause] [Music] It's a managed service that helps you bring in and process data. It supports a variety of sources and a variety of destinations.
And you can see the list here.
I have a slide with both structured data sources as well as unstructured data sources.
You have data in SharePoint, Slack, Google Drive. You can make it all accessible and available to Snowflake.
I want to call out a partnership we're doing with Oracle, where we will be integrating the XStream API to make it seamless to have near-real-time CDC from Oracle into Snowflake.
And because OpenFlow is built on Apache NiFi, we have an even larger set of connectors and processors that make it easier for you to bring data and make it available to Snowflake.
Last, we are also offering deployment choice.
There's a version of Snowflake OpenFlow that can run on Snowflake-managed resources in Snowpark Container Services, and also a version that runs in customer-managed VPCs, where you can bring your own cloud. And I want to give a shout out to the many of our partners in the room who are helping with the launch and implementation of OpenFlow, and who will help all of you integrate a variety of data sources.
And let me invite another of our amazing customers onto the stage: Brian Dummann, chief data and analytics officer of AstraZeneca.
Welcome Brian.
[Music] Thank you for being here, Brian.
This is not your first time on the Snowflake stage, right? Oh, the last time was about five years ago. A little smaller audience, a little different venue, a lot more energy here. I was with McKesson at the time, on the distribution side of the pharmaceutical industry.
But while I was there, I lost my dad to cancer, and I remember feeling powerless and feeling like there was more that I could do. Three and a half years ago, I joined AstraZeneca to help unleash the power of data and AI as we discover and develop new medicines in areas like oncology, respiratory, cardiovascular, and rare disease.
About three years ago, we started our big journey to improve the quality and speed of data by selecting Snowflake as a platform. We now have about 120 data products, 150 actually, running on the Snowflake platform. We have found that we can now deliver data products at about twice the speed and 30% lower cost.
And so thank you for the partnership in unleashing the power of data. And you're using OpenFlow.
So when I joined, there were a dozen or more different technologies we were using to integrate data.
That legacy environment is complicated.
You end up with multiple vendor contracts.
Some of those relationships you're not really managing.
Some of them have tried to double our costs over the last couple of years because we haven't managed those relationships.
We end up with different skill sets to maintain, and we end up with all of these dependencies slowing down our ability to tap into data. And for pharma, and for the patients that need us, every minute matters. Yeah.
So speed matters, and we've been working with OpenFlow now for a couple of months. I have to say, having an integrated platform inside of Snowflake, with the same management capabilities and the same console, is really the key to unlocking the efficiencies that are going to bring speed and cost improvement to our data integration space.
And so we're excited and and encouraged by the innovation and the partners that are going to help build with us on that platform. Awesome.
How is AI playing into all of this?
How does it change the the problem or the solution?
So, the pharma industry is going to be transformed through the power of data and AI. Uh for us, it's about making sure our data is AI ready.
I think that starts with getting our structured data into a common place.
We're already well on our journey there.
But one of the things we learned with OpenFlow is that it's really good at managing unstructured data movement.
And so now we're exploring how we can bring in our unstructured data with the same methods, and build data products with the same methods, that we use to manage our structured data.
And the other piece of AI that's important, it's really costly and slow to move data out of an environment, do AI on it, and push it back.
And so we're excited about Cortex AI and the partnership ecosystem around it where we can bring AI to the data.
And so making data AI ready, and making sure the platform can do AI at the layer of the data, are really the keys to unlocking value.
And you manage one of the most sensitive types of data, healthcare data.
How does that fit into what you do?
So many of us work in regulated industries.
We all have sensitive data that we've got to protect and maintain.
And I think you've built good tools and all of us have developed really good capabilities to manage and secure that data.
But it's really about how do you play offense?
How do you lean in and take advantage of the data and unlock that sensitive data for use?
So, we're using capabilities around synthetic data, trying to be bold to find ways to better partner with others to get that data accessible. But I also want to give a big shout out to Snowflake and some of the work that you've invited me to participate in with the End Data Disparity efforts.
I think that, you know, as we look at trying to tap into data and use data for good, I'm encouraged to see the hackathon yesterday around data.
And I'm encouraged that you have a track around data for good because it's really about how can we tap into the power of data in our organizations, not just to help our companies do more, but to really do more for the world and ultimately change the world.
Brian, we love the cause that you're on.
We love to be able to help and we love having the partnership with you.
Thank you for being here today.
Thank you. Thank you, Brian.
So with OpenFlow, you can ingest data and make data available to Snowflake.
Today we're also introducing a full new rev of Snowpipe Streaming: a new SDK, access from different clients, stateless transformations, pre-clustering at ingest, and very high throughput, up to 10 gigabytes per second.
Data is queryable within five to 10 seconds from the moment it is ingested.
And this is now in public preview.
So you ingest data and you have it in Snowflake, but we also want to help you transform it.
And we're incredibly excited to announce today dbt Projects in Snowflake.
Cool. What is this? It's an authoring environment.
You can take pipelines from idea all the way to production, and we'll help you build, test, and deploy dbt pipelines inside of Snowflake.
Of course, there's observability.
I saw feedback from our early preview customers.
One of them had a bunch of words that I cannot repeat. And for the context, it was a positive context.
And this is now in public preview.
You will all be able to use it pretty soon.
Yeah. Yeah.
But if you caught the news last week, dbt Labs announced the next generation of dbt, the new dbt Fusion engine.
And a shout out to Tristan and the dbt Labs team.
We've established a partnership, an agreement whereby the next version of dbt will also come to dbt Projects in Snowflake.
And today we're also introducing workspaces in our user interface, Snowsight.
What are workspaces?
Modern development environment.
All your editing and data curation happens in this new workspace.
It supports worksheets, Streamlit, and notebooks.
It's file-based.
It has folders and source code control integration, and I'll tell you, for worksheets it's incredibly fast.
It's in public preview now. You all can leverage it. And how about we see it in action?
I've been speaking and speaking and speaking.
Anyone want to see a demo?
Yeah. So, let's invite onto the stage Amanda Kelly and Dash, who are going to show us a demo. Welcome, Amanda and Dash.
Oh, Dash is here.
[Music] Thank you, Christian.
Hello everyone.
I'm Amanda Kelly, director of product here at Snowflake.
And with me, we have the amazing, the wonderful Dash Desai.
Let's give it up for him, our principal developer advocate, who will be taking us through our demos today. All right.
To illustrate some of these real problems in data engineering, let's take a look at how we can simplify data engineering for a global concert and festival company.
Now, they have a whole lot of diverse data types much like you.
They've got structured data, like sensitive customer PII, in Postgres inside a VPC, and they've got unstructured data, like phone calls, PDFs, and images, inside of SharePoint.
Now, this presents a few challenges already.
PII in a VPC demands a highly controlled integration pipeline.
That often means a tough trade-off between control and simplicity.
The multimodal data adds another layer of complexity.
Each one comes with its own structure, its own properties, its own patterns. And what we really want is one easy and flexible pipeline that's going to be able to handle it all and make it AI ready.
So, in the next couple of minutes, we're going to walk through how we can leverage Snowflake for data engineering in order to build and perform ETL, deploy the pipelines, and activate the data.
So, Dash, let's take it over to Snowflake OpenFlow and see how they're going to make it really easy for us.
So, we're going to start with OpenFlow from Snowsight.
Here I can see the connector menu already loaded with featured connectors like Postgres and SharePoint.
And there are a lot more connectors available out of the box, right?
And it's not just about bringing data into Snowflake. We can also set targets to other destinations and persist open data formats such as Apache Iceberg to fully facilitate interoperability.
And if you need a custom connector, it's easy.
You can build your own with Snowflake and Apache NiFi processors.
All right.
So, Dash has already set up some of our connections for us and we can see them here in OpenFlow. Now, notice here that there are three runtimes. Two of them are running in our own VPC in AWS.
Now with this deployment option, we can run integration closer to the source without opening up holes in our firewall and we still get the full power of OpenFlow as a managed service. For our other connections, we can run them even more easily.
We can have them running directly in Snowflake with Snowpark Container Services.
All of these integration services are managed by OpenFlow but can be executed in different deployments, which gives you simplicity and control with no trade-offs.
All right, let's dive into this Postgres connector.
All right, what we can see here is the initial snapshot.
We can see the incremental loads and we can see change data is being captured continuously.
OpenFlow includes built-in data provenance and observability.
So every step is tracked and auditable.
Super easy for you. Now we can see that we have the CDC pipeline set to populate two raw tables, events and ticket sales.
We can see that the CDC pipeline is already running and it's populating that events table.
Let's take a look at it and prove it to you. All right, we're going to run it. And there we go.
It's already populating that table.
Let's take a look at the tickets table.
Prove us right here.
All right. And there you go.
It was easy to go straight from pipeline to data.
It's worth an applause, I think, for Dash right there. All right.
All right. We're not stopping there, though.
Okay, now that we have the data loaded, we can transform it using powerful capabilities like Snowpark and dynamic tables.
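For context, a minimal dynamic table sketch in the spirit of this demo (the source table, columns, and warehouse name are invented):

```sql
-- Keep a ticket-sales aggregate fresh within a one-minute lag;
-- Snowflake refreshes it incrementally as the raw table changes.
CREATE OR REPLACE DYNAMIC TABLE daily_ticket_sales
  TARGET_LAG = '1 minute'
  WAREHOUSE  = transform_wh
AS
SELECT event_id,
       DATE_TRUNC('day', sold_at) AS sale_day,
       COUNT(*)                   AS tickets_sold,
       SUM(price)                 AS revenue
FROM raw.ticket_sales
GROUP BY event_id, DATE_TRUNC('day', sold_at);
```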
However, building those data models is just the tip of the iceberg, no pun intended. Deploying the data models can be surprisingly challenging, with issues like dependency management and data quality. These complexities, as you know, lead to sprawling systems and painful debugging processes.
That's why DBT is such a popular open source project.
And as Christian said, now you can run dbt projects natively in Snowflake.
No extra setup required.
Yes, we can get excited about it again.
It's okay. It's awesome.
All right, so let's take a look at where we have it right here. Now, as Christian mentioned, we have workspaces now.
And here we are in a workspace.
It's that lightweight development environment for authoring, organizing, and managing all of your code artifacts in Snowflake.
And you already saw it when we did those side-by-side CDC tables and saw how quickly they ran and executed.
That was in a workspace. I saw a workspace fan there. I like that.
Yes. All right. Here we can create or import DBT projects and then we're going to be able to easily edit, test, and deploy them.
So Dash has already imported and compiled a DBT project, but starting a new one would be just as easy.
We can see the project YAML here, the dbt project file, the data sources, including those two raw tables from the CDC pipeline, the data models, and the logs. It's all there. And in dbt projects, we can also make edits.
Yeah, we can compare those changes side by side. And when we're ready, we can push it directly to Git.
All right, let's hit run right there to get it started. Do it live.
And while that's running, you might be wondering where is the DBT project running.
Well, typically we'd assign a warehouse, manage it ourselves, but that can get tricky sometimes.
And that's where adaptive warehouse comes in.
With adaptive warehouses, Snowflake automatically selects and scales right-sized compute resources with intelligent routing.
This eliminates complexity and maximizes efficiency for me and the team. My data engineers can focus on the data and pipelines and leave Snowflake to optimize the infrastructure intelligently.
Zero ops data engineering.
All right, looks like that run completed, and we can see the new files and views that were created directly, right?
We can see that in the interactive DAG below.
It's good. This was it.
This was the big thing.
All right. With dbt projects, Snowflake brings the best-of-breed open source tools that you already love directly to where your data is, all in one platform.
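For a sense of what runs unchanged in dbt projects on Snowflake, here is a small hypothetical dbt model (model and source names are invented; this is standard dbt Jinja, not Snowflake-specific syntax):

```sql
-- models/marts/fct_ticket_sales.sql (hypothetical model)
{{ config(materialized='incremental', unique_key='sale_id') }}

select
    s.sale_id,
    s.event_id,
    e.event_name,
    s.sold_at,
    s.price
from {{ ref('stg_ticket_sales') }} s
join {{ ref('stg_events') }} e
  on e.event_id = s.event_id
{% if is_incremental() %}
  -- On incremental runs, only process newly arrived sales.
  where s.sold_at > (select max(sold_at) from {{ this }})
{% endif %}
```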
All right. So now that we've ingested and prepared that structured data, what about the unstructured data?
Well, let's head back into that SharePoint connector.
All right, so we have a bunch of PDFs, but it could be images, audio, or videos, and we need to deliver them as a table to our other data teams. Normally, this would require a lot of complex transformations, and I'd probably have to call in a favor from someone in engineering, but not with OpenFlow. With this SharePoint connector, as you can see, we get a solution to not only extract and load, but also transform and activate the data.
The processors are using Snowflake Cortex LLMs to parse and chunk the PDFs, all with just a few clicks.
We can also easily customize this integration by choosing different processors.
Dash, can we take a look at that populated table inside of Snowsight?
All right, there we go. And just like that, really in just a few clicks, we have built an integration from SharePoint directly into Snowflake and landed those PDFs in table format, with the change data captured.
Yes, we did it.
All right, let's wrap up. In just a few minutes, we were able to connect any data source with any target using Snowflake OpenFlow, with the flexibility to customize and extend. We deployed and executed pipelines inside our VPC or in Snowpark Container Services. Right?
Both of these are managed by Snowflake, balancing control and simplicity.
And we bridged that gap from building to production with dbt projects natively inside of Snowflake, making it easy to view, edit, modify, and manage our pipelines.
And we even tackled AI-ready data using Snowflake OpenFlow's unstructured data connector, with AI-enabled parsing and chunking built into the ETL pipeline. All in one unified, managed platform.
It's easy, connected, and trusted.
With that, thank you guys so much.
Thank you, Dash. Back to you, Christian.
Awesome.
It's much cooler to see it than to hear me say things over and over, right?
Yeah. You enjoy the demo?
Yeah. Hey, let's recap. We saw lots of capabilities to integrate data.
OpenFlow, Snowpipe Streaming, dbt projects, workspaces.
We are committed to helping you bring in your data estate, manage it, and make it accessible. Now let's talk about the next "I want" statement, which is: I want to deliver more business impact. Sometimes you hear things like "you need more business impact" and it's not clear what it means, but we have clarity that business impact means enabling more, and easier, use of data. We want to help you consolidate and integrate data.
We want to help you connect data sets and connect users to those data sets.
And that is where zero-ETL data sharing comes in, a capability we introduced in 2017, from the very early days.
How do you enable collaboration on data without having to make copies of data?
All of this is powered by Snowgrid, enabling sharing cross-region and cross-cloud.
And of course, one of the key use cases is the ability to deliver an application to a consumer or a customer, and for that application to be able to unlock and access that data.
That's the foundation of our partnerships with Salesforce, with Service Now, and many others.
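A minimal sketch of the mechanics of a share (all object and account names are invented; the consumer mounts the share as a read-only database, with no data copied or moved):

```sql
-- Provider side: expose a table through a share.
CREATE SHARE merchant_analytics;
GRANT USAGE  ON DATABASE analytics        TO SHARE merchant_analytics;
GRANT USAGE  ON SCHEMA   analytics.public TO SHARE merchant_analytics;
GRANT SELECT ON TABLE analytics.public.daily_settlements TO SHARE merchant_analytics;
ALTER SHARE merchant_analytics ADD ACCOUNTS = partner_org.partner_account;

-- Consumer side: query the shared data in place.
CREATE DATABASE settlements
  FROM SHARE provider_org.provider_account.merchant_analytics;
SELECT * FROM settlements.public.daily_settlements LIMIT 10;
```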
But I also want you to hear a story of how data sharing, zero-copy integration, helps in the real world.
I want to invite two of our Snowflake customers, Brandy Wood and Abby Ready.
Please welcome Brandy and Abby.
[Applause] [Music] Welcome Brandy and Abby.
Welcome as well.
Okay, so you're with Fiserv, and you're with PayPal.
Maybe let's start with you Brandy.
Tell us about Fiserv.
I don't know that everyone here is familiar.
Yeah, absolutely. Everyone here probably interacts with Fiserv every day without even knowing it. We're a global technology provider for payment solutions and financial institution solutions.
So my team, I look after client experience products at Fiserv, and we have responsibility for all the data analytics products that we provide to merchant clients such as PayPal.
Okay. So PayPal Aby tell us how does this relate to you? We are a global fintech.
We have over 400 million consumers that we service.
But most importantly, we move $1.4 trillion worth of volume per year for millions of merchants. PayPal is also a culmination of multiple products and brands that have for a very long time been bifurcated, and that's part of the story.
My team looks after platform and experiences and that really is the enterprise payment service provider.
Okay. And Brandy, you leverage zero-copy sharing.
Absolutely. So, we started on our Snowflake journey at Fiserv a little over four years ago. We moved all of our data into Snowflake.
Like many companies that have been around 40-plus years, we had a lot of information stored in various on-prem systems. So moving all of that in allowed us not only to use zero-copy integration internally across business units, but it has also allowed us to build products and services on top of that which we can offer to customers. So we can leverage zero-copy integration with clients such as PayPal, which not only reduces our expense by around two-thirds through that integration, but also allows us to grow new revenue from a product perspective.
So we've been able to scale to around $150 million in our long-range plan around data products and services, and a lot of that goes into the services that we can then offer to clients such as PayPal. Okay. So you do zero-copy data sharing with PayPal.
What does zero copy mean to you? So, zero-copy integration allowed PayPal to complete, in 2024, the biggest migration in the payments industry. That was a migration of PayPal, Venmo, and Braintree, which are our three brands, for all of our US volume.
It improved our merchant SLA from 96 hours down to 12 hours. And most importantly, it allowed us to stop adding resources to handle the complexity of our data.
And all of that was offloaded to Snowflake and Fiserv. Yeah.
What you're doing between the two of you and your merchants is a fascinating story.
Both of you are increasing business value, business impact by making data available in context without copies.
Thank you so much both of you for being here. I love your stories.
I love the partnership. Thank you.
Thank you.
Underpinning a lot of the collaboration and data sharing is our Snowflake marketplace which continues to grow.
It continues to gain momentum. We have over 750 providers, companies that are listing on the marketplace, and over 3,000 products, and this continues to grow. You see here the categories, whether it's computer vision or geospatial, and we enable many of you to purchase products from the marketplace, drawing down from your capacity commitment to Snowflake.
Today we're also introducing one of the things we've heard the most about from many of you and from many of our partners, which is the concept of private offers, where you can have custom terms and custom pricing specific to any one offer, any one business and commercial relationship that all of you may have.
And the other thing that we heard from many of our partners wanting to build applications, wanting to build agentic products, either to publish in the marketplace or in the internal marketplace, is the need for Postgres compatibility.
Sridhar announced it yesterday, but we're delighted to announce Snowflake Postgres.
[Music] So what we will deliver, the acquisition was announced yesterday, is a managed Postgres service, but with the enterprise capabilities that you have all come to expect from Snowflake: things like customer-managed keys, or running within the security perimeter of Snowflake.
Of course, we want an amazing developer experience, and the use cases are varied, from application development and pipeline state all the way to agent state and agent personalization.
We're incredibly excited, and this also rounds out our offering quite strongly.
Of course, we've had analytical data for a long time.
We've had hybrid data with Unistore, which is hybrid transactional-analytical, and now with Snowflake Postgres we have a pure transactional store.
We are very very excited about this.
So let me recap. We want to help you create business value, increase business impact, making the right data available, whether it's zero copy sharing or whether it's leveraging products from our marketplace.
But one of the questions that we hear very consistently from many of you is: I just want faster insights.
Sometimes it's pure performance, but sometimes it's just how do I get to performance? What is the concept of ease of performance?
And over the years, we have been helping you migrate from a number of legacy systems onto Snowflake.
As part of that process, we developed SnowConvert, which is the tool that helps move data and code over to Snowflake.
And we're very, very excited that we have now made SnowConvert available to all of you free of charge.
It's a mature tool. We battle tested it quite a bit. It's robust and it helps you bring data from a variety of data sources.
But in the world of AI, we are even more excited to introduce SnowConvert AI, which takes the power of AI to make migrations easier.
And it's not just moving data, it's testing.
How many hours go into validating that the two systems behave the same? And you all know that when they don't match, that's a journey where AI and agentic tasks have the ability to help you.
We're also introducing a migration assistant that will take the output of SnowConvert and guide you through making migrations to Snowflake easier. We're very excited about this, but we also want just pure raw performance.
And that's why we're very, very excited to introduce Generation 2 warehouses.
Yeah, some of you may already be saying, well, but you just talked about adaptive and what is this other thing?
Yeah, adaptive is the true future.
That's where the the world goes in terms of compute model.
But Gen 2 is another iteration of our traditional warehouse.
You still specify sizes, but it has faster hardware, a lot of software optimizations, and the best capabilities to go and scan open data, specifically Parquet files in Iceberg tables.
I'm the first one to be skeptical when anyone puts numbers on a screen, but I'll tell you, we tested Gen 2 warehouses against our Snowflake benchmarks from roughly a year ago.
We see 2.1x faster performance, twice as fast.
And we Yeah. Yeah.
Go, go. And we benchmarked against a managed Spark, and we saw something on the order of 1.9x. My request to all of you is: don't believe me.
Don't believe what this says. Just go try it.
And I'm pretty sure that in many instances you're going to see numbers better than what we have here.
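If you do want to try it, the sketch below shows the pattern as I understand the documentation; the RESOURCE_CONSTRAINT value is my assumption of the Gen2 syntax, so check availability for your region and cloud:

```sql
-- Create a Gen2 standard warehouse (sketch, assuming the documented
-- RESOURCE_CONSTRAINT parameter).
CREATE OR REPLACE WAREHOUSE analytics_wh
  WAREHOUSE_SIZE      = 'MEDIUM'
  RESOURCE_CONSTRAINT = STANDARD_GEN_2;

-- Or upgrade an existing warehouse in place and rerun your own benchmark.
ALTER WAREHOUSE analytics_wh SET RESOURCE_CONSTRAINT = STANDARD_GEN_2;
```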
Another aspect of faster insights is what we've been doing with notebooks. Our notebooks are generally available.
The container runtime is generally available.
But we're also bringing distributed ML APIs. And one of the most interesting things is we're starting to hear from many of our customers: "I put a model in production, and the end-to-end life cycle is faster, but the model runtime is also faster." I have a story of someone who said it went from two hours to 15 minutes. We're excited about what we're doing here to help accelerate insights on this front.
We're also introducing today a data science agent, an agentic capability, an assistant that will help you build ML pipelines from inception, from the idea all the way to when it's productized.
This is going to be in private preview very shortly.
And now for one of the most interesting things we're doing in terms of helping you get faster insights. I want to invite Moe Kiss. She is the director of data science at Canva.
Let's welcome Moe.
[Music] Oh, thank you for being here with us.
Thanks for having me, Christian.
Okay, tell us more about what you do.
I think with Canva, a lot of people know you, but tell us about your mission.
Uh, Canva's on a mission to empower the world to design.
So, with over 230 million users on Canva every month, we're really here to help you achieve your personal and work goals.
And Snowflake has been such a big part of our journey to scale.
We've tried everything from Snowpipe to Cortex to dynamic tables.
It's really been a big part of our stack.
I hear you're also the co-host of an analytics podcast.
Yeah, that's right.
Uh it's called the Analytics Power Hour, and it was designed to be like the hotel lobby bar after a conference where you have real conversations about what's happening in the world of data.
How much of those real conversations are now about AI?
It's coming up more and more.
We're all trying to figure out how to work smarter, not harder. How to make AI work for us.
And that's definitely the case at Canva.
We're really thinking about how we can answer bigger questions at a lower cost, how we can answer more complex questions faster. And the one we've probably been grappling with the most is answering questions with greater certainty.
And if you were able to do that, what would that mean for you? Uh, I think really when it comes to data, it's about helping the business make better decisions.
So just think about the speed of social media and creative. Really, when I work with our marketing team, they all want personalized marketing and content at scale, and we like to say at Canva it's about creativity plus productivity. That's really the goal.
And then I think you heard about something we're working on, and you flew all the way from Sydney, Australia to talk about it. I certainly did.
When I heard what the Snowflake team was working on, I actually nearly fell off my chair.
I was pretty excited, so I had to fly all the way here to share the good news.
So, you want to share what we're doing?
I would love to.
I'm incredibly excited to announce that Snowflake is introducing Cortex AISQL. [Applause] [Music]
Now, we can use natural language in our SQL queries. Christian, can you tell us some more about it? Yes.
So, Cortex AISQL takes the work that we did with Cortex functions to the next level.
It's a collection of functions that let you do things like entity extraction or maybe aggregation or maybe filtering, but it can do it for multimodal data, not just text.
Imagine images, audio. There's a transcribe function that lets you transcribe directly in SQL from Snowflake.
And all of this is possible because we're also introducing a new FILE data type that can store references to unstructured data, whether it's in external storage or in internal storage. Do you want to tell us about the sample use cases? Yeah.
So we've been having a really fun time playing around with this, and a big part of it has been AI classify. We can now understand the call to action, the core message, or even how visually engaging an ad is, just with a few lines of SQL.
We are so excited for this, Christian.
We're incredibly excited.
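A hedged sketch of the kind of ad-classification query Moe describes, using the Cortex AI_CLASSIFY function (the table, columns, and category labels are invented for illustration):

```sql
-- Label each ad's message with one of a few invented categories;
-- AI_CLASSIFY returns an object containing the predicted labels.
SELECT ad_id,
       AI_CLASSIFY(ad_copy,
                   ['brand awareness', 'discount promotion', 'event announcement'])
         AS predicted_category
FROM marketing.ads;
```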
Do you want to see a demo? Oh, I'd love to.
Do you all want to see a demo? Yes.
Okay. So, we're going to invite Renee Wang and Dash, who is always here.
He always sneaks in.
I don't know. I don't know how you get in here.
We're going to see a demo of AISQL.
And I also want to thank Moe, who is also an award winner, a Snowflake Superhero, all the way from Sydney, Australia.
Thank you, Moe. Come on in, Renee. [Applause] [Music]
Thank you Christian.
Thank you, Moe. Hi everyone. My name is Renee, a product manager at Snowflake.
But before becoming a PM, my whole career has been about data and analytics. So, as a power SQL user myself, I'm super excited to introduce you to our latest capability, Cortex AISQL, an AI query language that uplevels all SQL analysts into AI superheroes.
I have Dash, my colleague, who will be hands-on keyboard, and I will be the analyst for our music company today.
In the next couple minutes, we'll do three things together.
First, we'll consolidate customer complaints across multimodal data covering text, image, and audio.
Next, we'll use the power of AI to semantically join customer complaints to the solutions.
And lastly, we'll get aggregated insights across multiple rows of data in a large table and we are doing everything with a few lines of SQL. Let's get started.
So, as we are a data-centric company, we want to utilize every single bit of information our users have shared with us.
However, customers might send in complaints through all sorts of channels.
It can be email, it can be voicemail or even screenshots.
As an analyst, how can I even understand those voicemail and screenshots?
Well, let's see how AISQL can help you here. Let's try out our latest superpower function, AI_COMPLETE, which not only takes text as an input, but also image and audio files.
Behind the scenes, we're running the world's best large language models, which can understand images, transcribe audio, and summarize insights.
And all you have to do is point this function at your data.
All right, let's take a look at the output.
And in case you didn't notice, we are now literally saving text, image, and audio in one single table with the insights summarized right next to it.
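A minimal sketch of that pattern, assuming a hypothetical complaints table whose attachment column uses the new FILE type to reference emails, screenshots, and voicemails (the model name and exact AI_COMPLETE signature are illustrative):

```sql
-- Summarize every complaint, whatever channel it arrived through.
SELECT
    ticket_id,
    channel,  -- 'email', 'screenshot', or 'voicemail'
    AI_COMPLETE(
        'claude-3-5-sonnet',  -- illustrative model name
        PROMPT('Summarize the customer issue in this file: {0}', attachment)
    ) AS issue_summary
FROM complaints;
```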
So now I can clearly tell that customers are running into issues like rejected transactions or double charging.
All things that can indeed be really frustrating for our users. So next step, I really want to address those user concerns.
Now I know we have an internal solution library with answers to frequently asked questions.
What I want to do is map those solutions to the complaints.
But wait, how can I do that systematically?
Now with AISQL, we can simply try a join operation using AI.
Instead of joining on a predefined key, this join connects two tables based on a natural-language prompt that AI can reason over, asking whether the solution can address the user's concern. Now, an AI join can be an expensive operation by itself, but Snowflake has done the query engine optimization for you.
We dynamically route easier tasks to a smaller model and harder tasks to a larger model, making my query much faster and much cheaper to run.
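A hedged sketch of such a join, reusing the hypothetical complaints table from above plus a solutions library table; the AI_FILTER-in-a-join pattern follows the AISQL announcement, but the details are illustrative:

```sql
-- Join complaints to solutions on meaning, not on a predefined key.
SELECT
    c.ticket_id,
    c.issue_summary,
    s.solution_text
FROM complaints c
LEFT JOIN solutions s
    ON AI_FILTER(
        PROMPT(
            'Does this solution address the customer concern? Concern: {0} Solution: {1}',
            c.issue_summary, s.solution_text
        )
    );
```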
Let's check the output here.
So we can see that for most of the complaints, we've already mapped them to a solution, thanks to the join.
I still have some tickets that do not have a solution yet.
So I can now talk to my support team and get their dedicated focus.
All right, moving on.
Another common question we're getting from the business:
What are the aggregated insights across all the support tickets over time?
Now, you may be wondering, can I just dump everything into an AI?
Well, that might work if I only have 10 complaints.
In a real world scenario where I often have tens of thousands of complaints, I'll quickly run into a context window limitation for my model.
So Snowflake has solved the problem for you with the first ever AI aggregate function.
Snowflake handles the context window limitation with a multi-step map-reduce behind the scenes, and what you get is the summarized insights directly.
Let's check what Dash is doing here.
So now we are grouping the data by month, and we are sending multiple rows of data for that month into the AI_AGG function, asking our language model for the top user insights. And there we go: we can now see the qualitative insights for all the data within that month. It also allows us to monitor for any month-over-month change, and this is literally as easy as writing SUM or AVG on my table.
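A sketch of that aggregation against the same hypothetical complaints table (AI_AGG per the announcement; the prompt and column names are illustrative):

```sql
-- Aggregate qualitative insights per month; Snowflake handles context-window
-- limits with a multi-step map-reduce behind the scenes.
SELECT
    DATE_TRUNC('month', created_at) AS month,
    AI_AGG(
        issue_summary,
        'Summarize the top customer issues for this period'
    ) AS top_insights
FROM complaints
GROUP BY month
ORDER BY month;
```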
So, how cool is that? But if we turn back the clock five years, if I were to do such analytics, it would have been either a giant NLP project, or a lot of it would simply have been mission impossible.
Even with the language models that enable a whole new set of capabilities, before Cortex AISQL I still had to write lengthy scripts across multiple files, calling out to multiple systems, and I would need to figure out all the systems and access myself. Now, thanks to the Snowflake Cortex platform, I have direct access to all the latest and best frontier models.
So as new models come out every single month, which they literally are right now, I don't have to be anxious about not getting access and being left out. And AISQL also helps me condense my whole complicated AI pipeline into just a few lines of code.
What's even better is when we compare the performance of AISQL. Thanks to all the query engine optimization Snowflake has done, Cortex AISQL is three to seven times more performant than my previous solution, making my queries much faster and much cheaper to execute.
All right.
All right. Let's recap now.
So, in less than a couple minutes, we were able to do three things together.
We did multimodal analytics across text, image, and audio files. We were able to use the power of AI to semantically join complaints to solutions.
And we got aggregated insights across multiple rows of data in a large table.
And we did everything with a few lines of SQL.
Snowflake is now my multimodal AI database,
the unified platform that connects all types of data to the best AI models.
And we foresee a whole new era of AI powered data analytics to drive the future of your business.
Thank you, Dash. And back to you, Christian.
It's pretty cool, isn't it? Yeah.
Benoa says, "Yeah." Anyone else? Pretty cool.
[Applause] We love to stay true to this: if you want to program in SQL, you can program in SQL.
Of course, there are multiple choices, but going back to where I started, we want to make it easy for all of you to adopt a variety of technologies, and AI is turbocharging everything we do.
Let me recap what I said around our ability to give you faster insights. We want to help speed up migrations.
We're delivering AI-powered tools for all of you to consolidate onto a single Snowflake data estate, and faster warehouses: Gen 2 warehouses.
Go and try it. If you have not done it, you owe it to yourself.
You're going to show up in your organization and say, "I went to the conference and everything is now faster just because I moved workloads to Gen 2."
We want to help you with productivity with AI. And of course, we're very excited about AISQL, where we bring AI for multimodal data into Snowflake itself.
Now, let's talk about getting your data ready for AI. This may be the single biggest question we hear from many organizations out there: I'm ready for the AI, but the data is not there.
When AI first started being mainstream, a couple of years ago, when language models were front and center for all of us, I think many people said: how hard can this be? You put a language model on top of structured data, generate SQL, and done. And I think many of us realized it's easier said than done. All of us collectively learned that the more context you provide to those models, the better the results, especially semantic context. And today we are very, very happy to introduce semantic views.
What's a semantic view?
It's a type of view that is geared towards that specific use case of capturing the context around your data. What are the metrics?
What are the dimensions?
What are the definitions used by your business users?
And how do those translate to physical schemas?
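To make this concrete, here's a minimal, hypothetical sketch of a semantic view over a festival ticket-sales table; the clause names follow Snowflake's semantic view DDL, but check the documentation for the exact grammar:

```sql
-- A semantic view capturing business context over a physical schema.
CREATE SEMANTIC VIEW festival_semantics
  TABLES (
    orders AS mydb.sales.ticket_orders
      PRIMARY KEY (order_id)
      WITH SYNONYMS ('ticket sales')
  )
  DIMENSIONS (
    orders.order_month AS DATE_TRUNC('month', order_date)
  )
  METRICS (
    orders.total_revenue AS SUM(orders.order_amount)
  )
  COMMENT = 'Ticket sales metrics and dimensions for business users';
```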
And of course, we're already integrating this with a number of partners, Hex, Sigma, Omni, all of them able to leverage semantic views.
But semantic views in and of themselves don't do a whole lot.
What is most powerful is that we're also introducing semantic SQL, a richer, more powerful set of query constructs where you can query the view and specify those dimensions and those metrics as context. It leads to better performance.
It leads to more accurate responses.
Of course, it's integrated with Cortex Analyst. But the important thing, what's unique about how we're doing this, is that the same semantic view provides context for AI use cases as well as for BI use cases.
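Querying the view then looks like this hedged sketch, naming dimensions and metrics rather than physical columns (the SEMANTIC_VIEW construct per the announcement; exact clause order may differ in the docs):

```sql
-- Semantic SQL: ask for metrics by dimension, not for raw columns.
SELECT *
FROM SEMANTIC_VIEW(
    festival_semantics
    DIMENSIONS orders.order_month
    METRICS orders.total_revenue
)
ORDER BY order_month;
```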
So, okay, we now have a way to provide context when someone wants to chat or talk to their data.
But the reality is that there's not only one data set. In reality, you have many data sets, and you may want all of those data sets to be AI ready.
So today we're also announcing the ability to share semantic views as part of either our Snowflake Marketplace or your internal marketplace. So now you don't just publish data products in your organization; you publish data products that are AI ready: data sets that are ready to be queried by technologies like Cortex Analyst or Cortex Agents.
But the same thing happens with unstructured data.
You may have multiple unstructured data sets, whether from your organization or from third parties, and we also want to make sure that unstructured data is AI ready for you to consume and leverage with AI.
Today we're announcing that Cortex Knowledge Extensions are generally available.
Cortex Knowledge Extensions are the ability to publish data that is already vectorized and already available to be queried by Cortex Search.
We're also incredibly excited about this collection of partners that are bringing content onto the Snowflake marketplace for you to augment your use cases.
In particular, news providers like USA Today, the Associated Press, and The Washington Post all have data sets live on the Marketplace, ready to be queried on Cortex Search via AI.
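As a hypothetical example of consuming one of these, once an extension's search service is installed from the Marketplace, you could try it from SQL like this (SEARCH_PREVIEW is a convenience function for testing a Cortex Search service; the service name is made up):

```sql
-- Ask a question against a knowledge-extension search service.
SELECT SNOWFLAKE.CORTEX.SEARCH_PREVIEW(
    'news_db.public.usa_today_search_service',  -- hypothetical service name
    '{"query": "music festival ticket demand", "limit": 5}'
) AS results;
```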
So the notion of getting data ready for AI is very important for us.
Whether it's first-party data, third-party data, structured or unstructured, that is a big, big initiative for us to help all of you.
And this brings me to the next "I want" statement, which is: I want to accelerate business growth with AI agents.
I know there's a lot of talk about agents, but the disruptive potential is very real.
The ability to improve workflows and business processes is amazing. And at the end of the day, accelerated business growth means more productivity.
Let's bring productivity to everyone in your organization.
Let's pick up where I just left off.
We said, okay, we have Cortex Analyst to query structured data that is already AI ready. We have Cortex Search that lets you query unstructured data that is AI ready. But the most common need when you're starting to build this type of solution is: I want to provide orchestration, and I want to provide instructions on how to reason and answer questions based on the data sets that I have.
This is why we introduced Cortex Agents, generally available very soon, which help you orchestrate and understand, based on the different backends and data sets, how to answer questions.
And of course, we have seen many of the agent solutions get dramatically more powerful through the ability to invoke other tools and to invoke other agents. And all of this is part of Cortex Agents.
This lets you create agentic applications.
So you can embed it into your internal tools or commercial products.
We are doing something like that with Microsoft, where we are bringing Cortex Agents to Microsoft Teams. Here's a screenshot.
The goal of this is to have the power of Cortex Agents, the power of your data, available where business users already are. This will be available in the next two to three weeks.
And we're seeing a number of partners bringing agentic products onto the marketplace.
So the Snowflake Marketplace evolves: it's not just data and apps, but agentic products, agentic apps.
We already have a number of partners doing this.
It's super exciting the types of use cases that we're seeing.
Before showing you one last announcement, I want to share a video from a customer of ours, Luminate, which is doing something very cool with Cortex.
Please let's roll the video.
[Music] Luminate is the most comprehensive analytical platform for the entertainment industry.
Our customers are labels, studio execs, artists.
We are powering the Billboard charts as well as the top 50 movies and TV shows for Variety. Our platform has been rebuilt using the Snowflake AI Data Cloud.
Luminate built a conversational AI interface within our products. We wanted to move beyond being able to construct ranked lists, essentially, and allow our customers to come and actually gain insights from that data and democratize that availability.
We needed to ensure the accuracy of the data. We needed to make sure that it was easy to use.
Great thing about building AI agents in Snowflake is the ability to bring structured and unstructured data together in one place.
Data quality, data control, permissioning, data security are all extremely important to us and our customers. With Cortex AI, we've been able to use best-in-class models to process our data.
Snowflake allowed us to do that a lot more easily than it would have been had we used a different agentic framework.
Using Cortex AI allows us to build data agents to retrieve accurate data and provide high quality insights.
One element is to bring this data together. But the other is to make sense of it, and that's really what new technologies and this new platform help us to achieve. [Music]
It's always so inspiring to all of us at Snowflake when we see many of you doing the types of things we saw in the video: creating value, accelerating your business. We love it.
We love hearing from you.
Recapping: Cortex Agents let you build applications. But we also want to make sure it's not just Cortex Agents, Cortex Analyst, and Cortex Search available through a third-party user interface. Something we have heard from many of you is: I would love an out-of-the-box experience where Snowflake makes it easy for my business users to interact with the data, without me having to go and write code and build applications and deploy and all of that.
That is where we are incredibly excited to introduce Snowflake Intelligence.
[Applause] [Music] So what is Snowflake Intelligence?
It's that first-party experience from Snowflake, powered by state-of-the-art models.
The absolute best combination of models that we can put together and put to work in the context of your data.
Of course, it's way more than a chatbot. It's an agentic product, an agentic capability. It has thinking and reasoning capabilities.
The goal is that it enables business transformation for all of you.
Productivity. How do you improve those business outcomes?
And very important for us, as Benoa and Sridhar talked about: easy, connected, trusted.
The trusted part of what we do at Snowflake matters a lot. How do you get verified results? How do you get citations and quotations so that you know these data agents are not making up data?
Snowflake Intelligence will be in public preview very shortly.
Are we ready for the last demo of the day?
Yeah. Let me invite onto the stage Jeff Holland and Dash.
I'm not going to introduce you anymore.
Jeff, welcome to the stage. [Applause] [Music]
Oh, good morning everyone.
I am so pumped to be here today with you all, and it's incredible and energizing to see a room filled with leading data leaders from across the world.
And I'm so excited because all of you here know the power and potential of all of that data across your organization.
But you also know the challenge.
How do you surface those insights from the data and make them available to business users who need those insights right now to make critical decisions?
So, think of any complex operation.
In this demo, I'm going to be running a global music festival and all of the data involved in that type of operation. I have ticket sales data that is in Snowflake.
I have contract information that I'm using inside of SharePoint, supply data in an ERP, marketing data in SAS apps, fan sentiment that's pouring in through social media.
Data is everywhere.
But traditionally for me to get access to any insights for my data as a business user, I'm jumping between each one of those applications.
I'm digging through complex dashboards to try to dig out those insights on my own. Or I'm pinging an already overwhelmed data team for a set of new insights or new reports.
How many of you can relate to the experience of starting your week off on a Monday morning: you already have a full plate, you sit down, and the first thing you see is an email from an executive or a business stakeholder saying, "What's going on with this dashboard?
Can you dig into the data and tell me why?" There go my plans for the week.
Right? Because they need that answer right now. So, what if we could change all of that? What if we could prevent opportunities being missed and prevent delays in insights?
And that's the vision behind Snowflake Intelligence, which brings secure AI agents on top of your data, accessible for anyone in your organization.
So, let's show it. Now, what Dash is showing right now is Snowflake intelligence.
This is a brand new experience that lives at ai.snowflake.com for anyone in your organization to securely access.
Now, this is built on top of the Snowflake platform that you already know and love.
But here I have access to various agents that my data teams can configure and provide access to so I can start to get answers right away. So let's start with a simple question. I want to understand how are ticket sales trending for our music festivals.
And you'll see that right away the agent gets to work.
It's thinking about my question.
It has access to, and knows, all of the various data sources available to me, and it's determining the right place to answer from. Now, in this case, you'll notice it's recognized the right set of tables, and using Cortex Analyst, the agent has automatically written the precise SQL needed to answer this question.
Notice that green shield at the top too.
We're going to come back to that in a second.
That's an important piece. Now as the agent thinks through this, it's able to go through all of the processes and provide the answer. Now while the agent has access to all of the data in my organization, it also knows who I am.
So, it knows only the data that I should have access to and it's only pulling from that in this answer. So, check this out.
In a few seconds, now I have an answer even visualized in a beautiful line chart.
So, I know exactly what's going on with ticket sales.
Now, I mentioned that green shield.
If I'm curious, can I trust this data?
Is this accurate? Well, here I see a visual indicator that I know this is using a verified query from my Snowflake organization.
So the data team has said: for any ticket sales data, this is the source of truth.
So I don't have to worry if I can trust this data or not. I see it right now and know that this is the right way to answer this type of question.
And it's not just about data that can be queried through SQL.
What about document data? So here I want to understand some information about a contract agreement that we have with one of the performing artists.
Now Snowflake Intelligence actually came up with the artist name here, The Algorithms. I got to say this is the coolest and geekiest band name I have ever heard. It does not exist today, but if anyone starts this band, I promise you I will be your number one fan and go to all of your concerts.
But here you'll see the agent is not writing SQL. It's actually pulled context and references from my SharePoint account and it's giving me the immediate answer to my question.
And again, if I want to trust this answer, it includes citations to all of those SharePoint documents so I can verify that this is the right answer.
Now, I know many of you leaders might be watching this and thinking, you know, Jeff, this looks so magical, but our data is messy. Our business is complicated.
There's so many different types of terminology that people might be using.
How in the world can we light up something like this AI agent on our own data?
And we hear you.
And we focused incredibly hard on making this easy for every organization to light up these types of AI agents.
So, what powers this conversation is that semantic view that Christian just talked about a few minutes ago. This is where I can capture all of the critical business context, terminology, and relationships on top of my data. Now, we want to make this easy for every organization to create and optimize. So, Dash actually will show us through what it takes to create one of these semantic views.
I can go ahead and create a semantic view.
I can choose the data sets that I want to have access to, and it will even ask me questions about different context that I have in my account. So here for this data set that I want to create in a semantic view, it will go ahead and ask me, hey, I'm going to look at your query history to see the types of data that this is joined to. Do you already have something like a Tableau dashboard?
And if so, you can upload those Tableau dashboard files directly into this agentic experience.
We will use AI to say, okay, this is how you talked about it in Tableau. So, I'm going to use that to generate for you a great starting point for a semantic view so that I get accurate results right away. This is so powerful.
It makes it easy for everyone to get a semantic view up and running.
Okay, I want to show one more very exciting thing because we still have some incredible state-of-the-art AI technology to show off.
If we go back up to that first chart, Dash, and you'll notice in ticket sales, there's this green line here.
Something's going on here in Europe.
Something happened between March and April where I had this huge spike in ticket sales.
This is ripe for me to send off one of those emails to my data team and say, "Go spend the next two hours and tell me what's going on here."
But wouldn't it be incredible if my agent could help me answer some of those more complex or ambiguous questions?
So, let's ask it. Let's just ask Snowflake intelligence what happened in March.
Now, this is incredibly powerful state-of-the-art agentic reasoning that's happening.
You'll see in real time as the agent thinks through multiple different data sets.
It's actually going to be doing deep analysis on my data. Not just running one simple query, it's going to explore across a number of different data sources.
You'll see in the thinking it can actually blend them together. Now, one awesome thing here, you'll notice a few citations popped up at the bottom.
That's actually not SharePoint documents.
Those are news articles, because I used the Snowflake Marketplace and that Cortex Knowledge Extension to integrate with Gannett and the USA Today network of publishers. This is over 200 publications.
So my agent has access to real time news as it's coming to answer my question.
And you'll see here in the thinking here it's blending this all together.
It's saying I've looked at marketing data.
I've looked at news information.
I've looked at sales information.
Just as if I had a trusted analyst who deeply understood my data sitting right next to me, powering this. Now, one more thing as this completes its deep analysis, which I mentioned before: all of this is built on Snowflake Horizon. So all of those access policies, data masking, and access controls, I don't have to replicate them to the agent.
The agent just knows about it and it's enforcing it automatically out of the box for me.
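For context, the governance the agent inherits here is ordinary Snowflake DDL; a minimal masking-policy sketch (table, column, and role names are hypothetical) that Horizon would enforce for the agent automatically:

```sql
-- A standard masking policy; Horizon applies it to agent queries too.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('SUPPORT_ADMIN') THEN val  -- privileged role sees data
    ELSE '***MASKED***'                                -- everyone else sees a mask
  END;

ALTER TABLE complaints MODIFY COLUMN customer_email
  SET MASKING POLICY email_mask;
```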
So I have confidence knowing it sees only the data that I should have access to.
Now, what's incredible is that in just a few seconds, this agent has done deep analysis using industry-leading frontier models like Anthropic's Claude and OpenAI's, all running securely in my Snowflake account.
Now, I have the answer to my question.
I know what's been going on here in March.
It's saying, "Hey, it looks like you had a marketing campaign.
There was this Spring Blast marketing campaign.
It had a huge number of impressions.
There were also some music festivals that I picked up from the news that might be correlating."
Think of how much work I would have had to spend, the number of applications and the back and forth that I would have gone through, that Snowflake Intelligence just surfaced for me.
Now, finally, I want to take action on this.
So, let's go ahead and I want to send a message to my marketing team and say, "Guess what? I just learned our marketing campaign actually led to a huge amount of sales. Let's try to do something just like it." And again, my agent's incredibly flexible.
So, I've even integrated it with tools.
So whether I want to communicate with other agents or other tools, or in this case integrate with Gmail to send an email to my marketing team based on this draft it's helping me write, Snowflake Intelligence is the purpose-built assistant for everything I need to bring those insights to light.
So think about all that we were able to do in just a few minutes.
I went from the chaos of managing a global music festival to just having a conversation.
I was able to access structured insights, unstructured data.
I was able to pull all of this in through Snowflake's built-in governance.
So for all of you data leaders here today, imagine empowering your entire organization with this type of capability.
I'm so excited that Snowflake Intelligence will be in public preview very soon. It's in private preview right now. Spend some time this week.
Learn more about it and understand how you can start this journey of transforming how your business interacts with data and how your data can do more for you.
Thank you all so much.
Christian, back to you.
It's exciting and inspiring to see what Snowflake Intelligence and Cortex agents can do for all of you. So, at Snowflake, we're very excited. We can't wait to get the public preview out to all of you.
And of course, we always welcome and love your feedback on how we can improve things for all of you.
This brings us to the end of the questions that I had used for framing our announcements today.
And from our perspective, hopefully it's not just "I want to do stuff" and good luck. This is not just "I want"; our perspective is that you can, with Snowflake. With Snowflake, you can have a future-proof data architecture.
You can have better economics.
You can govern all your data.
You can integrate all types of data.
You can deliver more business impact, for all of you.
You can have faster insights.
We talked about AI-ready data and how you can leverage AI with your data. And of course, how you accelerate business growth with AI agents.
Let me go back to where I started.
I said from a platform perspective, we want to help you through the entire life cycle of data. This is a simplified schematic of life cycle. Don't hold me to it.
Of course, the real world is more complex and not as linear. But our goal is to innovate through the entire life cycle from ingestion of sources, processing, governance, and consumption.
And if I were to map the different announcements that we've shared in the last hour and a half or so, you can see how we are making every aspect of that life cycle better.
And that is our commitment to all of you.
Our commitment is to do continued innovation and to bring new capabilities that help you succeed in your organization.
At the end of the day, this is the AI data cloud.
This is the platform.
This is all of us. I hope all of you are inspired by what you saw today.
I love saying I hope you are excited about what you've seen today. We only covered here a small subset of all of the innovation and announcements that we're making at Summit.
Yeah.
Okay. Before we adjourn, I have one final demo for you. For any of you that were here last year, we did something that was high-risk. We invited someone from the audience to try a demo because we said it's so easy to use. What can go wrong?
She did very unpredictable things for those of you that remember it.
And today we're going to do something that in my mind has a much higher probability of failure, because I have no idea how I agreed to conduct a demo myself.
We are introducing an inline copilot that you can activate with a hotkey: Command-I or Control-I.
Yeah. Yeah. Yeah.
Good news: we're enabling it today for half of you.
Bad news: half of you are going to wait a little bit longer. It's based on capacity and other things, but hopefully you get excited about what we're about to show you. So again, I'm doing it myself, because I am so excited about our live orchestra. Adam, are we ready?
Let's do this. Let's go.
[Music] [Applause]