
The Most Important AI Lesson for Businesses From 2025

By The AI Daily Brief: Artificial Intelligence News

Summary

Topics Covered

  • Redesign Operations for Agents
  • Legacy Systems Kill Agent Projects
  • Build AI-Native Tech Organizations
  • Inference Usage Explodes Despite Cost Cuts
  • GEO Replaces SEO in AI Era

Full Transcript

Today we're exploring the most important AI lesson that businesses learned in 2025.

Welcome back to the AI Daily Brief. In case you haven't been able to tell, I love end-of-year content. It's a time when, as the barrage of news slows down a little bit, people take a little extra time to reflect on the year that was and start to prepare and look forward to how they want to operate in the year to come. And from a content perspective, everyone is putting together their best trend reports and ideas and thoughts for the future. And that creates a really interesting lens through which to explore some of the big issues facing AI and all of the people and industries who are trying to figure out how to adopt it and adapt to it.

Now, the latest of these reports is Deloitte's 17th annual Tech Trends report. This is a monster 72-page document that goes deep across six different themes.

Now, rather than getting into each and every theme, I want to highlight what I believe is the big idea and the clear throughline throughout all of this, which I think is perhaps the most important lesson for enterprises and businesses who tried to adopt AI in 2025. The lesson is, of course, that to fully take advantage of AI, it is going to require much more than simply dropping a chatbot on the heads of your employees and saying, "Go be more productive."

If you've spent any time on LinkedIn over the past couple of weeks, you've probably seen some version of this iceberg image, which has AI strategy at the top. And then, of course, underneath the water are all the things that make AI more challenging in its quest to transform, but also more powerful if you can address them: things like legacy systems, data pipelines, integration debt, and undocumented code. So today we will use the Deloitte Tech Trends report as a lens to explore that exact topic, while also covering some of the other interesting things that they're surfacing.

Now, the two areas of their six big themes where they address this are, first, the agentic reality check: preparing for a silicon-based workforce, and, section four, the great rebuild: architecting an AI-native tech organization. Let's pop over to agents first.

2025 was supposed to be the year of agents, and in many ways I think it was. We certainly saw from studies like KPMG's pulse survey that enterprise adoption of agents was significant throughout the year. What's more, whereas I think it could have easily devolved into a bunch of showcase pilots and experiments, I think we pretty quickly skipped over that step and went right on into in-production agents that were actually meant to have purpose.

At the same time, it's pretty clear that agents didn't fundamentally upend everything inside the organization as some thought they might, with the possible exception, of course, of software engineering, where coding agents were the most disruptive and powerful force of the year. So again, Deloitte calls their section on this the agentic reality check and writes that despite its promise, many agentic AI implementations are failing, but leading organizations that are reimagining operations and managing agentic workers are finding success. And here in a single sentence is the theme which I will be effectively beating you over the head with for the next 2 weeks: true value comes from redesigning operations, not just layering agents onto old workflows.

What does that mean? Well, Deloitte says it means building agent-compatible architectures, implementing robust orchestration frameworks, and developing new management approaches for digital workers. It also, they say, means rethinking work itself. As organizations embrace the full potential of agents, not only are their processes likely to change, but so will their definition of a worker.

So let's talk some numbers. Gartner predicts that, from a starting point of zero in 2024, by 2028 agents will make 15% of work decisions autonomously and a third of software applications will have agentic AI integrated in some way. And yet, there are challenges. One of the things that's really interesting to me is that you can find a wild range of organizational self-reporting on agentic deployments.

For example, I mentioned the KPMG pulse survey. In their Q3 pulse survey, they found that 42% of organizations had deployed at least some agents, which was up from 11% in Q1. Deloitte's 2025 emerging technology trends survey, however, which based on my research was conducted between June and July of this year, found that 30% of organizations were exploring agent options and 38% were piloting solutions, but only 11% actively had agents in production. Now, the self-reporting bias of all of these is real, and the terminology isn't super precise, but I think it's fair to say that wherever the truth actually lies on that spectrum from the Deloitte numbers to the KPMG numbers, we're still really early. And that, I think, is embodied in this other stat from Deloitte, where 42% of organizations said that they are still developing their agentic strategy roadmap, with more than a third, 35%, having no formal strategy at all.

The survey also identifies three big barriers, and for those of you listening at home, I'll pause for just a second to see if you can guess them. I'd be willing to bet that you got at the very least one, if not two or three, of these.

One is legacy system integration. In other words, previous enterprise systems that were not designed with agentic interactions in mind. Gartner, in fact, predicts that over 40% of agentic AI projects will fail by 2027 because of the challenges of legacy systems that can't support modern AI execution.

Next issue, bing bing bing: data. The vast majority of enterprise data, even now, even after we've had a couple of years under our belt of facing down these issues, still is not ready and set up to be used by agents to understand vital business context and help them make better contextual decisions. In another survey earlier this year, 48% of organizations cited the searchability of data and 47% the reusability of data as challenges to AI strategy.

The third issue is governance.

And this is another one that's really easy to identify and much harder to actually put better practices into place around. As Deloitte writes, traditional IT governance models don't account for AI systems that make independent decisions and take actions. But again, here's the more important point: the challenge extends beyond technical control to fundamental questions about process redesign. Many organizations attempt to automate current processes rather than reimagine workflows for an agentic environment.

Now, one interesting thing that they point out is that, to the extent that terminology matters here, it's really in what organizations' expectations are of how the new process is going to work. If it is just a better way of doing the exact same process that happened before, but with digital workers instead of people, that's an automation. Agents, on the other hand, might have an entirely new way of doing things that isn't just about automating the existing process.

And so here the distinction between agent and automation actually becomes relevant in terms of how much work the organization faces to figure out the best new way to do something.

Now, what does it look like when organizations are actually figuring out all these issues? What is the shape of an organization that is actually taking advantage of the agent opportunity? The first part of this is process redesign. As they write, most businesses' existing processes were designed around human staff. Agents operate differently. They don't need breaks or weekends. They can complete high volumes of tasks continually.

When organizations realize this, the opportunities for process redesign become compelling. That's why enterprises that are succeeding with agentic AI are looking at their processes from end to end. And it turns out that looking at those processes end to end tends to lead to some desire for legacy system replacement. There may be some amount of core modernization work to be done before agents can really be fully taken advantage of.

Next, there is a new approach to management. And while for the vast majority of implementations so far, AI is still a tool more than a colleague, forward-looking organizations are starting to think about that future where digital workers sit alongside human workers, and it creates the need for new types of management. In these ecosystems, human roles move away from execution and towards things like compliance and governance and growth and innovation.

There are a bunch of other questions as well. And even when an organization locks in on the fundamentals, there are still going to be highly dynamic questions. For example, in this report, they talk about why successful deployments focus on, quote, specific, well-defined domains rather than attempting enterprise-wide automation, saying that broad automation remains possible but requires multiple specialized agents working in an orchestrated fashion rather than a single monolithic solution. And while I agree with that wholeheartedly right now, what you can tell when you look at the strategy of the foundation model companies is that they are trying to move towards a world where the base agents are generalized and then become specialized in the context of a particular set of work. This is what the whole episode about Anthropic's skills mechanism was about earlier this week.
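As a purely illustrative aside, here is a minimal sketch of what that orchestration idea can look like in practice: a thin coordinator routing work to a handful of narrowly scoped agents instead of one monolithic do-everything agent. The domain names, routing logic, and stub agents below are all hypothetical; a real deployment would put actual models and tools behind each agent.

```python
# Illustrative sketch only: an orchestrator dispatching tasks to specialized
# agents rather than relying on a single monolithic agent. The domains,
# agents, and routing rules here are hypothetical examples.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Task:
    domain: str   # e.g. "invoices", "support"
    payload: str  # the work item the agent should handle


def invoice_agent(task: Task) -> str:
    # Stub for a narrowly scoped agent that only handles invoice processing.
    return f"[invoice-agent] processed: {task.payload}"


def support_agent(task: Task) -> str:
    # Stub for a narrowly scoped agent that only handles support tickets.
    return f"[support-agent] triaged: {task.payload}"


class Orchestrator:
    """Routes each task to the specialized agent registered for its domain."""

    def __init__(self) -> None:
        self._agents: Dict[str, Callable[[Task], str]] = {}

    def register(self, domain: str, agent: Callable[[Task], str]) -> None:
        self._agents[domain] = agent

    def run(self, task: Task) -> str:
        agent = self._agents.get(task.domain)
        if agent is None:
            # Outside the well-defined domains, escalate instead of guessing.
            return f"[escalate-to-human] no agent for domain '{task.domain}'"
        return agent(task)


if __name__ == "__main__":
    orchestrator = Orchestrator()
    orchestrator.register("invoices", invoice_agent)
    orchestrator.register("support", support_agent)

    print(orchestrator.run(Task("invoices", "vendor invoice #1042")))
    print(orchestrator.run(Task("legal", "review NDA")))  # falls back to a human
```

The point of the pattern is the one the report makes: broad coverage comes from many well-defined domains coordinated centrally, not from one agent asked to do everything.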

Ultimately, this is going to require new infrastructure, such as something that we spend time on at Superintelligent: HR for agents. Deloitte points out that while there are certain aspects of HR that will be totally inapplicable to digital employees, things like worker motivation and employee loyalty, there are going to be other areas that apply to agents in new ways: onboarding, performance management, lifecycle management. We've been building agent planning tools for more than a year now, and while most of 2025 was about zero-to-one implementations and starting to get organizations' feet wet, performance management, lifecycle management, and ongoing planning are exactly where I see this all heading.

Now, this all feeds into another section from the report, which they call the great rebuild: architecting an AI-native tech organization. And in many ways, this is almost a deeper dive on what it takes to do the sort of redesign that was talked about in the agentic section. In other words, you could almost reframe this as: what does the technology organization need to do to support the agent-native organization as a whole?

Part of it is just that it needs to grow. Almost 70% of tech leaders they surveyed said they plan to grow their teams in direct response to GenAI, with the number of AI architect roles, for example, expected to double in the next 2 years. The way that organizations look at their technology organization is changing as well. If in the past it was a service center that supported the rest of the organization, 66% of large organizations now view the tech organization as a revenue generator. In 2015, 41% of CIOs reported directly to the CEO; that is now up to 65%.

And a lot of those questions of modernization and organization redesign are going to live inside that technology organization. 71% of the organizations that were surveyed are in the midst of modernizing core infrastructure to support AI, and nearly a quarter of them are investing between 6% and 10% of annual revenue in modernizing those core enterprise systems. But of course, it's not just their systems, it's also how they design their organizations as a whole. Deloitte writes, "In the years ahead, traditional project teams will likely shift into lean cross-functional squads aligned to products and value streams, tightening the loop from concept to customer and hardwiring ownership of outcomes."

57% of organizations report they're already shifting from project to product models to bring business and IT closer together. In this model, product lines deliver user-focused features via shared customer-facing platforms, agile pods govern ways of working and tool choices, and forward-deployed engineers work alongside product or customer teams to shorten the path to value. I think what we're seeing is actually a bidirectional technology integration, where there is, yes, this side where the technology organization comes to the rest of the organization and actually embeds with them in some way, but then also the rest of the organization is, via AI, getting more technical as well, speaking, for example, in code for the first time. This is, of course, not a one-time transition but an ongoing and perpetual evolution in which, as they write, change becomes a core capability, not a one-time event.

Now, as I said at the top, for me this organization redesign

message is the most important AI lesson for businesses from last year. And

luckily, I think that organizations are much better prepared now looking into 2026 than they were looking into 2025 when it comes to what it's going to take to actually take advantage of the new power of AI and agents. Still, while I

don't want to go through all of what else is in the report, there were a couple other interesting notes that I wanted to quickly mention before we get out of here. One is there's an interesting enterprise version of the

conversation around inference economics.

What do I mean by inference economics?

Well, if you pay any attention to the AI bubble conversation in markets, one of the things that the AI market bears often believe is that a reduction in the cost of inference and the ability to put

inference on device could fundamentally undermine all of this big infrastructure investment that the market is pricing into these AI companies over the next 5 years. VCSA Shangv writes, "On-device inference breaks the AI capex trade." That was reposted by Compound's Michael Dempsey, who shared a graph from Epoch AI suggesting that frontier AI performance could become accessible on consumer hardware within a year, and said that the image might be the most underappreciated chart in technology right now.

Like I said, Deloitte takes this into the realm of the enterprise. They have a section called the AI infrastructure reckoning: optimizing compute strategy in the age of inference economics. They write, "The mathematics of AI consumption is forcing enterprises to recalculate their

infrastructure at unprecedented speed.

While inference costs have plummeted, dropping 280-fold over the last 2 years, enterprises are experiencing explosive growth in overall AI spending. The reason is straightforward. Usage in the form of inference has dramatically outpaced cost reduction." Another way to think about this is Jevons paradox, but at the business level, where a reduction in cost actually increases overall consumption. Now, in a lot of ways, inference cost coming down is not the story here. Increased usage is. And all the inference-cost-coming-down conversation is really getting at is the idea that even with that dramatic cost reduction, the overall bill continues to go up because usage is growing more.
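To make that arithmetic concrete, here is a small back-of-the-envelope sketch. Only the 280-fold cost decline comes from the report; the starting spend and the usage growth multiple are hypothetical numbers chosen purely for illustration.

```python
# Back-of-the-envelope illustration of inference economics (Jevons paradox
# at the business level). Only the 280x per-unit cost decline comes from the
# report; the starting spend and usage multiple are hypothetical assumptions.

starting_annual_spend = 1_000_000  # hypothetical: $1M of inference spend two years ago
cost_decline_factor = 280          # report figure: per-unit inference cost fell 280-fold
usage_growth_factor = 500          # hypothetical: usage (e.g. tokens served) grew 500-fold

new_annual_spend = starting_annual_spend * usage_growth_factor / cost_decline_factor

print(f"Old annual spend: ${starting_annual_spend:,.0f}")
print(f"New annual spend: ${new_annual_spend:,.0f}")  # ~$1.79M, up despite far cheaper inference
```

Whenever usage grows faster than the per-unit cost falls, the total bill rises anyway, which is the whole "explosive growth in overall AI spending" dynamic in one line.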

That leads organizations to have to think strategically about a variety of issues around compute: cost management, data sovereignty, latency sensitivity. In other words, understanding which use cases and workflows need real-time decision-making versus which others can be designed to not require that sort of speed. They point out that in many cases, organizations are finding that there is a fundamental infrastructure mismatch here. Now, like I said, in the context of this particular episode it would be going way too deep to get into that, but it's interesting to see that even as the inference and infrastructure debate is happening at the macro level in markets, it's also happening at the organization-by-organization enterprise level.

The other section I thought was interesting, because it is so clearly going to be a massive theme going forward but is still de minimis right now, at least in terms of the mind share it has, is what they call AI going physical.

Basically, this is the convergence of AI and robotics, which some refer to as embodied AI. And while many think about this category of AI right now as just task-specific robots, Deloitte points out that actually this is changing very quickly, and the integration of AI with physical devices is dramatically expanding the relevance of embodied AI outside of just factories and supply chains into other areas of the business. They point to quadrupeds, drones, autonomous vehicles, humanoid robots, and autonomous mobile robots as different form factors, which all have different implications and use cases for other parts of the business. Now, to me, this feels more like a 2027-2028 conversation than a 2026 conversation, but it's interesting to see how much emphasis Deloitte is putting on it, having it in fact as their first chapter overall.

Finally, from their last section, which is a set of quick hits on what they call tech signals worth tracking as AI advances, there is just one that I wanted to point out, which they characterize as GEO overtaking SEO. Users, they write, are increasingly turning to AI chatbots over traditional search engines. The race is on to appear in AI-generated answers, a shift from search engine optimization to generative engine optimization. AI-generated answers already dominate search results across major search engines, reducing click-through rates to conventional websites by more than a third. AI platforms now drive 6.5% of organic traffic, projected to hit 14.5% within a year. GEO differs fundamentally from SEO, prioritizing semantic richness over keywords, author expertise over backlinks, and being cited in AI responses over page views. Just as paid search defined the 2000s and social media advertising dominated the 2010s, AI-generated responses are becoming the most critical marketing channel of the 2020s.

Look, everything surrounding AI-related commerce and AI-related advertising is much less glamorous and exciting in some ways than a lot of these other topics that we've hit on today and which are the normal fodder for this show. However, I'm watching these numbers of, for example, the growth in advertising around some of these vertical AI companies that are hitting $100 or $200 million in ARR. I'm seeing the numbers around intent when a chatbot refers someone to a commerce website as opposed to a Google link, and it's wildly impressive. The point is that if you are someone who is looking for where AI is going to impact business in the very short term, everything surrounding marketing, e-commerce, and customer discovery is going to take a major front seat in 2026. I am quite sure of it.

And I tell you what, to try to circle back to this idea of the most important AI lesson for businesses, the companies that are going to take advantage of those changes and are going to have the best GEO strategies are certainly the ones that are going to be thinking systematically across the whole organization, not just dropping AI in on top of what is already built.

So that is the story to me. That's the big takeaway from Tech Trends 2026. I am looking forward to lots more of this big-think type content to round out the year. For now, that is going to do it for today's AI Daily Brief. I appreciate you listening or watching, as always.
