AI coding assistants: what engineering leaders want, and what's holding them back

John Laban | June 24, 2025

AI coding assistants are everywhere: 94% of the companies I surveyed have at least some teams actively using these tools. Yet despite widespread availability, adoption among individual developers remains surprisingly low: only around a third of companies have seen a majority (>50%) of their developers adopt these tools. This gap got me wondering: are these tools delivering on the promises that engineering leaders expect?

While nearly every organization uses AI coding tools, adoption is low across the board.

I surveyed engineering leaders and Platform teams from dozens of tech companies, ranging from smaller startups to big tech. After diving deeper into these results and having 1:1 conversations with engineering leaders, clear patterns emerged about the benefits they were aiming for and what’s holding them back from seeing those results.

The benefits they’re aiming for

Velocity is king

Faster developer velocity and time to market was the top desired benefit of AI coding tools.

This should NOT be surprising: When I asked engineering leaders what they hoped to gain from AI coding assistants, one answer stood head and shoulders above the rest: speed. Over 70% of respondents pointed to developer velocity and reduced time-to-market as their top priorities.

This tracks with what I hear every day in conversations. Shipping faster is a universal goal. AI coding assistants offer some big promises: cut down on boilerplate, unblock developers faster, and reduce cycle time without adding headcount.

Reducing toil and improving focus on “higher-level” work

While velocity dominated the responses, another (related) benefit showed up in about 34% of them: reducing toil. This means cutting down on tedious, repetitive work and letting engineers focus on higher-value or higher-level tasks. This goes hand in hand with speeding up developers but speaks to a goal of improving the overall developer experience while they’re at it.

What was super interesting here is that this benefit was seen predominantly among teams with higher AI adoption (i.e., more than 50% of their developers using AI coding tools). So, the companies with higher adoption—and as such, the ones more likely to have actually REALIZED the benefits they were looking for—were the ones seeing their developers focus on higher-level work.

It suggests a pattern: the deeper teams go, the more they begin to see improvements in developer experience, not just output speed.

Ok, so what’s holding back adoption of AI coding tools?

Everyone wants to speed up their developers and reduce toil. So why is adoption still so early? I asked these folks what the biggest barriers to adoption have been, and what's slowing them down.

It’s not the cost!

Cost is usually some sort of factor in adopting new tools, but with AI it's minor. Only about 22% of respondents brought up cost as a concern.

In fact, many leaders I spoke to are actually letting their engineering teams choose between a set of AI tools, with a fair bit of redundancy. (There are, on average, ~3 coding assistants in use at most eng orgs at the same time.)

One SVP of Engineering I spoke to was actually slashing his dev-tool budget pretty heavily in general, but still shrugged off the price of these AI tools: "If they speed up my engineers, spending $20/month is nothing."

Another VP of Engineering worried that today's prices are artificially low due to market competition and investor subsidies, and foresees future price increases that could become a more significant barrier.

Learning curve: the new way to code

A bit larger of a blocker: 28% of respondents mentioned the learning curve as slowing them down. This one was a bit ironic, as these tools should, in theory, be "intuitive," with their plain-English interface.

But developers are used to coding in code… not English. And vibing, prompting, refining, and guiding an AI isn’t the same as typing out lines of Python. It takes time to adjust, and that friction slows adoption.

The real barriers: security, quality, and maintainability

In the end, the biggest things I heard that were slowing down adoption were: 

  • Security: 59%
  • Code quality: 53%
  • Maintainability: 38%++ (see note below)
The overarching reasons leaders cite for adoption moving slower than desired.

Security concerns are important, but surmountable

While 59% of respondents mentioned security concerns (and 44% mentioned the somewhat-related IP concerns), these were often surmountable with the right controls in place. Many organizations have strict rules on which AI tools are approved for use, on turning training and memory off, and on what kind of data can be shared with these tools.

Nevertheless, several folks still brought up wariness around new threat vectors, especially surrounding MCP (Model Context Protocol). MCP is still a bit of a wild west, without a lot of guardrails around where MCP servers come from, what's being run when they're executed, or what local data might be getting exfiltrated.
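One guardrail teams described putting in place is restricting which MCP servers developers can connect to. Here's a minimal sketch of that idea in Python — the allowlist contents, hostnames, and function names are all hypothetical illustrations, not part of any real MCP client API:

```python
# Hypothetical guardrail: only permit MCP servers whose host is on a
# vetted allowlist maintained by the platform team. All names below
# (APPROVED_SERVERS, is_server_allowed, the example hosts) are illustrative.

from urllib.parse import urlparse

# Internal allowlist of vetted MCP server hosts (hypothetical values).
APPROVED_SERVERS = {
    "mcp.internal.example.com",
    "opslevel-mcp.example.com",
}

def is_server_allowed(server_url: str) -> bool:
    """Return True only if the MCP server's host is on the vetted allowlist."""
    host = urlparse(server_url).hostname
    return host in APPROVED_SERVERS

print(is_server_allowed("https://mcp.internal.example.com/sse"))  # True
print(is_server_allowed("https://random-mcp.example.net/sse"))    # False
```

A real deployment would likely enforce this at the network or IDE-policy level rather than in application code, but the shape of the control is the same: an explicit, centrally maintained list of trusted servers instead of an open door.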

Code quality concerns remain

Diving deeper into the 53% of respondents who cited code quality concerns, I saw more deeply rooted issues around the AI tools generating incorrect code, or using bad libraries or bad patterns in their code.

These types of concerns frustrate developers and platform engineering teams alike. Developers are frustrated when AI coding assistants make basic mistakes or hallucinate. Platform teams are frustrated when AI-generated code imports old libraries they're trying to deprecate.

One Staff Engineer I spoke to recalled GitHub Copilot repeatedly hallucinating parameters to GitHub's own public CLI. Which, as you can imagine, was intensely frustrating, given they're both owned by GitHub.

These quality issues speak to a real tradeoff between time saved with AI coding assistants and time wasted going down rabbit holes, fixing hallucinations, or taking on unwanted technical debt from bad libraries.

Maintainability: a critical concern for those with low adoption

Maintainability concerns about AI-generated code were cited by only 38% of respondents, and so at first blush don't seem to be the biggest issue.

However, taking a step back and focusing on just those companies with lower AI adoption rates (25% or less of their developers), a very interesting pattern emerged: suddenly maintainability jumped to 67%, making it a major barrier.

(Logically, focusing just on those companies with lower AI adoption rates gives us more signal on which factors have the most “slowing impact”.)

This distinction is important, and it suggests that maintainability concerns are an early-stage barrier. Teams dipping their toes in the water are especially cautious about long-term code health. They worry that AI-generated code might "work" today but lead to codebases that are difficult for their teams to understand, debug, and maintain over the long term.

Conclusion

Engineering organizations are buying AI coding assistants with the hopes of real, tangible benefits, especially related to velocity and developer experience. But the fears holding teams back are just as real.

In the next post in this series, I’ll go beyond diagnosis and into treatment. I’ll share actionable strategies from high-adoption teams, including how they trained developers, handled concerns, and made AI a regular part of their daily workflows.

