
Insights from EverOps’ CTO Jose Mercado on AI, AWS, and Multi-Cloud Strategy

10/16/25 | Jose Mercado

Before joining EverOps as Chief Technology Officer, Jose Mercado was a customer. He initially brought the team in to solve urgent infrastructure needs at a scaling startup. That early engagement left a lasting impression, which later led to his transition into a leadership role at the company. His firsthand knowledge of how EverOps operates from both sides of the table has made him an integral part of the team.

Today, he leads cloud strategy, engineering delivery, and platform partnerships across a broad portfolio of client engagements. With deep experience across AWS, GCP, and Azure, he has shaped how EverOps approaches multi-cloud work and how the company evaluates new technologies like AI. His perspective reflects both client priorities and operational realities, especially in high-risk environments where infrastructure stability and performance are critical.

In this interview, Jose outlines EverOps’ evolving relationship with AWS, shares lessons from real-world AI deployments, and explains how thoughtful engineering practices continue to drive results for clients facing constant change. For engineers and technical leaders, his answers offer the kind of clarity that only comes from hands-on experience.

EverOps & AWS: A Strategic Partnership 

Many companies treat cloud adoption as a formality: a logo on a slide signaling technical alignment, with little depth behind the partnership. At EverOps, AWS plays a far more integrated role.

“We’ve worked with AWS since day one, even before official partnerships,” Jose says. “Today, about 80% of our complex projects run inside AWS.”

That experience is now gaining renewed focus as EverOps has recently brought on Matt Meyer, a new member of the sales team with deep knowledge of the AWS Partner Network. His background includes time at AWS and other partner organizations, giving EverOps added momentum as it strengthens this relationship and improves the path to results for clients.

AWS, however, is only part of the picture. The majority of EverOps customers operate in hybrid or multi-cloud environments, with a mix of AWS, Azure, and Google Cloud services. According to Jose, this is where the team’s structure makes a difference. The TechPod teams are composed of engineers who bring real-world experience across all three platforms, and that expertise shows up inside client environments, not just in pitch decks.

While larger consultancies may offer multi-cloud support, actual hands-on depth across providers remains uncommon. At EverOps, it is a core part of the delivery model.

The Current State of AI in DevOps

Artificial intelligence has captured the attention of the tech industry for years, especially in areas like product development and customer-facing applications. But when it comes to implementation within IT operations and DevOps environments, adoption has been slower and more cautious. According to Jose, this hesitation reflects not a lack of interest but a realistic assessment of the risks involved.

“There’s a lot of excitement, but very real concerns. AI can make mistakes, like hallucinating or taking a wrong step, which could bring systems down and undermine a business case instantly.”

The idea of using generative AI to support or automate infrastructure work raises understandable hesitation. Many of the systems that DevOps teams manage are tied directly to uptime, performance, and security. A single misstep, especially one made by an unpredictable or opaque model, can lead to disruptions that outweigh any perceived benefit. For most organizations, the stakes are simply too high to rely on AI without clear controls in place.

“There’s still a concern around IP and privacy,” Jose states. “Even though most of our customers have enterprise agreements with vendors like OpenAI or Anthropic, there’s uncertainty around whether the codebases actually support strict data protections. People worry about sensitive information leaking, even unintentionally.”

Cost also plays a significant role. While AI can deliver strong returns in revenue-generating applications, infrastructure teams are typically evaluated on cost control and operational efficiency. Without clear value metrics, the unpredictable pricing of AI usage has created hesitation.

Despite these risks, Jose still sees real opportunities when AI is applied with intention. At EverOps, engineers are beginning to integrate AI tools in controlled settings to help reduce repetitive workloads and lighten the cognitive load on teams responsible for highly detailed systems.

AI in Practice Through EverOps Client Success

Theory only goes so far. The real proof is in the projects and client success stories. Throughout the interview, Jose shares examples that showcase both the complexity and the benefits of practical AI, calling out a few client stories in particular.

At Life360, for example, EverOps introduced “Victor,” an AI-powered developer support system. The company had tried a previous AI tool and abandoned it after disappointing results, an experience that ultimately shaped the development of Victor. Today, Victor accelerates problem-solving and boosts engineering productivity across teams.

As Jose mentioned, “It isn’t just about using AI…but focusing it on measurable outcomes.”

This example underscores how AI is being adopted within EverOps environments. Tools are not deployed to chase trends but to deliver practical value, reduce friction in daily engineering work, and help teams operate with greater clarity and efficiency. Jose also mentions welcoming the opportunity to learn from failures, seeing them as crucial to lasting improvement.

A Word of Caution on AI Use from the Field

For all the interest surrounding AI in infrastructure, Jose is quick to emphasize that meaningful adoption requires more than curiosity or ambition. It calls for clear-eyed decision-making, well-defined safeguards, and a willingness to move slowly when needed.

“You don’t need to give up discipline for innovation,” he states. “Acknowledge the risks, put strong controls and monitoring in place, and scale up what works.”

This mindset reflects years of working in production environments where stability is essential and small errors can carry real consequences. At EverOps, any AI implementation is preceded by rigorous testing, scoped carefully to avoid unintended disruption, and evaluated not just on potential but on actual outcomes.

Jose’s guidance is not about resisting innovation, but about being intentional. He encourages teams to build systems of oversight, stay transparent with stakeholders, and only scale what is proven to deliver value. In his view, AI does not remove the need for engineering judgment. It raises the bar for how thoughtfully that judgment must be applied.

Looking Ahead with Purpose and People in Mind

At the end of the day, Jose sees the future of AI in DevOps not as a path toward replacing expertise but as a way to expand the capabilities of every engineer. 

He explains that “AI raises the baseline for skill,” adding, “Someone familiar with a given tech stack can move faster with fewer mistakes. It is not about replacing deep knowledge, but helping every team member be more productive.” This is especially meaningful for smaller teams, where the right tools can amplify impact and allow limited resources to achieve much more.

What excites him most is the speed of progress. AI has already become easier to deploy and less expensive to experiment with than it was a year ago. The direction of travel suggests that fully customized AI solutions will soon deliver measurable returns, giving organizations the ability to move beyond basic automation and into new product development and operational innovation.

Even with these advances, he insists that technology must always serve people. Automation should strengthen engineers, not diminish their roles. Partnerships must be designed around shared outcomes, not surface-level agreements. EverOps’ TechPod model reflects this belief, combining multi-cloud fluency and AI capabilities with an unwavering focus on client needs and lasting value.

For Jose, the future of DevOps will not be written by marketing slogans or the latest “hype cycle.” It will be defined by professionals who pair innovation with discipline, who understand when to push boundaries and when to hold steady. The work is demanding, but when approached with integrity, the rewards are significant for both engineers and the organizations they support.

Turn AI Potential into Proven Outcomes with EverOps

If your organization is considering how to bring AI into DevOps, or if you need a team that knows how to turn technical ambition into measurable results, EverOps is ready to support you. 

Since 2012, EverOps has helped companies improve DevOps, ITOps, and SecOps performance by combining strategic insight with hands-on execution. We are fully equipped to help with AI and machine learning integration, developer productivity improvements, cost control across multi-cloud environments, infrastructure modernization, and security enhancement.

If you are ready to explore AIOps or want a clear assessment of your current systems, our team offers focused checkups that identify issues and provide actionable roadmaps. Whether your priority is efficiency, reliability, or innovation, we are prepared to guide the next step.

Contact us today to schedule a conversation and begin building toward stronger operations with AI and cloud expertise that delivers lasting results.

Frequently Asked Questions 

What are the main risks of using AI in infrastructure?

Jose identified three consistent concerns during the interview: reliability, data protection, and cost. AI models can make mistakes with confidence, and privacy safeguards are not always trusted, even under enterprise agreements. Unpredictable pricing also makes some organizations cautious about large-scale use.

What does the future of AI in DevOps look like?

Jose expects AI tools to become easier to deploy, cheaper to experiment with, and more customizable. Over time, this will raise the baseline for all engineers, allowing even small teams to work with greater speed, accuracy, and confidence.

What’s next for AI in DevOps, according to Jose Mercado?

Jose anticipates stronger, embedded AI that helps engineers, automates key operational tasks, and boosts team output. With AI handling routine issues, less time is lost to simple problems and more is spent building new technology.

How fast can AI make an impact in IT operations?

With clear use cases, such as the Victor developer support system referenced above, results can appear within days or weeks, freeing up critical talent and driving productivity.

What makes EverOps unique in the AI and cloud space?

EverOps stands out for its real-world expertise across AWS, GCP, and Azure, and for embedding experienced multi-cloud engineers in ongoing partnerships. That approach grounds progress in your actual environment rather than in abstractions.

How does EverOps personalize cloud and AI solutions?

EverOps works directly within your current workflows as a true collaborator, not an outside advisor. Each recommendation fits your current situation and long-term vision.

What makes EverOps different from larger consultancies?

While big consulting firms often divide expertise across separate teams, EverOps engineers typically have hands-on experience across AWS, Azure, and Google Cloud. This multi-cloud fluency shows up directly in client work, giving teams practical flexibility and faster results.