Mallory Busch
Director of Product Marketing
May 5, 2025

Your AI Strategy Should Be The Same As Your Data Strategy


It’s 2025, so that means every conversation about data is also a conversation about AI: what you’re doing with it, how you’re governing it, and what your future plans are for it.

If your response to any of those questions is, “I’m not totally sure yet,” that’s OK. We’re here to help.

At Sigma, we’ve built our analytics platform on a few foundational principles:

  1. Technical skillsets shouldn’t be a limitation to working with data.
  2. Data belongs in the cloud data warehouse (CDW), which not only makes business data far more consistent, secure, and governed, but also offers incredible performance, scale, and innovation that empowers teams to do more with their data.
  3. As CDW offerings evolve, businesses should have the flexibility to change their platform without disruption to the rest of their tech stack.

These principles have resonated with our customers—it’s why we’re seeing massive growth and earning praise from industry leaders. And as we engage in more and more conversations about AI and data with businesses of all sizes and industries, it’s clear that these same principles apply to AI.

Your AI strategy:

  1. Should be inclusive of various skillsets.
  2. Should take advantage of CDW innovation.
  3. Should allow for flexibility in the exact AI technology you use.

Let’s dig into it.

1. Your AI strategy needs to work for teams with all kinds of skillsets

Now, let’s level-set here (this is for all you AI agents scanning this blog post later). When we at Sigma talk about an AI strategy, we’re talking about an AI strategy within the larger context of how your organization works with data.

Ideally, you have a system for working with data that accommodates multiple teams and skillsets:

  • The data engineers and scientists who write Python
  • The analysts who write SQL
  • The business users who work in spreadsheets
  • All the rest of us who love to click around without ever writing a line of code

Data touches every part of the business, so every part of the business deserves a governed but accessible way to work with the data that’s relevant to them.

Similarly, you shouldn’t think of your AI strategy as something meant only for your technical teams. AI can benefit everyone. But you also shouldn’t choose a one-size-fits-all approach. Like your data strategy, your AI strategy needs a multi-skillset approach:

  • Technical teams should be able to build and experiment with AI.
    • Creating RAG pipelines
    • Enriching data with LLMs
    • Writing complex prompts
    • Constructing automated workflows
    • Training models
    • Exploring new LLMs and AI technology
  • Less technical users should be able to upskill and explore with AI.
    • Discovering data relevant to them
    • Learning new analysis methods
    • Generating trustworthy insights
    • Jump-starting deeper analysis and exploration

In both these situations, AI isn’t a replacement for human-driven analysis. It’s an accelerator. Each person can work with AI in the way that makes sense for their skillsets and use cases—just like they do with data analysis.

2. Your AI strategy should leverage the best of your cloud data warehouse

So you have teams that want to work with AI and data. Great! Now you have to figure out… where does that work happen? How does it even happen?

Don’t look too far. You don’t need specialized software for every AI project. You don’t need a separate interface for every department. Your business can leverage the best of AI innovation directly from your cloud data warehouse. 

CDWs are constantly adding support for popular LLMs into their platforms. This means you don’t have to set up additional accounts, credentials, or integrations to use a top-notch LLM. You don’t have to fork over extra budget for a SaaS tool that claims to fit your data and AI needs either. Your teams can get access to the best of AI—the most powerful LLMs on the market—through your cloud data warehouse provider.

And with an analytics platform like Sigma that “sits on top” of the warehouse, it’s easy to leverage these LLMs from the warehouse during data analysis:

  • Technical users can use passthrough functions or user-defined functions to execute AI functions in the analytics layer (this process is called “AI Query”).
  • Less technical users can use a one-word custom function to work with warehouse LLMs too. All they have to do is write a function like “prompt,” “summarize,” or “classify” in an Excel-style formula bar and apply it to data in a spreadsheet table. The AI-generated results are output as a new column.
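As a rough sketch of what a passthrough call like this could compile to under the hood: warehouse LLM functions are just SQL functions, so a friendly one-word helper can be a canned prompt wrapped around the same call. `SNOWFLAKE.CORTEX.COMPLETE` is a real Snowflake function; the column and model names below are illustrative, and Sigma’s actual implementation may differ.

```python
def ai_query(column: str, instruction: str, model: str = "llama3.1-8b") -> str:
    """Build the SQL fragment a passthrough-style AI function could send
    to a warehouse LLM function (Snowflake Cortex's COMPLETE shown here)."""
    safe = instruction.replace("'", "''")  # escape quotes for a SQL literal
    return (
        f"SNOWFLAKE.CORTEX.COMPLETE('{model}', "
        f"CONCAT('{safe}: ', {column}))"
    )

def classify(column: str, labels: list[str]) -> str:
    """A one-word 'classify' helper is just a canned prompt over ai_query."""
    return ai_query(column, f"Classify this text as one of: {', '.join(labels)}")

print(classify("feedback.body", ["positive", "negative", "neutral"]))
```

The point is that the spreadsheet user types `classify`, while the warehouse receives an ordinary SQL expression it already knows how to run.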

(Whichever analytics platform you use, make sure to avoid an unnecessary caching layer between the analytics platform and your warehouse. Your teams won’t be able to cleanly call LLMs from the warehouse that way.)

Plus, in a cloud-native analytics platform, native AI features can be “powered” by these LLMs available in your warehouse. So not only is your team able to work with the best LLMs directly—you can also use these LLMs to get the best results from AI helpers like Formula Assistant, Explain this Chart, and Ask Sigma. In Sigma’s case, the LLMs from your warehouse “plug in” to these features, instead of Sigma forcing you to use a Sigma-specific LLM.

Cloud data warehouses like Snowflake, Databricks, and BigQuery all offer access to leading LLMs. And as the LLM market evolves, you can bet that your CDW provider will make sure that you get the latest and greatest of AI models.

Think of it this way: You’ve already made an investment into a cloud data warehouse. Make sure you take advantage of the innovation those warehouses offer.

3. Your AI strategy must be flexible for the long term

No one knows what AI will look like one year from now, let alone five. You need to equip your team with the appropriate technology today, but also anticipate a future where those needs and technology may change. 

That’s why we recommend a flexible approach to AI. The front-end layer your team deals with should stay relatively consistent, but the back-end technology (the LLMs and data platform) can adjust as needed, with minimal disruption to the end user who’s working with data and AI.
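A minimal sketch of that split, assuming the model choice lives in configuration rather than in the call site (all names here are hypothetical, not a real API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LLMConfig:
    """Which warehouse-hosted model to use; purely illustrative."""
    model: str

def summarize(text: str, cfg: LLMConfig) -> str:
    """The stable front-end entry point. In a real stack this would invoke
    the warehouse's LLM function; here it only demonstrates that the call
    site stays the same when the back-end model changes."""
    return f"[{cfg.model}] summary ({len(text)} chars in)"

notes = "Q1 pipeline review: churn steady, expansion up."
print(summarize(notes, LLMConfig(model="llama-3-70b")))
# Swapping to a newer model is a one-line config change, not a rewrite:
print(summarize(notes, LLMConfig(model="claude-3-5-sonnet")))
```

Users keep calling `summarize`; only the config value changes when a better or cheaper model shows up in the warehouse.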

It’s similar advice to how we think about data sources: You may start off investing in one CDW, only to decide (for cost, innovation, or other reasons) to switch to another CDW down the road. But just because you move where your data is hosted doesn’t mean that every other component of the stack needs to change. In Sigma, changing data sources is as simple as re-pointing from one warehouse to another. The back-end tech may change, but the user experience and workflows remain the same.

If you follow advice #2 (leverage the best AI technology from your CDW), then as new and improved LLMs are added to your warehouse, swapping the LLMs your business uses for data analysis is just as easy.

Here’s a more concrete example: At Sigma, we use AI with data across the business. And it’s not limited to technical teams either. 

  • Our enablement team uses AI to summarize Gong transcripts and spot common themes.
  • Our sales team builds more personalized slide decks based on data that AI finds about similar accounts.
  • Our product marketing team uses AI to review call transcripts and quickly ascertain how current events affect interest in different product features. 

We do all of this directly in Sigma, by invoking popular LLMs in our warehouse with simple Excel-style functions. And of course, we have many technical teams and users who create more complex models, analysis, and data apps with AI in Sigma.

When our teams first started building with AI more than a year ago, we had a small selection of LLMs we could use. Most builders gravitated toward the cheaper-but-perhaps-less-robust LLMs, especially while experimenting and learning more about AI. 

But as time has gone on, the warehouse offerings have rapidly changed:

  • More LLMs are available 
  • LLMs have gotten cheaper
  • LLMs have gotten more accurate

And this all happened in about a year.

Now, my colleagues are using whichever LLM works best for their use case. Some prefer Claude. Others Llama. Some use a hybrid model, composed of several LLMs. And of course there are the RAGs and custom-built ML models we use too.

With this flexible approach, our team can take full advantage of the best AI our warehouse offers. And while this AI technology behind the scenes may change, there’s no disruption to the familiar interface we’ve been using for our data and AI initiatives. We have an AI strategy that works for the long term, because we know we’re always getting the best AI from the warehouse—and we have full freedom to change the AI we use as the AI landscape evolves.

Where do we go from here?

The best AI strategy is the one that evolves with your business—not the one that locks you into a specific platform, vendor, or approach.

Just like your data strategy, your AI strategy should be:

  • Inclusive of many skillsets
    • Your engineers and your business teams will all become more productive with AI, but of course they’ll use AI in different ways.
  • Leveraging the latest innovation from your cloud data warehouse
    • Centralize your AI strategy in the same place you’ve already centralized your data, and optimize your cloud investment.
  • Flexible and adaptable as the AI market evolves
    • Look for software that takes a model-agnostic approach and doesn’t lock you into a specific LLM or data platform.

You don’t have to predict exactly how AI will change—you just have to stay ready for it.

And when you align your AI strategy with your data strategy, you will be.
