
TL;DR:

Here’s something we don’t talk about enough: technology doesn’t run your business. People do.

But if you’ve ever felt like AI is directing you instead of the other way around, you’re not alone. We recently had a fireside chat with Amy Bittle, founder and CEO of Amy Bittle Consulting, to talk about what it really means to govern AI in the real world. There are plenty of theoretical frameworks around data governance for AI, but realistically, what should companies actually do to ensure their AI implementation plans meet compliance and governance demands? Below, we look at some of the key challenges organizations face today and how a strategic AI governance framework can address them.

In this piece, we’ll address the challenges with AI governance Amy highlights as pressing issues that need attention before organizations can safely implement AI in their operations. We’ll also explore her approach to solving them.

Most AI governance conversations start with theory and frameworks.


Amy Bittle starts with a practical assessment of the organization’s readiness for AI, the culture, and how teams work with AI. Amy’s perspective is refreshing because she doesn’t treat governance like a rigid framework. Instead, she encourages organizations to focus on building accountability into their daily operations, ensuring teams know who owns what data, how it should be used, and where it lives across systems. Governance should work like any other operational process: consistent, predictable, and embedded into how teams actually work.

And after years of consulting work, she’s seen firsthand what happens when data quality problems collide with AI adoption.

Spoiler: it’s not pretty.

Let’s take a look at some of the key challenges with AI governance.

What happens when two amazing cooks follow different variations of a recipe? Chaos. Obviously. Amy uses an analogy that hits home. Imagine you’re in your kitchen trying to cook from three different recipes at once. You’ve got ingredients everywhere, instructions competing for attention. The reality? You can’t possibly follow all three. You have to pick one recipe and stick with it. Otherwise, it’s just chaos and your outcome is a dish that doesn’t really belong anywhere.

That’s what happens in organizations every single day. Two brilliant departments, both working hard, both trying to solve the same problem in multiple ways, but each working from different data sets. That’s not innovation. That’s chaos.

“When governance is built into your business, AI becomes something that you direct, not something that directs you,” Amy explained. And that’s the shift leaders need to make: treating AI governance not as red tape but as a rhythm, a culture where everyone is aligned to the same goal and cooking from the same recipe.


If you’ve been in tech or compliance for a while, you’ve probably lived through traditional data governance. That’s the world of access controls, permissions, and making sure the right people can see the right data. Important stuff, absolutely.

But AI governance adds a new layer: fairness, accountability, and transparency. This is where frameworks like NIST’s AI Risk Management Framework and ISO 42001 come in, helping organizations build clarity into how they use AI tools and helping teams understand what their systems are actually doing.

Here’s a scenario Amy shared that probably sounds familiar – and perhaps where the implementation of AI governance would make a massive difference to business objectives as well as to organizational security.

An HR team starts using an AI tool to screen resumes. It’s efficient. It saves time. But here’s the question nobody asks: What data is teaching this system? What story is it telling about candidates? Is this tool an approved technology that meets standards and compliance guidelines?

When teams start asking these questions, governance becomes part and parcel of the culture instead of red tape. Because most of the time, the real governance problems aren’t technical at all; they’re human. People save files locally. They make ad hoc updates. They use five different systems to do the same job. Governance creates the guardrails that ensure everyone accesses the right data, understands their responsibility for it, and knows how to handle it properly.

Let’s talk about something every sales and HR team deals with: messy CRMs.

You’ve got Robert Smith in your system. Then Bob Smith. Then R. Smith Jr. Maybe even Bob R. Smith. Same person, four different records. It’s funny until you realize those duplicates are costing you real money in missed follow-ups, double billing, and wasted effort. If your AI systems are learning from data that thinks Bob Smith is four different people, what decisions are they making? What patterns are they finding that don’t actually exist?


This is where a secure, on-premises data quality solution like WinPure with its entity resolution capability can be used to identify and merge duplicate records across systems. It uses sophisticated matching logic to understand that Robert Smith and Bob R. Smith are the same person, even when the data is inconsistent. And when you clean up these duplicates, you’re not just tidying up your database. You’re giving your teams and your AI systems a single, accurate view of customers, vendors, and operations.
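WinPure’s actual matching logic is proprietary, but the general idea behind entity resolution can be sketched with Python’s standard library: normalize each name, expand common nicknames, strip suffixes and initials, then compare the results for similarity. The nickname map, suffix list, and threshold below are illustrative assumptions, not WinPure’s rules:

```python
# Illustrative sketch of entity-resolution matching using only the stdlib.
from difflib import SequenceMatcher

# Hypothetical nickname map; a real system would use a far larger dictionary.
NICKNAMES = {"bob": "robert", "bobby": "robert", "rob": "robert"}
SUFFIXES = {"jr", "sr", "ii", "iii"}

def normalize(name: str) -> str:
    """Lowercase, drop suffixes and single-letter initials, expand nicknames."""
    tokens = [t.strip(".").lower() for t in name.split()]
    tokens = [NICKNAMES.get(t, t) for t in tokens if t not in SUFFIXES]
    tokens = [t for t in tokens if len(t) > 1]  # drop middle initials
    return " ".join(tokens)

def same_person(a: str, b: str, threshold: float = 0.85) -> bool:
    """True if the normalized names are similar enough to merge."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

records = ["Robert Smith", "Bob Smith", "R. Smith Jr.", "Bob R. Smith"]
# "R. Smith Jr." falls short here: initial-only records need extra
# first-initial logic that this sketch deliberately omits.
matches = [r for r in records if same_person("Robert Smith", r)]
```

In practice, production matchers layer many more signals (phonetic encodings, addresses, emails) on top of string similarity, which is why initial-only records like “R. Smith Jr.” need dedicated handling.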

Resolve Complex Duplicates with Confidence!

WinPure’s on-premises entity resolution identifies and merges duplicate records across systems. Get a single, accurate view of every customer and vendor.

Amy pushes organizations to go deeper than just cleanup. “Don’t just stop at the cleanup process. Look at the data. How did things actually get messy?” she said. Because when you understand why your data got messy (usually a result of no standards, no ownership, no accountability), you can build processes that keep it clean long-term.

That’s the difference between a one-time data cleaning project and sustainable data quality management for AI systems.
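One way to make data quality sustainable is to encode standards as standing validation rules that run on every new record, rather than as a one-off cleanup. The field names and rules below are hypothetical examples, not a prescribed standard:

```python
# Minimal sketch of standing data-quality rules: each rule returns True
# when the field passes, and validate() reports which rules a record breaks.
import re

RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "owner": lambda v: bool(v and v.strip()),  # every record needs an accountable owner
    "phone": lambda v: bool(re.fullmatch(r"\+?\d{10,15}", re.sub(r"[\s\-()]", "", v or ""))),
}

def validate(record: dict) -> list[str]:
    """Return the names of the rules a record violates."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

record = {"email": "Bob.Smith@Example.com", "owner": "", "phone": "(555) 123-4567"}
violations = validate(record)
```

Running such checks on every insert, and routing violations back to a named owner, is what turns a cleanup project into an ongoing process.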

Here’s something that catches people off guard: privacy laws like GDPR, HIPAA, CCPA, and others all say the same thing: data belongs to people, not organizations.

For businesses, this means you can’t just collect data and do whatever you want with it. You need to know what data you have, where it lives, who can access it, and how it’s being used. This is foundational to the enterprise data governance that AI systems require.

Amy’s advice? “Make compliance a rhythm, not a rescue mission.” Instead of scrambling when an audit comes up, build regular reviews into your operations. Know your data. Understand your risks. Document your processes.

When organizations treat compliance as something that happens quarterly during panic mode, they’re always playing catch-up. When they build it into their operational rhythm, it becomes part of how they work, not something that interrupts the work.

In many organizations, data quality and AI governance are handled by completely different teams. Operations over here. IT over there. Legal in another building entirely.

Amy’s take? “You have to come together because you have to understand, at a high level, what everyone does and what everyone’s responsibility is.”

When teams understand what everyone else is responsible for, they can build integrated processes instead of duplicative efforts. They can create a unified plan instead of five departments working at cross-purposes.

This is especially critical for data preparation for AI systems. If your IT team is cleaning data one way, your operations team is using data differently, and your compliance team is documenting something else entirely, your AI models are learning from inconsistency.

Here’s one of the biggest governance gaps Amy sees right now: leadership doesn’t know what AI tools their teams are using.

Employees are using AI for prompts, analysis, content creation, all kinds of tasks. But nobody told the CIO. Nobody ran it through security. Nobody checked if it complies with data protection policies.

Amy shared a story that probably sounds familiar: “I’ve had emails from senior figures in organizations saying, ‘We don’t use your product,’ when they do, but they don’t know they do.”

This is the shadow IT problem, except now it’s shadow AI. And it’s creating serious risks around data security, compliance, and system integrity.

The solution isn’t banning AI tools but to get everyone on the same page. Have the conversation. Understand what tools people want to use and why. Create a process for evaluating and approving tools so you don’t have dozens of unauthorized systems running on your network.

When people understand why governance matters, they’re more likely to follow the rules. When they’re just told to do something without context, it becomes a chore.

Amy’s closing thought stuck with us: “When your systems are steady, your confidence returns.”

Governance isn’t a group of compliance police coming to knock you down. It’s your friend. It’s the structure that lets you lead with confidence instead of constantly wondering if something’s about to break.

At Amy Bittle Consulting, she combines cybersecurity, compliance, and data protection to help organizations build confidence into their systems in a way that works in real life, not just on paper for audits.

And here’s the thing about the master data management that AI governance depends on: when you trust your data, you make better decisions. When your teams are working from the same information, they move faster. When your AI systems are learning from clean, consistent data, they deliver more reliable insights.

This isn’t about perfection. It’s about direction. It’s about choosing to take control instead of letting technology run wild through your organization.

You can have the best frameworks in the world. You can document every process and policy. But if your data is a mess, your AI governance strategy is built on sand.

Clean data isn’t just about compliance checkboxes. It’s about operational confidence. It’s about knowing that when your team makes a decision based on AI insights, those insights are coming from information you can trust.

If you’re looking at your CRM right now and seeing duplicate records, inconsistent formatting, or gaps in critical information, that’s not just a data quality problem. That’s an AI governance problem waiting to happen.

Watch the Full AI Governance Webinar Here 👇

This article is based on insights from our webinar “Taking Back Control: How to Govern AI in the Real World” featuring Amy Bittle, founder and CEO of Amy Bittle Consulting.

Want to talk about how data quality fits into your AI governance strategy?

We’re here to have that conversation. Get in touch with us via the form below and let us know how we can help make your data turn into a trusted asset for better AI governance outcomes.

 

Authors

  • farah: Author

    Farah Kim is a human-centric product marketer and specializes in simplifying complex information into actionable insights for the WinPure audience. She holds a BS degree in Computer Science, followed by two post-grad degrees specializing in Linguistics and Media Communications. She works with the WinPure team to create awareness on a no-code solution for solving complex tasks like data matching, entity resolution and Master Data Management.

  • amybittle: Reviewer

    AI Governance Specialist & Strategic Advisor

    Amy Bittle is a leading voice in AI governance and risk management, helping organizations align emerging technologies with ethical, legal, and operational standards. With a background in regulatory compliance and enterprise tech strategy, she advises both public and private sector teams on how to build trust into AI systems—from data governance frameworks to responsible deployment.

Start Your 30-Day Trial!

Secure desktop tool.
No credit card required.

  • Match & deduplicate records
  • Clean and standardize data
  • Use Entity AI deduplication
  • View data patterns

  • ... and much more!