As organizations implement new facility management platforms, consolidate existing systems, and try to embrace AI, many are focused on technical integration while missing a fundamental requirement: establishing a shared language for how their organization thinks about data.

Teams across departments might use the same words but mean completely different things. For example, “data quality” has one meaning for IT, another for facilities, and a third for finance. When definitions depend on departments, the overall organization lacks the cohesive understanding that pulls people, processes, and technology together.

In a recent episode of the Workplace Innovator podcast, “What Lies Ahead? — AI’s Role in Solving Key Challenges in Facility Management,” Dean Stanberry, SFP, CFM, described AI’s potential for predictive maintenance, energy management, and space optimization. But he also shared a warning: “AI is all about analyzing data of all different kinds, and if you feed it bad data, you’re not going to get a good response.”

Key takeaways

  • Organizations implementing new technology platforms often focus on technical integration while overlooking the critical need for a unified data language that ensures everyone across departments interprets information consistently
  • Data quality directly determines the reliability of AI-powered insights and automated decision-making systems, making governance frameworks and quality controls prerequisites rather than optional enhancements
  • Building data literacy across all organizational levels transforms technology investments from expensive dashboards that nobody trusts into strategic tools that drive confident operational decisions

The solution requires disciplined architecture, strong governance, quality controls, and day-to-day management. When these elements are in place, the same tools and integrations start producing data that organizations can use, define, improve, and trust.

Building your data vocabulary: foundation and structure

Before diving into implementation checklists and governance frameworks, your organization needs a shared understanding of what these data terms mean.

Foundation layer: architecture and organization

At the base are the structural elements that define how your data is organized and stored.

  • Data architecture: The specification system that defines what each field means, how it’s measured, and where it goes. Data architecture forms the foundation layer upon which everything else rests
  • Data standards: The building codes that ensure everything is built to specification. These are the agreed-upon formats, naming conventions, and rules that enable true interoperability
  • Data dictionary: The universal translator ensuring everyone speaks the same language. Your data dictionary defines every term, every field, and every code used across your systems. Without this shared reference, “Building A” in one system might be “Facility 001” in another, and “Main Campus North” in a third
  • Data warehousing / data lakes: Centralized storage systems for aggregating information from multiple sources. However, these are only useful when everyone agrees on what the data represents

Also included are data models, the structured representation of your data elements, their relationships, and the rules governing them. Effective data models must align with industry standards, like those being developed by Building LABS, a Stanberry and Autodesk initiative, and support seamless integration of information across all building lifecycle phases.
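To make the data dictionary concrete, here’s a minimal sketch of a canonical lookup table in Python. The local building names come from the example above; the canonical ID scheme (“BLDG-A”) and the function are hypothetical, not part of any real schema.

```python
# A data dictionary in miniature: map each system's local name for a
# building to one canonical identifier, so "Building A", "Facility 001",
# and "Main Campus North" stop being three different things.
CANONICAL_BUILDINGS = {
    "Building A": "BLDG-A",         # name used in the CMMS
    "Facility 001": "BLDG-A",       # name used in the lease system
    "Main Campus North": "BLDG-A",  # name used in the space planning tool
}

def canonical_id(local_name: str) -> str:
    """Return the canonical building ID, or fail loudly if unmapped."""
    try:
        return CANONICAL_BUILDINGS[local_name]
    except KeyError:
        raise ValueError(f"Unmapped building name: {local_name!r}")

print(canonical_id("Facility 001"))  # BLDG-A
```

Failing loudly on an unmapped name is deliberate: a silent pass-through is exactly how “Facility 001” ends up counted as a separate building downstream.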

Without these foundational elements in place, everything built on top becomes unstable.

Governance and roles: who owns what

Data governance is the regulatory body that establishes policies for what must be done. It functions as your organization’s building codes for data, defining who can create data, who can modify it, what format it must follow, and what to do when there are conflicts.

A common issue is when organizations treat governance as paperwork instead of operational discipline. They create governance documents that sit in SharePoint folders while the actual work continues with no standards, no consistency, and no accountability.

A data steward owns the standards, resolves conflicts, ensures consistency, and acts as the bridge between technical implementation and business requirements. A data custodian is anyone who enters, updates, or maintains data in your systems.

For a facility manager, your entire frontline staff are data custodians, whether they know it or not. Every technician generating work orders, every space planner updating occupancy information, and every facilities coordinator logging asset information is a custodian.

Data management is the full lifecycle approach from creation through archiving. In facility management, teams should be meticulous about asset lifecycle management, tracking when assets are acquired, maintained, upgraded, and retired. Teams budget for an asset’s entire lifetime and plan for its eventual replacement.

The same level of discipline is required for the data we use to make important decisions. When it comes to implementation, you’re not just moving data once. You’re establishing how it will be managed forever: who maintains it, who updates it, how long you keep it, when you archive it, and what happens when it conflicts with other sources.
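As a sketch of what one lifecycle rule looks like in practice, the snippet below decides whether a closed record should be kept active, archived, or purged. The thresholds are hypothetical policy values for illustration, not recommendations.

```python
from datetime import date, timedelta

# Hypothetical retention policy: archive closed records after a year,
# purge them after seven. Your governance body sets the real numbers.
ARCHIVE_AFTER = timedelta(days=365)
PURGE_AFTER = timedelta(days=365 * 7)

def retention_action(closed_on: date, today: date) -> str:
    """Return 'keep', 'archive', or 'purge' based on record age."""
    age = today - closed_on
    if age >= PURGE_AFTER:
        return "purge"
    if age >= ARCHIVE_AFTER:
        return "archive"
    return "keep"

print(retention_action(date(2020, 1, 15), date(2025, 1, 15)))  # archive
```

The point is less the thresholds than the fact that the rule is written down and executable, so “how long do we keep it?” has one answer instead of one per department.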

When these roles are formalized with real authority and accountability, data quality stops being an aspiration and becomes an operational reality.

Quality control: keeping data clean

Prevention and monitoring are essential to maintaining data integrity and include the following practices:

Data validation is the security checkpoint ensuring information has proper credentials before entering the system. It stops bad data at the point of entry, and the principle is simple: it’s far cheaper to prevent bad data from entering than to clean it up later.

Validation in practice can mean required fields that cannot be skipped, dropdown menus instead of free text, and format checks that reject invalid entries.
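A minimal sketch of what that point-of-entry validation might look like, assuming a hypothetical work-order schema (the field names, priorities, and tag format are illustrative only):

```python
import re

# Hypothetical schema rules: which fields are required, which values a
# "dropdown" would allow, and what format an asset tag must follow.
ALLOWED_PRIORITIES = {"low", "medium", "high"}
ASSET_TAG_FORMAT = re.compile(r"^AST-\d{5}$")

def validate_work_order(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means it passes."""
    errors = []
    for field in ("asset_tag", "priority", "description"):
        if not record.get(field):
            errors.append(f"missing required field: {field}")
    if record.get("priority") and record["priority"] not in ALLOWED_PRIORITIES:
        errors.append(f"priority must be one of {sorted(ALLOWED_PRIORITIES)}")
    if record.get("asset_tag") and not ASSET_TAG_FORMAT.match(record["asset_tag"]):
        errors.append("asset_tag must match AST-#####")
    return errors
```

A record that fails any check never reaches the database, which is the whole point: the cleanup happens at the door, not in next quarter’s audit.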

Data quality is the ongoing measurement of accuracy, completeness, consistency, and timeliness.

Here’s the critical question for this concept: If your occupancy data is 70% accurate, are you comfortable making real estate decisions based on it? What about capital planning? Renovation priorities?

We measure equipment reliability in facility management. For example, we track mean time between failures and monitor performance degradation. We need to measure data reliability with the same rigor, because the decisions riding on that data are as critical as the assets themselves.
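As one example of treating data reliability like an equipment KPI, here’s a sketch of a completeness metric, one of the four quality dimensions above. The occupancy records and field names are hypothetical:

```python
# Completeness: the fraction of records where every required field is
# populated. Track it over time the way you'd track MTBF for an asset.
def completeness(records: list[dict], required: tuple[str, ...]) -> float:
    """Return the share of records with all required fields filled in."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records if all(r.get(f) not in (None, "") for f in required)
    )
    return complete / len(records)

occupancy = [
    {"space_id": "3F-101", "headcount": 12},
    {"space_id": "3F-102", "headcount": None},  # sensor offline
    {"space_id": "3F-103", "headcount": 9},
]
print(completeness(occupancy, ("space_id", "headcount")))  # ~0.67
```

Accuracy, consistency, and timeliness can be scored the same way; the value is in putting a number on the dashboard so “how good is our data?” stops being a feeling.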

Data auditing is the systematic inspection that finds drift before it becomes a crisis. Humans make mistakes, which is why auditing cannot be a one-time event.

Make sure the organization can answer the following questions:

  • What gets audited?
  • Are teams following standards?
  • Where are the gaps? What’s changing without documentation?
  • Who’s updating records and why?

Without audit trails, you can’t improve processes, hold anyone accountable, or maintain trust.
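A minimal sketch of an audit trail that can answer those questions. The record structure and field names are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# An append-only log entry: who changed what, from what, to what, and why.
@dataclass
class AuditEntry:
    record_id: str
    changed_by: str
    old_value: str
    new_value: str
    reason: str
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

audit_log: list[AuditEntry] = []

def update_asset_status(record_id: str, old: str, new: str,
                        user: str, reason: str) -> None:
    """Record the change before applying it to the system of record."""
    audit_log.append(AuditEntry(record_id, user, old, new, reason))
    # ...apply the change to the system of record here...
```

Because entries are appended and never edited, the log can show who is updating records and why, and flag anything changing without documentation.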

The mantra here has to be “Hope is not a data strategy.”

Together, these quality controls create a system where bad data gets stopped at the gate instead of discovered during critical decisions.

Remember, systems drift. Integrations break. Standards slip. The only way to maintain data quality over time is to watch it continuously and catch problems early, before they compound. This becomes even more critical during system consolidation, when you’re combining data with different standards, different definitions, and different quality levels.

Understanding and action: making data work

Once your data is well-governed and high-quality, these capabilities turn it into organizational advantage:

  • Data lineage: The chain of custody tracing data back to its single source of truth. When a dashboard shows space utilization at 73%, can you trace that number back to its source?
  • Data analytics: Studying patterns to find hidden insights for smarter decisions. This goes beyond reporting: it’s the systematic examination of data to find patterns and drive decisions. In facility management, this means analyzing occupancy trends to optimize space allocation, examining energy consumption patterns to identify efficiency opportunities, and studying maintenance cycles to predict equipment failures before they happen
  • Data literacy: A critical challenge because commercial real estate was among the last industries to go digital, leaving technology skills low across all organizational levels

If your teams don’t know when to trust data, when to question it, or how to explain it to executives who control the budget, actionable insights start to disappear.
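To show what traceability can look like in practice, here’s a sketch where a derived metric carries its lineage with it, so that 73% on a dashboard can be walked back to its inputs. The source-system names are hypothetical:

```python
# A derived metric that records where its inputs came from, so the
# number on the dashboard is never an orphan.
def utilization(occupied_seats: int, total_seats: int) -> dict:
    """Return the metric together with its lineage."""
    return {
        "metric": "space_utilization",
        "value": round(occupied_seats / total_seats, 2),
        "lineage": {
            "occupied_seats": {"source": "badge_system", "value": occupied_seats},
            "total_seats": {"source": "space_inventory", "value": total_seats},
        },
    }

print(utilization(73, 100)["value"])  # 0.73
```

Real lineage tooling is far richer than this, but the principle scales: every derived number keeps a pointer to the data it was computed from.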

How to prepare your organization for system implementation

Create quality controls, then trust the data they produce. Build literacy, then deploy analytics that require it.

Implementation checklist

  • Socialize core data terms across the organization so every department interprets them consistently
  • Assign formal data stewards and custodians with real authority and accountability
  • Build validation into every data entry point
  • Establish ongoing auditing practices to catch drift early
  • Invest in data literacy training for users at every level
  • Require data lineage documentation for critical datasets
Get the specifications right, build the quality controls, establish the standards, and then your integrated systems become truly interoperable ones.

Master your data to unlock AI

Whether you’re just starting to build data literacy or have governance frameworks in place, there’s always room to strengthen your foundation. Organizations that establish shared data vocabulary don’t just avoid implementation failures; they unlock AI’s full potential to make confident decisions based on trusted information.

Ready to learn more about AI in facility operations? Get the guide on AI in the modern workplace.

Frequently asked questions

  • What is the most common mistake organizations make when implementing new facility management systems?

    Organizations typically prioritize technical integration, connecting APIs and displaying dashboards, while neglecting to establish a shared data vocabulary and governance framework. This results in systems that work technically but produce data nobody trusts because teams lack agreement on what the information means or where it originated.

  • How does poor data quality specifically impact AI performance?

    AI systems analyze patterns in data to generate insights and predictions. When fed inconsistent, inaccurate, or poorly defined data, AI produces unreliable outputs that can lead to flawed decisions. Quality data inputs are essential for AI to deliver meaningful results in areas like predictive maintenance, energy management, and space optimization.

  • What's the difference between a data steward and a data custodian?

    A data steward owns the standards, resolves conflicts, ensures consistency, and bridges technical implementation with business requirements. A data custodian is anyone who enters, updates, or maintains data in systems, such as technicians generating work orders or coordinators logging asset information. Both roles are essential for maintaining data integrity.

  • How can organizations measure whether their data is actually usable for important decisions?

    Organizations should systematically measure data accuracy, completeness, consistency, and timeliness. Ask critical questions like: “If our occupancy data is 70% accurate, are we comfortable making real estate decisions based on it?” Establish regular data quality reviews, automated alerts for data drift, and audit trails to maintain trust over time.

  • What practical steps should we take before implementing new integrated systems?

    Start by socializing core data terms across your organization, assign formal data stewards and custodians with real authority, build validation into every data entry point, establish ongoing auditing practices, invest in data literacy training for all users, and require data lineage documentation for critical datasets. These foundational elements prevent implementation failures and unlock system potential.

By Jonathan Davis

As a content creator at Eptura, Jonathan Davis covers asset management, maintenance software, and SaaS solutions, delivering thought leadership with actionable insights across industries such as fleet, manufacturing, healthcare, and hospitality. Jonathan’s writing focuses on topics to help enterprises optimize their operations, including building lifecycle management, digital twins, BIM for facility management, and preventive and predictive maintenance strategies. With a master's degree in journalism and a diverse background that includes writing textbooks, editing video game dialogue, and teaching English as a foreign language, Jonathan brings a versatile perspective to his content creation.