I recently had an opportunity to run our BIChart assessment on a client’s production Tableau server. To my surprise, the Tableau semantic model disconnect was a bigger problem than I anticipated. While processing Tableau stats and metadata, the BIChart assessment surfaced and quantified something that naturally occurs with all self-service analytics: duplicate work. The action plan to rectify these issues was thankfully non-invasive. However, for large enterprises with many data assets, such an action plan would be far more complicated.
A common culprit for “multiple sources of truth” is duplicative work. Even sophisticated enterprises with robust data governance, catalogues, and promotion management suffer similar problems. Self-service analytics tools like Tableau provide tremendous flexibility and speed to create analytics assets. However, managing a balance of speed, quality, and governance across organizational units is easier said than done.
That is why we created our BIChart Assessment! We are arming partners and customers with tools to assist with migrations. A byproduct of our migration tools is deeper insight into problems hiding in plain sight, which makes them just as useful for Tableau server health checks.
The semantic disconnect issues I found first surfaced as data connection problems. In this article, I focus on Tableau connection findings and solutions. Here are the common deployment scenarios related to Tableau connections (a short inventory sketch follows the list):
Tableau Connection Types
1. Embedded Connection (in Workbook)
- Definition: The data connection is embedded inside the Tableau workbook (.twb or .twbx).
- Storage: Within the workbook file.
- Live or Extract: Can be either live or extract.
- Visibility: Not visible to others unless they open the workbook.
- Common Issues: Hard to govern, hard to update globally, duplicate logic across workbooks.
2. Published Data Source (PDS)
- Definition: A data source published as a standalone object to Tableau Server/Cloud.
- Live or Extract: Can be either live or extract.
- Access: Multiple workbooks can connect to the same published data source.
- Governance: Supports row-level security, data source certifications, and refresh scheduling.
- 2a. Published Live Connection
- Direct, real-time connection to the database.
- Used when data freshness is critical.
- Common for Snowflake, Redshift, SQL Server, etc.
- 2b. Published Extract Connection
- Great for performance, less ideal for real-time needs.
- Pre-aggregated or filtered data snapshot.
- Scheduled refreshes via Tableau Bridge or Server.
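If you want a quick inventory of how your workbooks split between embedded connections and published data sources, Tableau’s Metadata API (GraphQL) is one way to pull it. Below is a minimal sketch in Python; it assumes you already have a REST API auth token, the Metadata API is enabled on your server, and the exact GraphQL field names should be verified against your server version’s schema.

```python
import requests

# Assumptions: SERVER_URL and AUTH_TOKEN come from a prior REST API sign-in;
# the Metadata API must be enabled on the server.
SERVER_URL = "https://tableau.example.com"   # hypothetical server URL
AUTH_TOKEN = "<REST-API-auth-token>"

# GraphQL query: list workbooks with their embedded and upstream published sources.
# Field names reflect the public Metadata API schema, but verify them on your version.
QUERY = """
{
  workbooks {
    name
    embeddedDatasources { name }
    upstreamDatasources { name isCertified }
  }
}
"""

resp = requests.post(
    f"{SERVER_URL}/api/metadata/graphql",
    json={"query": QUERY},
    headers={"X-Tableau-Auth": AUTH_TOKEN},
)
resp.raise_for_status()

for wb in resp.json()["data"]["workbooks"]:
    embedded = [d["name"] for d in wb["embeddedDatasources"]]
    published = [d["name"] for d in wb["upstreamDatasources"]]
    print(f'{wb["name"]}: {len(embedded)} embedded, {len(published)} published upstream')
```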
Variations and Hybrids Create Varying Levels of Complexity
Starting from these standard connection types, deployment patterns vary from business to business. As we have learned, short-term solutions can sometimes result in long-term pain points:
Copied / Cloned Published Data Source
- Someone downloads a published data source and republishes it with edits.
- Results in duplicate published data sources with slightly different logic.
- Creates governance drift and versioning issues.
Workbook Connected to PDS + Embedded Calculations
- Hybrid: connects to a PDS but includes workbook-specific calculated fields or filters.
- Appears consistent, but introduces divergence from the original PDS.
- Hard to track downstream logic differences.
Multiple Published Versions of the Same Source
- E.g. sales_data_live, sales_data_v2, sales_data_live_2025, etc.
- Often created by well-meaning analysts who want their own variant.
- Results in sprawl and inconsistency (a small detection sketch follows this list).
Data Source Aliases or Repointed Connections
- Same underlying database, but with different naming, connection strings, or schemas.
- E.g., one user connects to db.sales and another to db.sales_view.
- Creates semantic mismatches in calculated fields and metadata.
Cross-Database Joins / Blended Data Sources
- Workbooks that pull in multiple data sources (e.g., Salesforce + Snowflake).
- May blend across live and extract, or multiple PDS and embedded sources.
- Hardest to manage in terms of lineage and performance.
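One low-effort way to spot the “multiple published versions” pattern above is to normalize data source names (strip version suffixes, dates, and noise words) and group the results. Here is a rough sketch, assuming you have already exported the list of published data source names; the noise-word list and naming conventions are illustrative, not prescriptive.

```python
import re
from collections import defaultdict

# Illustrative noise tokens only; adjust to your own naming conventions.
NOISE = {"copy", "final", "live", "new", "old", "test"}

def normalize(name: str) -> str:
    """Collapse near-duplicate names by dropping version-like and noise tokens."""
    tokens = re.split(r"[^a-z0-9]+", name.lower())
    kept = [
        t for t in tokens
        if t and t not in NOISE and not re.fullmatch(r"v?\d+", t)  # drop v2, 2025, etc.
    ]
    return "_".join(kept)

# Hypothetical names, modeled on the sprawl example above.
published_sources = [
    "sales_data_live", "sales_data_v2", "sales_data_live_2025",
    "Sales Data (copy)", "marketing_spend",
]

groups = defaultdict(list)
for name in published_sources:
    groups[normalize(name)].append(name)

for key, members in groups.items():
    if len(members) > 1:
        print(f"Possible duplicates for '{key}': {members}")
```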
Transforming Chaos into Action
Assessing and consolidating these scenarios requires understanding the deployment, cultivating the metadata, and delivering actionable insights. Think of the BIChart Assessment as BI on BI. Our team at BIChart understands these friction points intimately, which is why we took on the challenging problem of assisting enterprises that may need to transpile existing Tableau assets into other BI platforms.
Our BIChart Tableau assessment was created independently from our Tableau migration solution. That allows us to work with all enterprises that need to migrate or simply assess, archive, and clean up an aging Tableau server.
My Tableau Semantic Model Findings and Action
Data Source Findings:
To recap my findings, two problems jumped out of the stats. First, I saw 15 variants of the same underlying Snowflake view linked into 12 dashboards. Second, the published data source version of that connection was the least adopted.
Digging deeper into the semantic model, we found field-level mismatches indicating slight differences across the variants that could easily produce multiple versions of the truth from the same underlying data.
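The field-level mismatches were the telling part. A simple way to reproduce that kind of check outside of our assessment is to diff calculated-field formulas across the variant data sources. Here is a rough sketch, assuming the field names and formulas have already been exported (for example via the Metadata API) into plain dictionaries; the data source and field names below are hypothetical.

```python
# Hypothetical export: {datasource_name: {field_name: formula}}.
variants = {
    "sales_data_live":      {"Gross Margin": "[Revenue] - [Cost]",
                             "Region Group": "IF [Region] = 'EMEA' THEN 'Intl' ELSE 'US' END"},
    "sales_data_live_2025": {"Gross Margin": "[Revenue] - [Cost] - [Freight]",
                             "Region Group": "IF [Region] = 'EMEA' THEN 'Intl' ELSE 'US' END"},
}

# Compare every variant against a chosen baseline and report formula drift.
baseline_name = "sales_data_live"
baseline = variants[baseline_name]

for ds_name, fields in variants.items():
    if ds_name == baseline_name:
        continue
    for field, formula in fields.items():
        if field not in baseline:
            print(f"{ds_name}: field '{field}' missing from baseline")
        elif baseline[field] != formula:
            print(f"{ds_name}: '{field}' differs -> {formula!r} vs {baseline[field]!r}")
```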

Create an Action Plan:
In this specific case, my assessment was executed within a small enterprise that is not migrating to Power BI. The action plan we developed for Tableau was:
- Consolidate connections and sources, giving priority to high-adoption/utilization instances (a small re-pointing sketch follows this list).
- Deploy and re-link a published data source.
- Retire connections and workbooks that are no longer in use.
- Use this result as empirical proof, and as an anecdote, of why governance is so important.
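Part of the consolidation step can be scripted. The sketch below, as noted in the list above, uses the tableauserverclient library to re-point workbook connections that still reference a retired database alias (the db.sales vs db.sales_view situation) at the consolidated address. The server URL, token, and addresses are placeholders; note that fully swapping one published data source for another usually means downloading the workbook, editing its data source reference, and republishing, which this sketch does not attempt. Dry-run against a test site first.

```python
import tableauserverclient as TSC

# Placeholders: supply your own server URL, personal access token, and site.
auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="my-site")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

OLD_ADDRESS = "legacy-warehouse.example.com"   # retired database alias (hypothetical)
NEW_ADDRESS = "warehouse.example.com"          # consolidated target (hypothetical)

with server.auth.sign_in(auth):
    for workbook in TSC.Pager(server.workbooks):
        server.workbooks.populate_connections(workbook)
        for conn in workbook.connections:
            # Only touch connections still pointing at the retired alias.
            if conn.server_address == OLD_ADDRESS:
                conn.server_address = NEW_ADDRESS
                server.workbooks.update_connection(workbook, conn)
                print(f"Re-pointed {workbook.name} (connection {conn.id})")
```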
Assessing your Tableau Org Health
To work with BIChart, you don’t need to migrate away from Tableau! We are happy to help your org assess its environment and create a plan to perform a Tableau server spring cleaning, any time of year. Contact us for more details on how to get ahold of our Tableau Assessment.