Museum professionals are well acquainted with the discipline required to produce high-quality administrative content. Grant proposals, exhibition labels, and marketing materials are typically reviewed, edited, and standardized before they are shared. Collections data, however, does not always benefit from the same level of editorial rigor, particularly when institutional cataloguing standards are incomplete, inconsistently applied, or undocumented.
Over time, these gaps lead to variations in terminology, formatting, and structure that accumulate across records. As datasets grow, these inconsistencies make it increasingly difficult to search, report on, reuse, or govern collections data in a consistent and sustainable way. While many institutions recognize the importance of standardization, identifying practical and scalable approaches remains a persistent challenge.
To explore how museums address this work in practice, we spoke with Renee Bomgardner, Project Manager at Gallery Systems. With more than 20 years of hands-on experience in collections management and a career shaped by roles at institutions including the Philadelphia Museum of Art, The Barnes Foundation, the National Museum of American Jewish History, and the National Society for the Daughters of the American Revolution, Bomgardner brings a practitioner’s perspective to her work. Since joining Gallery Systems in 2022, she has supported museums through data migrations, inventories, collection moves, and data standards initiatives. As she notes, when institutional guidelines or established standards are absent, users “ultimately end up entering data in various formats,” allowing inconsistencies to compound over time.
This blog outlines practical strategies for identifying, addressing, and managing inconsistencies in collections data, with a focus on tools and workflows that support standardization at scale and lay the foundation for long-term data governance.
Making Inconsistencies Visible
One of the first steps in standardizing collections data is making inconsistencies visible. Variations in terminology, formatting, or data entry are often introduced over time through different staff practices, legacy systems, or evolving standards. These issues can be difficult to identify without tools that allow records to be reviewed side by side. Improving visibility enables institutions to assess the scope of inconsistency, determine preferred standards, and decide whether issues can be addressed manually or require bulk action.
A common example is the date field, where entries such as “not dated,” “nd,” and “n.d.” may coexist within the same dataset. While these values may convey similar meaning, they introduce inconsistency that can affect searching, reporting, and data reuse. When these variations are viewed together, institutions can make informed decisions about standard terminology and formatting.
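To make the idea concrete, here is a minimal sketch of this kind of normalization outside TMS. The variant list, the preferred term "n.d.", and the function name are illustrative assumptions, not a TMS feature or a cataloguing standard:

```python
# Hypothetical sketch: mapping observed date-field variants to one preferred
# term. The variants and the preferred value are examples only; each
# institution would define its own standard.
DATE_VARIANTS = {"not dated", "nd", "n.d.", "undated"}
PREFERRED = "n.d."

def normalize_date_display(value: str) -> str:
    """Return the preferred term when a value matches a known variant."""
    # Compare case-insensitively and ignore trailing periods so that
    # "ND", "n.d." and "Not Dated" all match
    cleaned = value.strip().lower().rstrip(".")
    if cleaned in {v.rstrip(".") for v in DATE_VARIANTS}:
        return PREFERRED
    return value
```

Values outside the variant list, such as an actual year, pass through unchanged, so the function can be applied safely across a whole field.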
Within TMS Collections, the list view provides a straightforward way to surface these inconsistencies. By sorting records using column headings, users can group identical or near-identical values together for review. As Bomgardner explains, “when looking at object records in the list view mode, the column headings can be sorted,” allowing patterns and discrepancies to become immediately apparent. For a small number of records, manual correction may be sufficient; larger datasets, however, typically require more systematic approaches to standardization.
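The same "sort and group" review can be sketched in a few lines for records exported from any system. The field and record structure below are invented for the example and do not reflect the TMS schema:

```python
from collections import Counter

# Illustrative exported records; field names are assumptions, not TMS fields
records = [
    {"objectnumber": "1998.1.1", "dated": "n.d."},
    {"objectnumber": "1998.1.2", "dated": "not dated"},
    {"objectnumber": "1998.1.3", "dated": "nd"},
    {"objectnumber": "1998.1.4", "dated": "n.d."},
]

# Count each distinct value in the field, mirroring what sorting a list-view
# column makes visible: identical and near-identical entries cluster together
value_counts = Counter(r["dated"] for r in records)
for value, count in value_counts.most_common():
    print(f"{value!r}: {count}")
```

Seeing three spellings for the same concept, with their frequencies, is often enough to decide whether a handful of manual edits will do or a bulk update is warranted.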
Using Packages and the Importer/Update Tool for Scale
When standardization efforts extend beyond a handful of records, grouping records into an object package becomes a powerful next step. After identifying anomalies through searching, users can collect affected records into a package and apply updates at scale. As Bomgardner explains, “having all of the records in an object package will permit you to use the TMS Importer/Update Tool.”
Working through Excel, the Importer/Update Tool enables bulk updates while leveraging familiar spreadsheet functionality such as copying and pasting. For institutions managing thousands of records, this method balances efficiency with a controlled review process, making it particularly well suited for large-scale normalization projects.
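The export-edit-reimport round trip can be sketched with a spreadsheet-style CSV. The column names and file contents here are invented for illustration; the actual columns and format required by the Importer/Update Tool are defined by TMS:

```python
import csv
import io

# Illustrative export of two columns; headers are assumptions, not the
# Importer/Update Tool's real template
exported = "ObjectID,Dated\n101,nd\n102,not dated\n103,1850\n"

rows = list(csv.DictReader(io.StringIO(exported)))

# Apply the agreed standard in bulk, as one might with find-and-replace
# or a formula in Excel
for row in rows:
    if row["Dated"].strip().lower() in {"nd", "n.d.", "not dated"}:
        row["Dated"] = "n.d."

# Write the corrected file back out for reimport
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["ObjectID", "Dated"])
writer.writeheader()
writer.writerows(rows)
updated_csv = out.getvalue()
```

The intermediate file also doubles as a review artifact: a colleague can inspect the corrected spreadsheet before anything is written back to the database.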
Batch Updates and Collaborative Workflows in TMS Collections
Batch updating is a core strategy for standardizing collections data at scale. Rather than correcting records one at a time, batch updates allow institutions to apply consistent changes across groups of records, supporting faster clean-up, improving accuracy, and strengthening long-term data governance. This approach is especially important when addressing legacy data, enforcing controlled vocabularies, or preparing records for reporting, integrations, or public access.
TMS Collections supports batch actions across many fields and modules, allowing updates to be made directly within the system once records are identified. According to Bomgardner, this approach "empowers users to make the changes quickly," reducing the time often associated with large data-clean-up initiatives.
In addition to batch updates, TMS Collections enables collaborative workflows through dashboards and saved queries. These queries can be shared across teams, allowing multiple users to work simultaneously on standardization efforts. Bomgardner notes that saved queries “allow for cross-collaboration between curatorial departments in institutions or provide a way to set up projects for interns or volunteers.” This shared-work approach not only accelerates clean-up but also distributes responsibility for data quality across the organization, reinforcing standardization as an ongoing practice rather than a one-time task.
Global Changes and Institutional Oversight
Some institutions choose to make updates directly at the database level using SQL, particularly when working with very large datasets. Bomgardner notes that “updating records in SQL is an efficient way to make changes to a large group of records,” especially when records have first been isolated in a package.
However, this efficiency comes with important governance considerations. One key limitation is that “these changes will not appear in the audit trail,” which may be a concern for institutions with strict documentation, compliance, or provenance requirements. For this reason, SQL updates should be carefully planned and reserved for scenarios where speed outweighs the need for detailed change tracking.
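As a toy illustration of what a database-level bulk update looks like, the snippet below runs one UPDATE against a set of isolated record IDs. The table, column names, and data are invented for the example and do not reflect the TMS schema; note that nothing here writes to an audit trail:

```python
import sqlite3

# Stand-in database; real TMS data lives in SQL Server, and its schema differs
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE objects (objectid INTEGER PRIMARY KEY, dated TEXT)")
conn.executemany(
    "INSERT INTO objects VALUES (?, ?)",
    [(1, "nd"), (2, "not dated"), (3, "1850")],
)

# Record IDs isolated beforehand, e.g., by collecting them into a package
package_ids = [1, 2]

# A single UPDATE normalizes every record in the package at once. A raw SQL
# change like this bypasses application-level change tracking entirely
placeholders = ",".join("?" for _ in package_ids)
conn.execute(
    f"UPDATE objects SET dated = 'n.d.' WHERE objectid IN ({placeholders})",
    package_ids,
)
conn.commit()
```

The speed is real, but so is the trade-off: because the application never sees the change, institutions relying on documented change histories should route the same correction through in-application tools instead.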
From Clean-up to Culture: Sustaining Standards Over Time
While TMS Collections offers multiple tools for standardizing data, long-term success depends on more than technology alone. As Bomgardner emphasizes, “the challenging part is ensuring that there is a catalogue standard to implement along with staff members to administer the standards.”
From a leadership perspective, standardization should be understood as an ongoing practice rather than a one-time corrective effort. Clear cataloguing guidelines, regular training, and designated data stewards help ensure that newly cleaned data remains consistent going forward. When embedded into daily workflows, standards become a foundation for better reporting, stronger integrations, and more reliable public access to collections information.
Strengthening the Data That Supports the Mission
Whether through manual edits, batch actions, importer tools, dashboards, or database-level updates, each method supports the same objective: a more consistent and trustworthy collections database. Mass updates in TMS for Windows and TMS Collections are ultimately about enabling museums to use their data with confidence while laying the groundwork for future initiatives.
In this way, data standardization moves beyond housekeeping. It becomes a strategic investment in the integrity, sustainability, and impact of the museum’s collections information. To learn more about how TMS Collections can support data standardization at your institution, complete the form on our Contact page and a Gallery Systems expert will be in touch.