
PRISM Architecture: Bronze, Silver, Gold for Financial Data

By Cupel Team
Tags: prism, data-quality, financial-services

Financial services firms handle some of the most demanding data workloads in any industry. Transaction records arrive in high volume and must be processed with zero tolerance for error. Customer data flows in from dozens of systems and must be reconciled into a single, accurate view. Regulatory reporting requires auditable lineage from raw source to final output. The stakes are high: a data quality failure can mean regulatory fines, mispriced risk, or eroded client trust.

The PRISM architecture -- a layered approach that progressively refines data through Bronze, Silver, and Gold stages with quality gates between each -- is designed to bring structure and reliability to this complexity. It is not a new idea. Medallion architectures have been discussed in the data engineering community for years. What matters is how the layers are implemented, how quality is enforced at each boundary, and how the architecture handles the specific demands of financial data.

The Three Layers

Bronze: Raw and Append-Only

The Bronze layer is the landing zone. Data arrives here exactly as it was received from the source system. No transformations are applied. No deduplication occurs. No schema normalization happens. The raw data is persisted in its original format -- whether that is JSON from an API, CSV from an SFTP transfer, Parquet from a data lake, or relational rows from a database replication stream.

The Bronze layer is append-only by design. Records are never updated or deleted. This creates a complete, immutable history of every piece of data that entered the platform -- critical for audit trails in regulated industries.
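A minimal sketch of what append-only Bronze ingestion can look like, assuming a simple in-memory log and hypothetical source names. The raw payload is stored exactly as received; only envelope metadata is added:

```python
from datetime import datetime, timezone

def land_bronze_record(log, source, raw_payload):
    """Append a raw record to the Bronze log with ingestion metadata.

    The payload itself is never modified -- only envelope fields are added.
    """
    log.append({
        "source": source,  # originating system (illustrative names below)
        "received_at": datetime.now(timezone.utc).isoformat(),  # ingestion time
        "raw": raw_payload,  # original payload, byte-for-byte as received
    })

# Usage: land two raw payloads in their original, unnormalized formats.
bronze_log = []
land_bronze_record(bronze_log, "payments_api", '{"txn_id": "T1", "amount": "10.00"}')
land_bronze_record(bronze_log, "sftp_feed", "T2,25.50,EUR")
```

In a production system the log would be object storage or a table with append-only permissions, but the invariant is the same: writes only, never updates or deletes.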

For financial services, the Bronze layer serves several purposes beyond simple storage. It is the canonical record of what was received, when it was received, and from which source. When a regulator asks "show me the raw transaction data you received on March 15th," the Bronze layer provides the answer without ambiguity. When a downstream quality issue is traced back to a source problem, the Bronze layer provides the evidence.

The append-only model also simplifies reprocessing. If a Silver-layer transformation is corrected or improved, the pipeline can reprocess from Bronze without any concern about whether the original data has been modified. The source of truth is always intact.

Silver: Cleaned, Typed, and Deduplicated

The Silver layer is where raw data becomes usable. This is the transformation stage where Cupel applies cleaning rules, type casting, deduplication, and schema normalization. The goal is to produce a dataset that is structurally consistent, free of obvious errors, and aligned with the team's data model.

For financial transaction data, Silver-layer processing typically includes several critical steps. Deduplication removes repeated records that arise from source system retries or overlapping extraction windows. Type casting ensures that monetary amounts are stored as decimals with appropriate precision, not as strings or floating-point numbers that introduce rounding errors. Date normalization converts timestamps from various source formats into a consistent timezone-aware representation. Currency code validation confirms that ISO 4217 codes are present and valid.

For customer data, the Silver layer handles name standardization, address parsing, phone number formatting, and the resolution of conflicting records from multiple source systems. This is where the "single source of truth" for customer identity begins to take shape -- not through complex entity resolution algorithms (which belong in a later stage), but through basic cleaning and standardization that makes entity resolution possible.

The Silver layer also applies schema enforcement. Every record that exits the Silver layer conforms to a defined schema. Columns have specified types, nullable constraints, and value ranges. Records that do not conform are not silently passed through. They are caught by the quality gate between Silver and Gold.
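One simple way to express such a schema is a per-column tuple of type, nullability, and an optional value check. This is a sketch under assumed column names, not a particular schema library:

```python
from decimal import Decimal

# Minimal schema definition: (type, nullable, optional validator) per column.
TXN_SCHEMA = {
    "txn_id": (str, False, None),
    "amount": (Decimal, False, lambda v: v > 0),
    "currency": (str, False, lambda v: len(v) == 3),
}

def conforms(record, schema):
    """Return (True, None) if the record satisfies the schema, else (False, reason)."""
    for col, (typ, nullable, check) in schema.items():
        if col not in record or record[col] is None:
            if nullable:
                continue
            return False, f"{col}: null in non-nullable column"
        if not isinstance(record[col], typ):
            return False, f"{col}: expected {typ.__name__}"
        if check is not None and not check(record[col]):
            return False, f"{col}: value out of range"
    return True, None

ok, _ = conforms({"txn_id": "T1", "amount": Decimal("19.99"), "currency": "USD"}, TXN_SCHEMA)
bad, reason = conforms({"txn_id": "T2", "amount": Decimal("-5"), "currency": "USD"}, TXN_SCHEMA)
```

The non-conforming record produces a reason string rather than a silent pass, which is exactly the metadata a quarantine step needs.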

Gold: Aggregated and Analytics-Ready

The Gold layer produces the datasets that business users, analysts, and BI tools consume. This is where aggregation, star schema construction, KPI calculation, and business-specific logic are applied. Gold-layer outputs are designed for a specific analytical purpose.

In financial services, Gold-layer datasets often include daily transaction summaries by product line, customer segment, or geographic region. They include Assets Under Management (AUM) calculations that roll up portfolio positions with current market prices. They include regulatory reporting datasets that conform to the specific schema and calculation rules required by BCBS 239, MiFID II, or other regulatory frameworks.

Gold-layer datasets are not general-purpose. Each Gold dataset is built for a specific consumer: a Power BI dashboard, a regulatory submission, a risk model input, or a client-facing report. This specificity is intentional. It allows the Gold layer to optimize for the access patterns and calculation requirements of its consumers rather than trying to be everything to everyone.
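A daily-summary aggregation of the kind described above can be sketched in a few lines; the grouping keys and field names here are illustrative:

```python
from collections import defaultdict
from decimal import Decimal

def daily_summary(silver_txns):
    """Roll Silver transactions up into per-(date, product_line) totals and counts."""
    agg = defaultdict(lambda: {"total": Decimal("0"), "count": 0})
    for t in silver_txns:
        key = (t["booked_date"], t["product_line"])
        agg[key]["total"] += t["amount"]  # Decimal arithmetic preserves cent precision
        agg[key]["count"] += 1
    return dict(agg)

gold = daily_summary([
    {"booked_date": "2025-03-15", "product_line": "fx", "amount": Decimal("100.00")},
    {"booked_date": "2025-03-15", "product_line": "fx", "amount": Decimal("50.00")},
    {"booked_date": "2025-03-15", "product_line": "equities", "amount": Decimal("200.00")},
])
```

In practice this would be a SQL `GROUP BY` or a warehouse materialization keyed to the consuming dashboard's access pattern; the point is that the shape of the output is dictated by a specific consumer.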

Quality Gates: The Enforcement Mechanism

The layers alone are not what make PRISM architecture valuable. The quality gates between them are. Without enforcement, a layered architecture is just a naming convention. With quality gates, it becomes a system of progressive assurance.

Quality Gate 1: Bronze to Silver

The first quality gate validates the raw data before any transformation is applied. This gate checks structural requirements: Does the data conform to the expected schema? Are required fields present? Are data types parseable? Is the data fresh -- received within the expected time window?

For financial data, this gate catches common source-system problems. A trading system that suddenly starts sending timestamps without timezone information. A payment processor that changes its CSV delimiter from comma to pipe. A regulatory feed that introduces a new column that breaks downstream schema expectations.

When a record fails a quality gate, it is not dropped silently. It is quarantined -- moved to a separate storage location with metadata about which check failed and why. The pipeline continues processing valid records while alerting the data team about the quarantined ones.
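Putting the gate and the quarantine together, a minimal sketch might look like this, with hypothetical check names and a 24-hour freshness window standing in for configured thresholds:

```python
from datetime import datetime, timedelta, timezone
from decimal import Decimal, InvalidOperation

def has_required_fields(rec):
    return all(k in rec for k in ("txn_id", "amount", "received_at"))

def amount_parseable(rec):
    try:
        Decimal(rec["amount"])
        return True
    except (InvalidOperation, KeyError, TypeError):
        return False

def is_fresh(rec, max_age=timedelta(hours=24)):
    received = datetime.fromisoformat(rec["received_at"])
    return datetime.now(timezone.utc) - received <= max_age

GATE_1_CHECKS = [
    ("required_fields", has_required_fields),
    ("amount_parseable", amount_parseable),
    ("freshness", is_fresh),
]

def run_gate(records, checks):
    """Pass valid records through; quarantine failures with the name of the failed check."""
    passed, quarantined = [], []
    for rec in records:
        failed = next((name for name, check in checks if not check(rec)), None)
        if failed:
            quarantined.append({"record": rec, "failed_check": failed})
        else:
            passed.append(rec)
    return passed, quarantined

now = datetime.now(timezone.utc).isoformat()
records = [
    {"txn_id": "T1", "amount": "10.00", "received_at": now},
    {"txn_id": "T2", "received_at": now},  # missing amount -> quarantined
]
passed, quarantined = run_gate(records, GATE_1_CHECKS)
```

The valid record proceeds; the invalid one is held with its failure reason attached, ready for the alerting and review flow described below.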

Quality Gate 2: Silver to Gold

The second quality gate validates the cleaned data before it is aggregated into Gold-layer outputs. This gate applies business rules and referential integrity checks. Do all transactions reference valid customer IDs? Are monetary amounts within expected ranges? Do foreign key relationships hold across related datasets?

For regulatory reporting, this gate is especially critical. A BCBS 239 submission requires that risk data be accurate, complete, timely, and adaptable. The quality gate between Silver and Gold is where these requirements are enforced programmatically -- not through manual review, but through automated checks that run on every pipeline execution.

Statistical distribution checks also belong at this gate. If the average transaction amount suddenly shifts by three standard deviations, that is a signal worth investigating before the data flows into Gold-layer aggregations and from there into a risk model or a client report.
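The three-standard-deviation rule mentioned above is straightforward to state in code; the history here is a made-up series of daily average transaction amounts:

```python
import statistics

def distribution_alert(historical_means, todays_mean, n_sigmas=3):
    """Flag when today's mean shifts more than n_sigmas standard deviations from history."""
    mu = statistics.mean(historical_means)
    sigma = statistics.stdev(historical_means)  # sample standard deviation
    return abs(todays_mean - mu) > n_sigmas * sigma

# Illustrative history: daily average transaction amounts over the past week.
history = [100.0, 102.0, 98.0, 101.0, 99.0]
```

A sudden jump to 120.0 against this history trips the alert; 101.0 does not. Real deployments would use a longer window and possibly robust statistics (median and MAD) to resist outliers in the history itself.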

Quarantine and Alerting

Quality gates do not just pass or fail records. They produce a rich set of metadata about data quality at each boundary. Pass rates, failure reasons, failure distributions by source, and trends over time are all captured and surfaced in quality dashboards.

When a quality gate quarantines records, the pipeline does not stop entirely. Valid records continue through the pipeline on schedule. Quarantined records are held for review. The data team receives an alert with the failure details and can decide whether to fix the source data and reprocess, adjust the quality rule, or accept the quarantine as expected behavior.

This approach is essential in financial services, where pipeline downtime has direct business impact. A quality issue in one source should not block the processing of data from all other sources. The quality gate provides selective enforcement without all-or-nothing pipeline behavior.

Financial Services Applications

Transaction Monitoring and KYC/AML

Anti-money laundering and Know Your Customer pipelines are among the most compliance-sensitive in financial services. Raw transaction data arrives in Bronze from multiple payment systems, each with its own format and conventions. The Silver layer normalizes transaction records, deduplicates across systems, and applies currency conversion. The Gold layer produces the aggregated views that feed transaction monitoring rules -- daily volumes by account, cross-border transfer patterns, and velocity metrics that flag suspicious activity.

The quality gates ensure that every record that reaches the monitoring rules has been validated. A false negative in transaction monitoring -- a suspicious transaction that is not flagged -- can have severe regulatory consequences. Quality gates reduce this risk by catching data issues before they reach the monitoring logic.

Customer Master and MDM

Master Data Management for customer records is a classic data quality challenge. Customer information arrives from CRM systems, onboarding platforms, trading systems, and contact centers. The Bronze layer captures all of it in its original form. The Silver layer standardizes names, addresses, and identifiers, and begins the process of matching records that refer to the same individual or entity. The Gold layer produces the Customer 360 view -- a single, reconciled record for each customer that downstream systems can rely on.

Quality gates between Silver and Gold enforce referential integrity. Every customer in the Gold layer must have a valid account relationship. Every account must have a valid product assignment. Orphaned records are quarantined rather than passed through to the master dataset.
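The orphan-quarantine rule reduces to a set-membership check; the record shapes here are hypothetical:

```python
def enforce_referential_integrity(customers, accounts):
    """Quarantine accounts whose customer_id has no matching customer record."""
    known = {c["customer_id"] for c in customers}
    linked = [a for a in accounts if a["customer_id"] in known]
    orphaned = [a for a in accounts if a["customer_id"] not in known]
    return linked, orphaned

customers = [{"customer_id": "C1"}]
accounts = [
    {"account_id": "A1", "customer_id": "C1"},
    {"account_id": "A2", "customer_id": "C999"},  # no such customer -> orphaned
]
linked, orphaned = enforce_referential_integrity(customers, accounts)
```

At warehouse scale this is an anti-join against the customer master rather than an in-memory scan, but the enforcement semantics are the same: orphans are held aside, never written to the Gold master dataset.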

Regulatory Reporting

Regulatory submissions require precise data lineage. When a regulator asks how a particular number in a report was calculated, the institution must be able to trace it back through every transformation to the original source data. The PRISM architecture provides this lineage naturally. The Gold-layer report value traces back through the Silver-layer transformations to the Bronze-layer raw data, with quality gate results documented at each boundary.

This is not just a compliance convenience. It is a structural requirement for frameworks like BCBS 239, which mandates that risk data aggregation be transparent and auditable. The layered architecture, combined with quality gates that produce metadata at each boundary, creates the audit trail that regulators expect.

Template-Driven Setup

Building a PRISM pipeline from scratch for every use case is unnecessary. Common patterns in financial services -- transaction processing, customer data management, regulatory reporting -- follow well-established structures. Cupel provides pre-built PRISM templates that encode these patterns as starting points.

A Financial Transactions template, for example, pre-configures the Bronze layer with common transaction source schemas, the Silver layer with standard cleaning and deduplication logic, and the Gold layer with typical aggregation patterns for transaction monitoring. The quality gates come pre-configured with financial-services-appropriate thresholds -- null checks on mandatory fields like transaction amount and currency code, range checks on monetary values, and freshness checks on transaction timestamps.

Templates are not rigid. They are starting points that teams customize using the visual pipeline builder. Add a new source, modify a transformation, adjust a quality threshold, or add a compliance step -- all within the same canvas. The template provides structure; the team provides the domain-specific refinement.

Progressive Assurance, Not Wishful Thinking

The fundamental value of the PRISM architecture is that data quality is not assumed. It is measured and enforced at every stage. Raw data is preserved for auditability. Cleaned data is validated before aggregation. Aggregated data is produced for specific analytical purposes with documented lineage back to the source.

For financial services teams managing complex, compliance-sensitive data workflows, this layered approach with embedded quality gates transforms data reliability from a hope into a guarantee. Cupel's PRISM architecture, combined with its visual pipeline builder and pre-built financial services templates, provides the structure to build these workflows efficiently and the enforcement mechanisms to keep them trustworthy. If your team is working with financial data and needs a platform that treats quality as a first-class concern, take a closer look at how Cupel approaches data architecture.
