# If 40 teams customize the same button, your Design System has a problem (and how to detect it)

Design Systems teams in complex contexts carry a silent burden:

No one knows for sure how much of what they build actually impacts production.

You have documentation. You have components. You have governance.

But you don't have visibility.

One day you review 3 different products and find the same component implemented in 4 different ways across them.

And it happens silently. Without anyone seeing it as a problem.

That is the reality of scaling a system in contexts with multiple autonomous teams.

It's not chaos. It's invisibility.

## 🧩 The metrics dilemma


In complex financial contexts, the situation is harder than in other sectors.

It's not just that you have multiple products: you have regulations, audits, compliance, and constant pressure for efficiency.

A change in your design system isn't "move fast and break things"; it's coordinated, measured, tracked.

So, how do you measure if your system works?

Here's the important part:

## 🎯 The trap of counting components


I've seen leaders say: "We used 450 instances of our button last month."

Sounds impressive.

Means nothing.

Why?

A button inside a header that appears on 30 pages: does it count as 1 or 30?

A component copied and pasted in 15 places: does it count toward your system, or as 15 failures?

More importantly: if you build 200 components but only 15 are seriously used...

Do you really have a system or do you have a giant library that no one maintains?

The brutal truth is that most teams spend months measuring numbers that say nothing.


You need to change the approach.

From "absolute frequency" to "breadth of use".

Not "how many instances of a component".

But "how many different teams use this component."

## 📊 The minimum viable setup to start (without overcomplicating)


First sprint (Week 1-2):

Forget 50 charts. You need 4 metrics. Period.

1️⃣ Visual coverage: What % of your interface uses DS components?

2️⃣ Adoption by team: How many teams use the system?

3️⃣ Consistency score: How many deviations are there per product?

4️⃣ Time to component: How long do teams take to implement a new component?

That. Is. It.
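To make the four definitions concrete, here is each one as a trivial formula (a quick sketch; every number below is invented for illustration):

```ts
// four-metrics.ts -- the definitions as formulas, with made-up inputs.
const dsScreens = 7, totalScreens = 10;
const visualCoverage = (100 * dsScreens) / totalScreens;  // 1) 70%

const teamsUsingDS = 12, totalTeams = 40;
const teamAdoption = (100 * teamsUsingDS) / totalTeams;   // 2) 30%

const deviations = 18, screensReviewed = 20;
const consistencyScore = deviations / screensReviewed;    // 3) 0.9 deviations per screen

const daysPerComponent = [3, 5, 2, 8];                    // days to implement each new component
const timeToComponent =                                   // 4) average: 4.5 days
  daysPerComponent.reduce((a, b) => a + b, 0) / daysPerComponent.length;

console.log({ visualCoverage, teamAdoption, consistencyScore, timeToComponent });
```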

### How to create your baseline in less than two hours



* Select 3 real products from your company.

* Count manually (yes, old school). Example: "In the onboarding flow, 7 out of 10 screens use proper DS components."

* That is your baseline: 70% coverage in onboarding.

* Define a realistic 90-day goal, for example: raise it to 85%.

* Calculate your "consistency score":

  * Choose 20 representative screens (pick 3-4 key flows: onboarding, transfers, payments...).

  * Visually review and note inconsistencies: "This button has 12px padding here, 16px there." "This input has a border; the next one doesn't."

  * Record every deviation (that is your initial score).

* Keep in mind: if your DS covers only 65% of user signup, the remaining 35% is visual and functional debt that will grow with every release.

Trick to scale without going crazy: you can accelerate this process by automating part of the analysis (see the sketch below):

* Upload screenshots of those 20 screens to a folder.

* Run a simple script (for example, in Cursor) or use a tool that detects visual or code-level inconsistencies.

* In the end you get something like: "In these 20 screens I found X deviations. The most critical are: Button padding (8 times), Input border (6), Spacing in cards (4)."

## 🔥 The metric that nobody measures but should


Here comes what makes you shine:

Every time a team detaches or customizes an existing component (instead of using it directly), they record it.

Simple: a comment in Figma or a ticket.

"We used Button but with custom border-radius because our use case is different."

Now multiply.

If 40 teams each detach the Button in their own way, the same problem gets solved 40 times.

Translation: massive waste.

Wasted hours. Inconsistency in production. Accumulated technical debt.

But here is the fascinating part.

Every detach you record is pure insight.

It's not an adoption failure.

It's a design failure.

Because now you know exactly where to iterate.
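If you want that record to be machine-readable from day one, a minimal shape could look like this (every field name here is an illustrative assumption, not a standard):

```ts
// detach-log.ts -- an illustrative schema; adapt the names to your tooling.
type DetachEntry = {
  component: string; // e.g. 'Button'
  domain: string;    // component family, e.g. 'actions', 'forms'
  team: string;      // who detached it
  reason: string;    // free text: why the standard one didn't fit
  date: string;      // ISO date, e.g. '2024-03-18'
};

// The Figma comment from above, captured as a log entry:
const entry: DetachEntry = {
  component: 'Button',
  domain: 'actions',
  team: 'Payments',
  reason: 'Custom border-radius: our use case is different',
  date: '2024-03-18',
};

console.log(entry);
```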

## 💡 The script that changes everything



How do you know how many teams really use your standard component, and how many go rogue?

A simple script gives you the answer in minutes.
It doesn't touch Figma or docs, only real code.

It automatically scans the entire codebase, looking for imports of the type:

```js
import { Button } from 'design-system';
```

It counts each import and groups it by folder, team, or context (you can map this to your monorepo or project naming).

Detects "copies" and customizations:
Locates components that are named the same but redefined locally, or that extend the original with styles, props, or duplicated code.
Example:


import { Button } from 'design-system';

// I create a variant for special cases
const DangerButton = (props) => (

Each case adds up as a "customized instance".

It gives you a real report:

```
Onboarding Team: 11 standard imports, 2 custom
Accounts Team:   17 standard imports, 7 custom
Payments Team:   13 standard imports, 1 custom
```

So you see, without interpretation, which teams adopt the pattern 100% and which ones "break" it.
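A minimal sketch of such a scanner, assuming a monorepo laid out as `teams/<team>/...` and a DS package literally named `design-system` (both are assumptions; adapt them to your repo):

```ts
// scan-ds-imports.ts -- a minimal sketch, not a production tool.
import * as fs from 'fs';
import * as path from 'path';

// Standard usage: `import { Button } from 'design-system'`.
const IMPORT_RE = /import\s*{[^}]*\bButton\b[^}]*}\s*from\s*['"]design-system['"]/g;
// Heuristic for local wrappers or copies: `const SomethingButton = (`.
const WRAPPER_RE = /const\s+\w*Button\s*=\s*\(/g;

// Recursively yield every JS/TS source file under a directory.
function* walk(dir: string): Generator<string> {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) yield* walk(full);
    else if (/\.(jsx?|tsx?)$/.test(entry.name)) yield full;
  }
}

const stats = new Map<string, { standard: number; custom: number }>();

for (const file of walk('teams')) {
  const team = file.split(path.sep)[1]; // teams/<team>/...
  const src = fs.readFileSync(file, 'utf8');
  const s = stats.get(team) ?? { standard: 0, custom: 0 };
  s.standard += [...src.matchAll(IMPORT_RE)].length;
  s.custom += [...src.matchAll(WRAPPER_RE)].length;
  stats.set(team, s);
}

for (const [team, { standard, custom }] of stats) {
  console.log(`${team} Team: ${standard} standard imports, ${custom} custom`);
}
```

Run it from the repo root (for example with `npx ts-node scan-ds-imports.ts`) and you get the per-team report above.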

You can extract extra info: the script can also list the props used, the most frequent variants, or props that look suspiciously like a hack.
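For instance, extending the scanner sketch above (same assumptions, reusing its `walk` helper), a rough prop tally could look like this:

```ts
// Addendum to the sketch above: a rough tally of props passed to <Button>.
const PROP_RE = /<Button\s+([^>]*?)\/?>/g;
const propCounts = new Map<string, number>();

for (const file of walk('teams')) {
  const src = fs.readFileSync(file, 'utf8');
  for (const m of src.matchAll(PROP_RE)) {
    // Count every `name=` occurrence inside the tag; bare boolean props are skipped.
    for (const p of m[1].match(/\w+(?==)/g) ?? []) {
      propCounts.set(p, (propCounts.get(p) ?? 0) + 1);
    }
  }
}

// e.g. { variant: 34, size: 28, style: 9 } -- `style` on 9 call sites smells like a hack.
console.log(Object.fromEntries(propCounts));
```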



What do you gain?

* Actionable data to know which components work and which invite customization.

* Solid arguments to prioritize refactors ("this team always breaks it, let's talk to them").

* A true view of the health and adoption of your Design System: the metric that really matters.

Zero human effort after setup (2 hours max).

## 🧠 Adoption prediction before it fails


This is where simple machine learning shines (and I don't say that as AI hype).

The pattern is clear:

If you see that similar components were detached 3+ times in the last 90 days...

The new component you are going to launch will probably ALSO be detached.

Why?

Because there is an unresolved pattern in that component domain.

So, before launching that new component, you can:

* Automatically detect the historical detach pattern

* Correlate it with the new component

* Generate alerts: "This Button has 8 deviations in history. Your new component has similarities. Review these 3 use cases before launching."

Basically, you predict if something is going to fail before it fails.
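In code, this can start as a trivial rule long before any real ML. A toy sketch, reusing the illustrative detach-log shape from earlier (the thresholds come straight from the article, everything else is an assumption):

```ts
// detach-risk.ts -- a toy heuristic, not a trained model.
// Reuses the illustrative DetachEntry shape from the log sketch above.
type DetachEntry = { component: string; domain: string; team: string; reason: string; date: string };

const WINDOW_DAYS = 90; // look-back window from the article
const THRESHOLD = 3;    // "detached 3+ times" rule of thumb

function detachRisk(log: DetachEntry[], newComponentDomain: string): string {
  const cutoff = Date.now() - WINDOW_DAYS * 24 * 3600 * 1000;
  const recent = log.filter(
    (e) => e.domain === newComponentDomain && Date.parse(e.date) >= cutoff
  );
  if (recent.length >= THRESHOLD) {
    return `HIGH risk: ${recent.length} detaches in "${newComponentDomain}" in the last ${WINDOW_DAYS} days. Review those use cases before launching.`;
  }
  return `LOW risk: no recurring detach pattern in "${newComponentDomain}".`;
}

console.log(detachRisk([], 'actions')); // LOW risk with an empty log
```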


The designer doesn't get frustrated. The component comes out better.

Adoption goes up automatically because you solved the real problem.

## 📈 How this grows without breaking


Month 1: 50% coverage in 2 pilot products. An ugly baseline, but an honest one.

Month 2: Leaders see the problem and assign resources to improve the most-rejected components.

Month 3: Coverage goes up to 68%. Teams start asking for new components (good sign).

Month 4-6: It scales to other products. Every new product that joins already starts at ~65% coverage because its team understands the pattern.

Month 12: 80% global coverage. System maintained by 2 people. Estimated annual savings: €200k-400k in design/dev hours.

## ❌ What NOT to do


Don't compare with other companies.

Every bank is different. Every context is different. Every design system is different.

Here is where most fail:

* Chasing external benchmarks: "Figma says best practice is 80% reusable components", "another bank has 6 designers and 50 components", "Design Systems Collective says you should track metric X".

* Forgetting to ask teams why they detach components.

* Measuring only frequency instead of understanding the pattern.

* Comparing your system with banks that have different regulatory and business realities.

Forget everything.

Your Design System is subservient to the project, not the other way around.

Figma's rules are not yours.

The best practices of another bank are not yours.

Why?

* Because your context has specific regulations.

* Because your teams have dynamics another bank's teams don't.

* Because it carries code histories and decisions no other company ever faced.

Invest in knowing your context.
A simple log of when and why a button is detached can save thousands of euros/month in support and QA.


Then, design your metrics from there. Not from what others do.


Common myths in banking DS metrics:


* "More components = more maturity." False. (I've seen banks with 200+ components that don't reach even 60% visual coherence.)

* "Standardization only matters for UX." False: in banking, the legal auditor uses your screens as evidence.

* "Benchmarks from other companies apply to you." False: your context rules.

## 🚀 Your first move (next 7 days)



* Take 1 product. Look at 3 complete flows.

* Count: how many components are DS, how many are custom.

* That % is your baseline.

* Set a 90-day goal: +15%.

* Meet with 2 teams. Ask: "What component would you need that doesn't exist? Which of our components is a problem?"

That is enough to move the needle.

Key takeaways / actions for managers:

* Measure teams, not just instances.

* Record every customization; it is a strategic signal.

* Ask teams why they adapt.

* Adjust the score regularly: banking products evolve every quarter.

If you want your team or your manager to see the impact, print this checklist and use it next quarter.


To dig deeper:

* Automated audits in Design Systems:
[Design System Metrics: From Adoption to Impact (by Zeroheight & DS expert Nathan Curtis)](https://blog.zeroheight.com/design-system-metrics-adoption-impact/)

* How to analyze real breadth of use with code:
[Measure Adoption of a Design System with Automated Scripts (Medium, Ryan Clark)](https://uxdesign.cc/measuring-adoption-of-a-design-system-32293ffb351c)

* Real metrics and tooling for Product Managers:
[Product Metrics: Find the North Star (by Intercom)](https://www.intercom.com/blog/product-metrics-north-star/)

* How to detect "custom forks" in component libraries:
[Detecting UI Consistency and Component Forks in Large Codebases (by Engineering @Airbnb)](https://medium.com/airbnb-engineering/design-system-science-cc0dc49dec10)

* Advanced governance guide for complex environments:
[Governance of Design Systems in Large Organizations (by Sparkbox)](https://seesparkbox.com/foundry/design_systems_governance_perspectives)

What would be the savings in hours (or reduction of technical debt) if you could visualize all the customizations that are hidden today?

#DesignSystems #Banking #ProductLeadership #Metrics #Efficiency