Scaling Design Systems with Living Tokens [Draft]
In most design systems, tokens are implemented as static variables. While effective, this approach becomes increasingly difficult to maintain as products add adaptive features such as density modes, dark/light themes, accessibility requirements, and white-label customization.
Here, I present a case study comparing two token management strategies, static and living token definitions, and their impact on token volume, maintenance effort, and compliance.
1. Static Tokens
In a static system, each token variation in the design system becomes a concrete value. For example, given the card component above, supporting three density modes (comfortable, compact, regular) and two color schemes (light, dark) requires defining individual tokens for every combination.
1.1 Implementation Mapping
Here is a representation of how the padding and color tokens map to CSS variables:
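The sketch below assumes a hypothetical card component with three padding sizes and basic background/text colors; variable names and color values are illustrative, not taken from an existing system.

// Hypothetical static token set for a card component.
// Padding varies by density, colors by scheme; every combination is a separate variable.
const staticCardTokens: Record<string, string> = {
  '--card-padding-small-comfortable': '12px',
  '--card-padding-small-compact': '8px',
  '--card-padding-small-regular': '16px',
  // ...repeated for medium and large sizes (3 sizes x 3 densities = 9 padding variables)
  '--card-background-light': '#ffffff',
  '--card-background-dark': '#1c1c1e',
  '--card-text-light': '#1a1a1a',
  '--card-text-dark': '#f2f2f2',
};

Even this small example already yields 9 padding variables and 4 color variables for a single component; every new component or variation dimension repeats the pattern.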
1.2 Challenges
This approach leads to several challenges, especially when token definition is part of the implementation cycle. Potential issues include:
- Token Explosion: The number of tokens multiplies with every new variation dimension, leading to a bloated token set that is hard to manage.
- Maintenance Overhead: Updating a design decision requires changes across multiple tokens, increasing the risk of inconsistencies and errors.
- Higher Regression Risk: More tokens mean more potential points of failure, increasing the likelihood of regressions during updates.
- Platform Synchronization: That same set of tokens must now be re-implemented per platform (Web, iOS, Android), increasing the effort to keep them in sync.
1.3 Static Token Lifecycle
Adding a new variation requires multiple steps across design and engineering teams, increasing the time and effort needed to implement changes.
Example: Introducing a dense (ultra-compact) mode to an existing system.
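As a rough sketch of what the static path requires (token names and values are hypothetical): every size of every affected component gets a new hand-authored value, which must then be mirrored on each platform.

// Hypothetical additions for the new dense mode; each value is authored by hand.
const denseCardTokens: Record<string, string> = {
  '--card-padding-small-dense': '4px',
  '--card-padding-medium-dense': '8px',
  '--card-padding-large-dense': '16px',
  // ...repeated for every other spacing token, then again for iOS and Android
};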
Each new variation multiplies future maintenance.
1.4 Complexity
Static tokens encode rules like:
- Units scale by 4px increments
- compact is ~75% of comfortable
- Dark mode needs stronger shadows
- Text must meet a contrast ratio of at least 7:1
Rules and relationships like these are hidden in token names and values and documented manually, which makes the system hard to use. Token count is not the problem.
Change amplification and usage complexity are.
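To make the amplification concrete, here is a small sketch (component token names are hypothetical) of how a relationship like the 75% rule lives only in the values themselves:

// Nothing in the token set states the rule; it is only a coincidence of values.
const buttonPaddingComfortable = 16;
const buttonPaddingCompact = 12; // ~75% of comfortable, re-derived by hand
const cardPaddingComfortable = 24;
const cardPaddingCompact = 18; // the same rule, re-typed; drift goes unnoticed

Changing the ratio means finding and editing every value that happens to encode it.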
2. Living Tokens
To allow for more adaptive systems, we can define tokens as functions that compute values based on parameters like density, theme, and accessibility settings. This could be, for example, a rule system based on a TypeScript API that generates the necessary values at runtime or build time.
2.1 Implementation
Instead of defining each token variation explicitly, we define a set of base tokens and rules for how they should adapt based on context. This drastically reduces the number of tokens needed, shifting complexity from token count to token logic.
type Density = 'comfortable' | 'compact' | 'regular';
type Schemes = 'light' | 'dark';
const baseUnit = 4;
const padding: Record<'small' | 'medium' | 'large', Record<Density, number>> = {
  small: {
    comfortable: baseUnit * 3, // 12px
    compact: baseUnit * 2, // 8px
    regular: baseUnit * 4, // 16px
  },
  medium: {
    comfortable: baseUnit * 5, // 20px
    compact: baseUnit * 3, // 12px
    regular: baseUnit * 6, // 24px
  },
  large: {
    comfortable: baseUnit * 7, // 28px
    compact: baseUnit * 5, // 20px
    regular: baseUnit * 8, // 32px
  },
};
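To show the rule side of this approach, here is a minimal resolver sketch; the function name and context shape are assumptions, not an existing API:

// Resolves a padding token from the consumer's context instead of a concrete variant name.
interface TokenContext {
  density: Density;
  scheme: Schemes; // would drive color tokens in the same way (not shown)
}

function resolvePadding(size: keyof typeof padding, ctx: TokenContext): string {
  return `${padding[size][ctx.density]}px`;
}

resolvePadding('small', { density: 'compact', scheme: 'dark' }); // '8px'

Consumers never reference a concrete variant; they pass their context and the token logic decides.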
2.2 Living Token Lifecycle
When adding new variations, we only need to update the token logic, not create new tokens. The token logic can be packaged to produce platform-specific implementations, ensuring consistency across platforms.
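As a sketch of that packaging step (function name illustrative), the same logic can emit CSS custom properties at build time; iOS and Android exporters would follow the same pattern.

// Emits one CSS block per density from the shared token logic.
function toCssVariables(density: Density): string {
  const sizes = Object.keys(padding) as Array<keyof typeof padding>;
  const lines = sizes.map((size) => `  --padding-${size}: ${padding[size][density]}px;`);
  return `[data-density='${density}'] {\n${lines.join('\n')}\n}`;
}

console.log(toCssVariables('compact'));
// [data-density='compact'] {
//   --padding-small: 8px;
//   --padding-medium: 12px;
//   --padding-large: 20px;
// }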
3. Conclusion
This article shows that the core challenge in design token systems is not usage or rendering, but where complexity is expressed. Static tokens encode adaptation by duplicating values, which amplifies maintenance effort and drift across platforms. Living token systems centralize that logic, allowing a single change to propagate safely and consistently everywhere.
For highly customizable, multi-platform systems, the most effective design tokens are not static variables; they are rules that generate variables. Design systems that scale treat tokens as decisions, not just values.