The Value of Design Systems

When the topic of a design system was first brought up at my organization, many of the senior stakeholders saw little value in dedicating full-time resources to the effort. In fact, prior to standing up my team, a previous effort to create a design system had failed, largely because of that same lack of dedicated resources. That system’s designer and developers were still attached to other projects, ones that brought measurable value to the company, and accordingly they could only dedicate a limited amount of time to maintaining the system. As the old system grew at once more robust and more neglected, it became a point of frustration for teams. By the time I was brought in to revive the project, there was so much scar tissue that “design system” had become a dirty word.
Developing the System and Defining the Value
Measuring the value of a design system is a stumbling block many design departments struggle to overcome. Chances are both designers and developers can tell you why a design system has value. It gets developers and designers speaking the same language. It aligns color palettes and branding across products or product suites. It allows product teams to spend less time thinking about the user interface (UI) and more time thinking about the user’s experience (UX). It enables designers to wireframe and prototype faster. All of these are great reasons to develop a design system, and there are countless more I won’t list here. But when it comes time to fight that fight with leadership, there’s a great chance they’ll ask about the tangible benefits. Give me some metrics! Show me some proof!
I would be lying if I told you this wasn’t a problem I struggled with for a long time. I knew there was value in this effort and I knew it would be a win for my organization. I knew that getting a design system in place, developing it, growing it, and teaching it would benefit everyone at the company. I knew we could instill better practices in our designers as well as deliver more consistent products, faster. Still, as I pulled together a ragtag team of developers and designers, I knew measuring our success would be a tough nut to crack.
Over the next several months we were preoccupied with defining the design system. We worked with the company’s designers to define the principles of the system. We dissected color palettes before determining an appropriate light and dark theme. We created a component library for our company’s design tools. We built an internal documentation website with working JavaScript and TypeScript examples of all of our components, design recommendations, developer notes, and a code sandbox. This work, I should add, continues to this day and will never truly be complete.
As my team prepared to release our system into the wild, I have to admit we still weren’t sure how big an impact we’d make. Early adopters of the system experienced a lot of pain converting their products from fully customized front end Frankensteins to unified, consistent interfaces. At times their front end work would grind to a halt while they refactored features that touched both new and old components. We spent countless hours troubleshooting with these teams, teaching them the ins and outs of the system and helping them convert their products.
With adoption underway again, the question of tangible benefit began to creep into the picture. Stakeholders saw their product teams slow down while they went through the pains of converting to the system. It was only natural for leadership to begin questioning the value of the system; after all, at this point my team’s work seemed to be costing the company money. The consensus was that the products had started looking a little nicer, but at what cost? Was a small facelift and some consistency really worth the lift established product teams had to undergo to adopt it? It was time for us to deliver on that promise of proof.
Measuring Impact: The Olympia Case Study
We held several internal discussions about what kind of metrics we could try to measure. We had the ability to read through teams’ backlogs and look through their commits, but we did not have the tools in place to automate the process. Accordingly, we had to find a product team to use as a case study, one that could be seen as an accurate representation of the products my company builds. We settled on a team I’ll call Olympia. Olympia was an established team with a product in production, active users, fully custom front end designs, and a nonlinear user flow. They underwent the pain of design system adoption at the insistence of their portfolio leadership, who thought the design system would be a great way to unify the “look and feel” of their products.
My team began combing through Olympia’s backlog. We tracked down and took notes on every single front end story in the team’s history. We wanted to know when their product manager requested a story, how complex (on a scale of 0–3) their developers thought that story was to implement, and how long it took for that story to be completed. We also began sorting through their commit history for front end commits, looking at how many files were touched, how many lines of code were inserted, and how many were deleted.
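Our audit was largely manual, but to give a sense of the per-commit data we were after, here is a minimal sketch of how it could be pulled from a repository with git log. This is purely illustrative, not our actual tooling, and the src/ pathspec is a stand-in for wherever a team’s front end code actually lives.

```python
import re
import subprocess
from statistics import mean

# "git log --shortstat" prints one summary line per commit, e.g.
#   " 3 files changed, 120 insertions(+), 45 deletions(-)"
# The trailing pathspec ("src/" here, purely a placeholder) limits the
# history to front end code.
log = subprocess.run(
    ["git", "log", "--shortstat", "--format=%H", "--", "src/"],
    capture_output=True, text=True, check=True,
).stdout

summary = re.compile(
    r"(\d+) files? changed(?:, (\d+) insertions?\(\+\))?(?:, (\d+) deletions?\(-\))?"
)
commits = [(int(m[1]), int(m[2] or 0), int(m[3] or 0)) for m in summary.finditer(log)]

if commits:
    print("commits analyzed:         ", len(commits))
    print("avg files touched/commit: ", round(mean(c[0] for c in commits), 2))
    print("avg lines inserted/commit:", round(mean(c[1] for c in commits), 2))
    print("avg lines deleted/commit: ", round(mean(c[2] for c in commits), 2))
```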
In Olympia’s first 10 months of existence they used no design system and built everything from scratch. During that time Olympia had an average of 3.80 design stories requested per week, and each story took an average of 22.18 days to complete. Prior to adoption, Olympia’s developers touched an average of 5.87 files per commit, with an average of 93.07 lines of code inserted and 61.23 lines deleted. Olympia’s developers rated design story complexity at 1.00 on average.
After adoption, Olympia’s design story requests were nearly cut in half, dropping from 3.80 to 1.95 stories requested per week. Story complexity increased 36 percent, yet stories took an average of about three fewer days to complete (19.26, down from 22.18). Per commit, developers were touching fewer files (4.73 on average, down from 5.87), inserting roughly 20% fewer lines (75.15, down from 93.07), and deleting 35% fewer lines (39.55, down from 61.23).
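For anyone who wants to sanity check those relative changes, they fall straight out of the averages above. A quick sketch, using the rounded figures as reported:

```python
# Averages reported above, before and after design system adoption.
before = {"stories requested/week": 3.80, "days to complete a story": 22.18,
          "files touched/commit": 5.87, "lines inserted/commit": 93.07,
          "lines deleted/commit": 61.23}
after = {"stories requested/week": 1.95, "days to complete a story": 19.26,
         "files touched/commit": 4.73, "lines inserted/commit": 75.15,
         "lines deleted/commit": 39.55}

for metric, old in before.items():
    new = after[metric]
    change = (old - new) / old * 100
    print(f"{metric}: {old} -> {new} ({change:.0f}% lower)")
```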
In short, developers were doing more complex design stories faster, while the product manager requested them less often.
The impact becomes even more impressive when you extrapolate the data over a full year of development. Prior to adoption, Olympia was on pace to see roughly 197 design stories requested in a 12-month period. At their pre-adoption pace of 22.18 days per design story, that’s over 4,214 developer days of work dedicated to front end development. After adopting our design system, Olympia’s pace dropped to 101 design stories in a year, and their increased velocity meant those stories would require 2,336 fewer developer days to complete. This was the impact the design system had on a single team at my organization, where we manage several portfolios and over three dozen products.
The reasons to implement a design system are innumerable, but the single biggest one might just be the time (and money) saved.