Welcome to the Jira for Designers series brought to you by Design at Scale™ – Academy. In a previous article, we discussed Figma’s Segments (001↘︎) and how we designers help our engineering team locate the right, final things to build, while supporting ongoing development by versioning our creations. To enhance the impact of the design function in your organisation, we’ll look into Velocity and KPIs. No need to panic. This is a design-centric explanation with further references if you wish to dive deeper into the topic.

Design delivery
Delivery is about the outputs (002↘︎). No outputs, no delivery and no outcomes. Design delivery has often been classed as colouring in the wireframes in an advanced sense or, in a loose sense, just template creation. It took us almost a decade to gain recognition amongst business and engineering and to have UXR, behaviour-driven CX and UI automated by design systems included in the team’s remit.
Today, design delivery is managed through Waterfall, Agile, Scrum and Kanban – methods that were never designed with the complexity, reach and impact of “good” and “bad” design in mind. That is why the new generation of designers is now asked to measure the impact, performance, velocity and KPIs of their design, regardless of whether the design is “good” or “bad”.
Let’s separate this into internal and external.
Internal
Internal impact is measured on the team and its operations. It covers the ways of working, handovers, dependencies, the source of truth and the internal communication needed to get things done. We measure, analyse and adjust every source we can in order to improve the work for ourselves and our colleagues.
External
External measures, on the other hand, track the impact on the customer and the overall growth of the organisation in a specific market. Regardless of the size of your organisation, you’ll quite possibly solve internal problems first and only then care about external ones, right?
Sadly, 9 out of 10 startups fail because they never define SOPs – Standard Operating Procedures. In other words: who does what, and when? This has a significant impact on how the company is evaluated and perceived by investors, but that is another story.
Even today, agencies of record still ship MS Excel files to the client, providing budget calculations for briefs that have no HLRs (high-level requirements) or UACs (user acceptance criteria) – you know these briefs as “make it pop”!
The important thing for us is to focus on internal team performance, because that is the one thing fully within our control and the one thing we can genuinely improve.

Performance
Let’s look at how we can translate this for individuals, teams and departments.
Individual
Individual performance can be measured by the speed of using a single tool – for example, Figma: one designer delivering 3–5 screens a day (if you can, of course). This way, all designers focus on perfecting their Figma skills to deliver the task and trade their speed/time for money. The coefficient for the designer here is the number of screens delivered in a specific timeframe, such as a day, week or month, as sketched below. Regardless of the designer’s excellence (or speed), this does not increase the team’s impact on the organisation.
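As a rough illustration of that coefficient, it is nothing more than output divided by time. The figures below are invented; only the arithmetic matters.

```python
# Hypothetical example of the individual "coefficient": screens per timeframe.
screens_delivered = 18   # invented figure for one designer
working_days = 5         # timeframe: one working week

coefficient = screens_delivered / working_days
print(f"{coefficient:.1f} screens per day")  # -> 3.6 screens per day
```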
Team
Team performance is often measured by the speed of feature delivery. A complex equation can be simplified to the number of people delivering a feature over a period of time. This approach combines individual skills and increases the focus on handover and the quality of outputs. If the first link in the chain is defective or incomplete, the impact on performance and deployment can be catastrophic. Equally, when responsibilities are not agreed upon (in the absence of SOPs), many team members do everything and deliver nothing – constant change. As we embrace constant evolution, constant change is counterproductive: anyone can change the objectives, the outcomes of user research, the size of a button, the language chosen for deployment or the release date.
Performance can be high while the impact on customers or the business remains low. Therefore, at Design at Scale™ Academy, we focus on optimising the SOPs first and then building the appropriate knowledge base to reduce miscommunication and empower informed decision-making.
Departments
Measuring the impact of a department can be complex, yet it can be broken down into two simple factors: the ability to respond to change, and upskilling and automation, which can be translated into the measured impact of one IC (individual contributor) on the organisation.
Let’s break this into a simple scenario: either we have 20 UX researchers across the business because we have 20+ feature teams, or we have a single UXR function that is well managed, integrated into the business and uses advanced automation to fulfil all the testing required.
In the second case, the organisation empowers technology run by a high-value team whose expertise and impact are well integrated within the basic process and measured accordingly.
Despite advancements in design automation – UXR, design systems and tokenisation – we are often buried in internal bureaucracy and delayed by external teams. At Design at Scale™ – Academy, we focus on synchronicity and transparency between departments while empowering the design team to create and broadcast. That way, all teams are informed about changes, which essentially allows us to swap research, testing or design among our dedicated teams.
It does not matter how smart one department or feature team is: if they do not share, broadcast and contribute, the impact is zero – at best, the delivery of one feature is slightly faster.

Velocity
This brings us to velocity. In Agile, velocity is a simple calculation that measures the units of work completed in a specific timeframe. The units are often complexity points, user stories or story points; the timeframe may be an iteration, a sprint or a week.
Most Scrum teams measure the number of story points completed in a given sprint. That means all tasks are given story points, which are then summed up: “in a given timeframe, let’s say a week, our team can deliver 40 story points”.
This gets measured across a few (3–5) sprints to understand the team’s capacity, allowing the CPO to predict how many story points to plan into each sprint. Ultimately, this reveals the number of sprints it will take to complete a project or a specific release.
This dictates the team's efficiency and velocity, which can be communicated upwards and downwards. Upwards to resourcing, HR, finance and planning greater features for additional releases. Downwards to day to day delivery product team improvements.

Measure what matters
Why do we measure it, and why does it matter? Only a handful of design studios of the last century were renowned for their great work, so no one questioned the impact of Creswell Munsell, Schubert & Zirbell (1974), Hutchins/Darcy (1977), Frye-Sills (1975) and Bruce B. Brewer (1974), plus B2B powerhouse Marsteller (1979).
Digital technology brings properties that can be measured, and measurement (some say) has killed the design industry. Nowadays, it’s more about customer retention, speed of delivery, savings, cost reduction and impact. Externally, we drive click rates and the number of engagements; social currency goes through the roof, and advertisers get paid for displaying their products and services. Internally, we gave up on creativity and embraced automation in order to deliver more screens that mean less, in the hope that the analytics will tell us what is better.
Do you know what good looks like? If you do, then you can measure it. If you don’t, you are spinning in circles like an analytics rabbit, trying to understand the numbers without having a real impact on brand value, client acquisition or your customers. Dozens of agencies and startups go through the same process to distinguish themselves from the rest – with no strategy and no plan to succeed. In fact, we all have the same plan 🙂
So the only way you can distinguish yourself or your agency from the rest is through your unique approach to delivery: faster, smarter, model A or model B. The combination of methods is the only way to gain a competitive advantage. At the end of the day, we all deliver the same things – screens, graphics, composition.

Design KPIs
A number of articles have been written about Design Efficiency, User Satisfaction, Design Iterations, Design Consistency and so on. If we all measure everything, who will do the work? More importantly, why measure everything when the measurement leads to no change?
That is why we at Design at Scale™ – Academy help designers measure what matters in environments that are not inherently ours but that make a positive impact on design delivery. Amongst many tools, we embrace Jira – a place where we can measure the time to hand over a file, research and its impact, the time to build a screen and its impact, resourcing allocation and many other design-related metrics.
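As one example of what that can look like in practice, here is a minimal sketch that pulls status transitions from the Jira Cloud REST API and computes how long issues spent between two workflow statuses. The project key “DES”, the status names “In Design” and “Ready for Dev”, and the credentials are all hypothetical; adapt the JQL and statuses to your own workflow.

```python
# Minimal sketch: "time in design" derived from the Jira issue changelog.
# Assumptions (not from the article): a Jira Cloud site, a hypothetical
# project key "DES" and hypothetical statuses "In Design" / "Ready for Dev".
from datetime import datetime
import requests

JIRA = "https://your-domain.atlassian.net"
AUTH = ("you@example.com", "your-api-token")   # Jira Cloud email + API token
JQL = 'project = DES AND statusCategory = Done ORDER BY resolved DESC'

def parse(ts: str) -> datetime:
    # Jira timestamps look like 2024-05-16T09:25:00.000+0000
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")

resp = requests.get(
    f"{JIRA}/rest/api/2/search",
    params={"jql": JQL, "expand": "changelog", "maxResults": 50},
    auth=AUTH,
)
resp.raise_for_status()

for issue in resp.json()["issues"]:
    entered_design = ready_for_dev = None
    for history in issue["changelog"]["histories"]:
        for item in history["items"]:
            if item["field"] != "status":
                continue
            if item["toString"] == "In Design" and entered_design is None:
                entered_design = parse(history["created"])
            if item["toString"] == "Ready for Dev":
                ready_for_dev = parse(history["created"])
    if entered_design and ready_for_dev:
        days = (ready_for_dev - entered_design).total_seconds() / 86400
        print(f'{issue["key"]}: {days:.1f} days from "In Design" to "Ready for Dev"')
```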
Happy scaling through design!
Hey, I’m Jiri Mocicka.
London-based Design Director, Trusted Advisor and Author of Design at Scale™ – a method that empowers individuals to shape the future organisation through design.
If you have a question, join our Community and reach out to like-minded individuals who scale design propositions. The online Academy can help you find your feet in teams of 01, 10 and 100, supported by the Grid Magazine and Supply sections, where we bring weekly insights on how to become a design leader in your organisation.