How Solid Software Became an Elite Software Team Using Git Analytics
455%
More Throughput
89%
Faster Cycle Time
0.48%
Change Fail Rate
Solid Software builds amazing Flutter applications for startups and big tech companies worldwide. After the company was selected for Google's Flutter early adopters program, CEO Illia Romanenko wanted to make the most of that expertise by scaling Solid Software to accommodate Flutter's public release.
With the help of Matt Edwards of the Mobile Native Foundation, Illia ran 13 experiments to improve his team's delivery. By using Haystack to spot blockers and measure the impact of those experiments, Illia set his team on a path of continuous improvement.
"Haystack was able to introduce a new vocabulary for how we assessed team performance using data. So instead of just people's opinions, or looking at a group now, on a day-to-day level, and then zooming out on a quarterly basis, our teams are getting what they need to be successful."
Matt Edwards - Mobile Native Foundation
The Challenge: Scaling With the Right Processes
Solid Software started the year as a small team. As one of the few early adopters of Flutter selected by Google, they were in high demand as Flutter was introduced to the public.
“When we started we were a little bit smaller, and Flutter was just introduced to the public. We were always preparing and improving, but we wanted to have a sort of platform to bring out all of our experience to some of the projects. And we wanted to assemble the team quickly, and make sure that we are working effectively.”
Illia Romanenko - CEO & CTO, Solid Software
This meant they had to scale fast and improve their software delivery process at the same time. With no clarity into what was working, they struggled to identify the right changes to make. To overcome this, Illia worked with Haystack and Matt Edwards of the Mobile Native Foundation. With clear insights, they ran a series of experiments to start down a path of continuous improvement.
“We needed a standard definition for what a high-performing team looked like. And what I found is every team had their definition of it and their OKRs. And our planning process wasn’t quite providing clarity to stakeholders on which teams are deserving of recognition or reward and which ones need help.”
Matt Edwards - Mobile Native Foundation
Journey
With a key focus on process, automation, and tracking, Illia ran 13 experiments with his team. Matt, who was helping 30 other teams, watched Illia's team rapidly improve with their healthy approach to experimentation.
“Some of the teams outside of my work, they had given up, they had kind of fatigue around process and some elements of process resistance. So they just didn't track work on tickets, and they didn't have a process, work just kind of happened as it was assigned to them.”
Matt Edwards - Mobile Native Foundation
Solid Software first tried breaking work into small tasks and introducing limits on the size of pull requests. This gave the team a clear way to measure predictability and to deliver with better quality, faster.
They introduced code reviews to help maintain code quality and a low change failure rate, and added new “team rules” targeting poor code and documentation, based on issues that would usually surface after code reviews.
They then doubled down on team collaboration, introducing processes like a buddy system in which developers worked in pairs when testing code. They also improved their knowledge sharing by focusing on better documentation and tooling, allowing them to work seamlessly as a remote team while onboarding new members.
“Once we moved into the implementation phase, there wasn’t much room left for mistakes, we solved all the issues beforehand.”
Illia Romanenko - CEO & CTO, Solid Software
When it came to automation, Solid Software built and adopted as many tools as they could to minimize the time spent on, and the errors introduced by, tasks that would otherwise have been done manually.
“Our idea was just to automate everything that we can. As we know, manual steps, they're expensive to execute and they're error-prone, and they take more time. So for example, when we wanted to reduce PR sizes, a PR size labeler was a very handy indicator of the complexity of the increments. That helped us see if they were small enough increments or not, which helped us to do better and split better on the next one.”
Illia Romanenko - CEO & CTO, Solid Software
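The PR-size labeler Illia mentions can be sketched in a few lines. The thresholds and label names below are illustrative assumptions, not Solid Software's actual configuration; only the roughly 500-LoC budget comes from this case study.

```python
# Minimal sketch of a PR-size labeler: bucket a pull request by total
# lines changed so its complexity is visible at a glance.
# Thresholds and label names are illustrative assumptions.

def size_label(additions: int, deletions: int) -> str:
    """Return a size label for a PR based on total lines changed."""
    total = additions + deletions
    if total < 50:
        return "size/XS"
    if total < 200:
        return "size/S"
    if total < 500:
        return "size/M"   # still within the ~500 LoC budget
    return "size/XL"      # over budget: a hint to split the next increment

if __name__ == "__main__":
    print(size_label(30, 10))    # a small, focused change
    print(size_label(400, 250))  # over the 500-line budget
```

In practice a labeler like this would run in CI on each pull request and attach the label via the code host's API; the point is simply that the signal "is this increment small enough?" becomes automatic rather than a judgment call.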
As Illia’s team introduced 13 different process changes and automations, they began to track the effects of each change. They could then assess which ones made them better and which didn’t, allowing them to figure out the best way to improve their team and to experiment frequently with confidence.
“Haystack was very helpful in finding out, are we helping the team with the change? Are we actually kind of causing some frustration? And I think by doing experiments one at a time, and then having some data where we could all align on what success looks like, before rolling out the experiment. That was kind of how I felt comfortable that it wasn't actually too much change happening.”
Matt Edwards - Mobile Native Foundation
Results
With a drive for experimentation and access to actionable insights, Solid Software was able to set their team on a path of continuous improvement.
“Looking at all the changes as a whole, seeing that your work actually produces results, and having some data to support that. That's something that made me very proud.”
Illia Romanenko - CEO & CTO, Solid Software
Solid Software was experiencing a dip in productivity as the year began. Their average cycle time was 7.2 days, with some PRs taking as long as 20 days to complete. Through their experimentation, cycle time fell 89% by the end of the year, to an average of 1.1 days, with much greater consistency.
This was due in part to consistently reducing the size of their pull requests until they found an optimal size for their team: fewer than 500 lines of code.
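Cycle time is the kind of metric Haystack derives automatically from git data. As a rough illustration of the arithmetic only, the sketch below measures cycle time as PR opened to merged and averages it; exact definitions vary between analytics tools, and the sample timestamps are hypothetical.

```python
from datetime import datetime

def average_cycle_time_days(prs):
    """Average cycle time in days, measured here as PR opened -> merged.
    (Exact definitions of cycle time vary between analytics tools.)"""
    seconds = [(merged - opened).total_seconds() for opened, merged in prs]
    return sum(seconds) / len(seconds) / 86400  # 86400 seconds per day

# Hypothetical sample: two PRs taking 1 day and 1.5 days to merge.
sample = [
    (datetime(2021, 1, 4), datetime(2021, 1, 5)),
    (datetime(2021, 1, 6), datetime(2021, 1, 7, 12)),
]
print(average_cycle_time_days(sample))  # 1.25
```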
At the same time, Solid Software’s throughput increased by 455%, as they effectively delivered software at a much higher rate.
These improvements culminated in a 200% increase in deployment frequency, as Solid Software went from 4 deployments a week to 12 in the same time frame. They achieved this whilst maintaining a high standard of code quality, indicated by a change failure rate of 0.48% and an average recovery time of only a few seconds. The team deployed much faster with confidence, matching the quality of elite performers.
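For intuition, change failure rate is simply failed deployments over total deployments. The counts below are hypothetical, chosen only to land near the reported 0.48%.

```python
def change_failure_rate(failed: int, total: int) -> float:
    """Fraction of deployments that caused a failure in production."""
    return failed / total

# Hypothetical counts: 1 failed deployment across 208 total ≈ 0.48%.
print(f"{change_failure_rate(1, 208):.2%}")  # 0.48%
```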
“There was something that we could actually see the impact with. And I think that the team was pretty happy with all of that. It gave us some meaning and purpose for what we were doing. It's not just building some software, but building it right, building it properly and improving it.”
Illia Romanenko - CEO & CTO, Solid Software
Matt shared Solid Software's success with all the other teams he was working with, using them as an example of how elite teams should operate and standardizing the same North Star metrics that they tracked through Haystack.
“Haystack was really helpful in using a standard that everybody could understand, both for the folks on the team and the managers, to just get a clear sense of what does success look like? And how do we define that objectively, in a way that can't be easily gamed? And it replaced some other methodologies that we had been using prior which were not accurate, such as counting the number of commits.”
Matt Edwards - Mobile Native Foundation