News

New Report Offers Data-backed Benchmarks on Engineering Team Performance

Continuous integration and delivery (CI/CD) platform provider CircleCI today announced the findings of its annual report on the state of software delivery ("2020 State of Software Delivery: Data-Backed Benchmarks for Engineering Teams"). In this year's report, the CircleCI researchers set the first-ever benchmarks for teams practicing CI/CD.

"The rollercoaster of 2020 has highlighted the competitive differentiator that being a well-oiled software delivery team provides," the report's introduction offers. "The minute Covid-19 hit and every organization had to become not just remote-first but remote-only, many teams were forced to reckon with the number of manual processes they had in place. Suddenly, they could no longer rely on the fact that there was a build machine under someone's desk, and if that machine had a problem, they could just reboot it. Suddenly, they needed to automate everything."

Based on an examination of more than 55 million data points from more than 44,000 organizations and 160,000 projects, the report surfaces patterns of "software engineering teams in the wild" in the age of Covid-19, and seeks to answer the question: "What do high performing engineering teams look like, quantitatively?"

The benchmarks CircleCI identified for teams practicing CI/CD include (a sketch of how a team might compute these metrics for itself appears after the list):

  • Throughput: the number of workflow runs matters less than being at a deploy-ready state most or all of the time.
  • Duration: teams should aim for workflow durations in the range of five to ten minutes.
  • Mean time to recovery: teams should aim to recover from any failed runs by fixing or reverting in under an hour.
  • Success rate: success rates above 90% should be your standard for the default branch of an application.
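
For teams that want to see where their own pipelines stand against these benchmarks, the arithmetic is straightforward. Below is a minimal, illustrative Python sketch that computes duration, success rate, mean time to recovery, and throughput from a hypothetical list of workflow runs; the data layout, field names, and metric definitions are assumptions made for demonstration and are not drawn from CircleCI's report or platform. In practice the run data would come from a CI provider's API or build logs rather than a hard-coded list.

```python
# Illustrative sketch only: the run data, field layout, and metric definitions
# below are assumptions for demonstration, not CircleCI's report methodology.
from datetime import datetime
from statistics import mean, median

# Hypothetical workflow runs on a default branch: (started_at, finished_at, succeeded)
runs = [
    (datetime(2020, 4, 1, 9, 0),   datetime(2020, 4, 1, 9, 7),   True),
    (datetime(2020, 4, 1, 13, 0),  datetime(2020, 4, 1, 13, 9),  False),
    (datetime(2020, 4, 1, 13, 40), datetime(2020, 4, 1, 13, 48), True),
    (datetime(2020, 4, 2, 10, 0),  datetime(2020, 4, 2, 10, 6),  True),
]

# Duration: the report suggests aiming for roughly five to ten minutes per workflow.
durations = [(end - start).total_seconds() / 60 for start, end, _ in runs]
print(f"median duration: {median(durations):.1f} min")

# Success rate: above 90% is the suggested standard for the default branch.
success_rate = sum(ok for _, _, ok in runs) / len(runs)
print(f"success rate: {success_rate:.0%}")

# Mean time to recovery: here measured as the gap between a failed run finishing
# and the next successful run finishing; the benchmark is under an hour.
recoveries, last_failure = [], None
for start, end, ok in sorted(runs):
    if not ok:
        last_failure = end
    elif last_failure is not None:
        recoveries.append((end - last_failure).total_seconds() / 60)
        last_failure = None
if recoveries:
    print(f"mean time to recovery: {mean(recoveries):.1f} min")

# Throughput: average runs per day (the report's median was 0.7 runs per day).
first = min(start for start, _, _ in runs)
last = max(end for _, end, _ in runs)
days = (last - first).days + 1
print(f"throughput: {len(runs) / days:.1f} runs per day")
```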

"If 2020 has shown as anything, it's that people need to be able to work in ways where they feel comfortable, with reasonable expectations and support in order to avoid burnout," said RedMonk analyst James Governor, in a statement. "One really useful aspect of the report's findings then is a focus in setting reasonable key metrics and performance benchmarks."

The list of key findings in this year's report includes:

  • Shortly after COVID-19 shut down our economy, developers stepped up. In April 2020, developers pushed more code than in any other month this year and also had the highest success rate, and the increase in code pushes extended into the weekends. A lift in overall success rates indicates that teams are taking a step back from innovation and focusing on shoring up their business-critical systems.
  • The majority of organizations have not mastered DevOps. Year after year, we hear in the industry that high-performing teams are deploying dozens of times per day. While those stats make for great conference talks, our data doesn't corroborate this metric: half of the workflows in the data set ran less often than once per day, with a median of 0.7 runs per day.
  • In 2020, developers moved slower, tested more, and failed less as a result. Workflows took longer to run because teams did more testing, which resulted in fewer failures, and the longest recovery times recorded in 2020 were shorter than those in 2019. With fewer failed builds, the overall velocity of organizations increased compared to 2019.
  • Globally distributed organizations have a significant software development advantage. How fast companies can fix their mistakes and recover is key to winning, and half of all organizations resolved their failed builds in under one hour. CircleCI's hypothesis has to do with distribution: an organization with developers all located in one town and working typical business hours may not be able to resolve a failed build before heading home for the day, while an organization with a global development team likely has processes in place to hand off work between teams, eliminating that daily delay.
  • Controversial branch naming did not change despite the Black Lives Matter movement. Earlier this summer there was a lot of media attention and social media chatter around the name teams use for the default branch in their version control systems, with the discussion centering on renaming the default branch of projects from 'master' to 'main' or another option. Despite widespread agreement and intentions, the data doesn't show much movement by organizations toward renaming their default branches away from 'master' (a rough sketch of how a single organization might spot-check its own default branches appears below).
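
The branch-naming finding comes from CircleCI's own platform data, but a curious engineering organization could spot-check its public repositories with a few lines of code. The sketch below is hypothetical: it uses GitHub's REST API, the organization name is a placeholder, and it only inspects the first page of results rather than paginating through every repository.

```python
# Illustrative sketch only: spot-checks default-branch names across one
# organization's public GitHub repositories. The org name is a placeholder,
# and only the first page of results (up to 100 repos) is inspected.
import requests

ORG = "example-org"  # hypothetical organization name
resp = requests.get(f"https://api.github.com/orgs/{ORG}/repos",
                    params={"per_page": 100})
resp.raise_for_status()

# Tally how many repositories use 'master' (or any other name) as their default branch.
counts = {}
for repo in resp.json():
    branch = repo.get("default_branch", "unknown")
    counts[branch] = counts.get(branch, 0) + 1

for branch, count in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{branch}: {count} repositories")
```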

"As the world's largest standalone CI provider, we have a unique opportunity to investigate what software delivery looks like quantitatively," said CircleCI CEO Jim Rose, in a statement, "across tens of thousands of teams, commit by commit. So, we looked to data from 11 million workflows on our platform to see how teams were building and deploying software in practice to answer this question once and for all: What does a high performing team really look like?"

CircleCI's namesake CI/CD platform automates software delivery at scale, allowing teams to build, test, and deploy software quickly so engineering organizations can focus on delivering value to users. The platform sits at the heart of the software delivery process, which gives it unique visibility into how code moves from version control to production.

About the Author

John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].