Plandek co-CEO: 5 areas for Agile team self-improvement

In his role as co-CEO of big data analytics company Plandek, Charlie Ponsonby guest writes for the Computer Weekly Developer Network to examine how teams can get more out of Agile development projects. 

Plandek uses proprietary algorithms to synthesise complex, fuzzy datasets to provide actionable insights designed to improve productivity today and early-warning signs to help mitigate the problem projects of tomorrow.

Ponsonby writes as follows…

Now that Agile is officially mainstream, development teams must not allow old, ingrained habits to resurface and dilute its potential. This is a very real risk.

Agile is, after all, a relative term and fairly meaningless unless qualified. So do you know how agile your development is? One way to embed the culture change required to answer that key question is through self-improvement (SI) processes underpinned by the right agility metrics.

Agile is already closely linked to SI — let’s remember that the Agile Manifesto states: “At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.”

In other words, Agile is about continuous, team-driven SI. The fact that retrospectives rank among the top five Agile techniques underscores SI’s importance (source: State of Agile report).

Nevertheless, SI efforts regularly fail due to inadequate leadership and follow-through. Teams either don’t have the right tools to collect the data or they set the wrong metrics. The latter can be especially problematic when Agile development projects are scaling.

At Plandek and in former positions, our leadership team has had the opportunity to work hands-on with a wide variety of Agile engineering teams – from start-ups to very large, distributed enterprises. Based on these experiences we have been able to identify five critical areas for effective Agile team SI.

#1 Commitment & sponsorship

Agile teams are almost always busy and resource-constrained. As a result, the intention to always improve (in a structured and demonstrable way) often loses out to the pressures of the day job – delivering to the ceaseless demands of the business.

SI calls for a serious commitment from engineering leadership in the form of a structured, long-term and well-implemented SI programme. This includes establishing a monthly cycle of team target setting, implementation and review. These must be supported with robust tools that provide the necessary metrics and reporting.

#2 Agreed metrics

Selecting and agreeing metrics is often the most contentious issue. Failing to reach consensus and buy-in on metrics is a key reason why so many Agile programmes fail.

As the Agile author Scott M. Graffius put it: “If you don’t collect any metrics, you’re flying blind. If you collect and focus on too many, they may be obstructing your field of view.”

By nature, Agile encourages a myriad of different methodologies and workflows that vary from team to team. However, this does not mean that it is impossible to agree on a set of meaningful Agile metrics around which to build a self-improvement programme. Our initial research across more than 100 projects in 12 months has revealed the following Agile metrics to be predictive of better outcomes; all can be tracked and improved in any team:

  1. The key enabler metrics – best practice and tool use. Some team members argue that tool usage (e.g. of Jira) is so inconsistent that data collected from within it is not meaningful (garbage in, garbage out). However, it’s possible and useful to measure the extent to which team members are adhering to simple processes that are part of your software development lifecycle.
  2. Sprint disciplines and consistent delivery of sprint goals (Scrum Agile).
  3. The proportion of time spent/velocity/efficiency of writing new features (productive coding).
  4. Quality and failure rates.
  5. The proportion of time spent/efficiency of bug fixing and re-work.
  6. Teamwork, team wellness and the ability to collaborate effectively.
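To make one of the metrics above concrete, here is a minimal sketch of sprint completion accuracy (item 2) – the share of story points committed at sprint start that were actually delivered. The data structure and field names are invented for illustration, not drawn from any particular tool’s API.

```python
# Hypothetical sketch: sprint completion accuracy as the percentage of
# committed story points that were delivered by sprint end.
# Sprint records here are illustrative stand-ins for real tracker exports.

def sprint_completion_accuracy(committed_points: float, delivered_points: float) -> float:
    """Return delivered/committed as a percentage, capped at 100."""
    if committed_points <= 0:
        raise ValueError("sprint must have committed points")
    return min(delivered_points / committed_points, 1.0) * 100

sprints = [
    {"name": "Sprint 41", "committed": 34, "delivered": 29},
    {"name": "Sprint 42", "committed": 30, "delivered": 30},
]

for s in sprints:
    pct = sprint_completion_accuracy(s["committed"], s["delivered"])
    print(f"{s['name']}: {pct:.0f}% of committed points delivered")
```

Tracked sprint over sprint, the trend in this figure matters more than any single value – a team that consistently delivers close to what it commits is planning well.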

#3 Data context

A metric on its own is only meaningful when viewed in the right context. This means you need to harvest and combine the right data sources. To gather meaningful insights into the team processes, data should come from systems that developers use including workflow management software like Jira; code repositories like GitHub, Bitbucket, TFS or Gitlab; code quality tools like Sonarqube; and time tracking systems like Harvest or Tempo.
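The idea of combining sources can be sketched simply: join records from a workflow tool and a code repository on a shared ticket key, so a cycle-time figure gains the context of how much code actually changed. The records and field names below are assumptions for illustration, not real Jira or GitHub API shapes.

```python
# Illustrative sketch: put a workflow metric in context by joining it
# with repository data on a shared ticket key. Both lists stand in for
# exports from tools like Jira and GitHub; all fields are invented.

jira_issues = [
    {"key": "PROJ-1", "type": "Bug", "cycle_days": 4},
    {"key": "PROJ-2", "type": "Story", "cycle_days": 7},
]

commits = [
    {"issue_key": "PROJ-1", "lines_changed": 40},
    {"issue_key": "PROJ-2", "lines_changed": 310},
    {"issue_key": "PROJ-2", "lines_changed": 55},
]

# Aggregate code changes per ticket, then attach them to the workflow view.
changes_by_key: dict[str, int] = {}
for c in commits:
    changes_by_key[c["issue_key"]] = changes_by_key.get(c["issue_key"], 0) + c["lines_changed"]

for issue in jira_issues:
    issue["lines_changed"] = changes_by_key.get(issue["key"], 0)
    print(issue)
```

With both views joined, a seven-day cycle time reads very differently for a 365-line story than for a one-line tweak.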

#4 Tracking metrics in near real-time

Once metrics are agreed, it’s essential to make them available to teams in as near to real-time as possible without creating added work. This means not expecting people to collect data manually. Besides creating extra work, the data will be retrospective and therefore less useful. Look for tools that give teams the metrics they need in near real-time, without requiring manual input.

#5 Celebrate success

Once you’ve agreed what success looks like, it’s important to pause and celebrate when teams reach key milestones. You want people to regard SI as a positive and motivating process. SI lends itself well to gamification and communicating recognition to the most-improved teams, competence leaders, centres of excellence and so forth. Celebrating success shouldn’t be seen as a ‘fluffy’ optional, but as essential to reinforcing the right behaviours and raising morale. Also: teams that measure SI gather the data necessary to win Agile development awards.

Well-implemented SI programmes can deliver profound and sustained improvement in metrics that are indicative of productivity and timing accuracy. Typical results that we see are:

  • 10%+ velocity improvement
  • 10%+ improvements in flow efficiency
  • 15%+ reduction in return rates and time spent reworking tickets (returned from QA)
  • 30%+ improvement in sprint completion accuracy (Scrum Agile)
  • Greatly improved team collaboration and team wellness.
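One of the figures above, flow efficiency, is simply the time a ticket spends being actively worked on as a share of its total elapsed time. A minimal sketch, assuming hypothetical workflow state names and durations:

```python
# Hedged sketch: flow efficiency for a single ticket, computed from
# hours spent in "active" workflow states versus total elapsed hours.
# The state names and the sample durations are invented for illustration.

ACTIVE_STATES = {"In Progress", "In Review"}

def flow_efficiency(state_durations: dict[str, float]) -> float:
    """Active hours divided by total hours, as a percentage."""
    total = sum(state_durations.values())
    active = sum(h for state, h in state_durations.items() if state in ACTIVE_STATES)
    return active / total * 100 if total else 0.0

# A ticket that spent most of its life waiting: 15 active hours out of 60.
ticket = {"To Do": 30, "In Progress": 10, "Blocked": 15, "In Review": 5}
print(f"Flow efficiency: {flow_efficiency(ticket):.0f}%")
```

Low flow efficiency usually points at queues and hand-offs rather than slow coding, which is why it responds so well to a structured SI programme.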

Without question, SI calls for major cultural change and sustained effort. Technology leadership needs to give teams the tools (and encouragement) to drive the process. Without ongoing support, many teams will not have the time or inclination to drive their own SI as they strive to meet their short-term delivery objectives.

Fortunately, new tools are becoming available that help teams get SI programmes right and genuinely monitor and improve agility. The results we have seen in practice show that it is well worth the effort.

Ponsonby: without question, self-improvement calls for major cultural change and sustained effort.
