Capability development—still a work in progress (2)

In last week’s post we presented a potted summary of two of the four main areas of difficulty within the Defence capability development process identified by the Australian National Audit Office (ANAO). Today we’ll finish with the other two, and offer a few thoughts of our own.

We should caution that the perspectives here are necessarily framed by the audit office report. In fairness, we’re going to talk to our friends in Defence as well, and we’ll report later on what we find. As we mentioned last week, the ANAO’s focus is very much on compliance and is inherently backward looking, so we might see a different picture when we look through a different lens. But for now, on with the overview.

Improving accountability and advice during project implementation

The audit report observes that ‘further work is required to improve accountability’. That’s hardly a revelation; accountability (or, more accurately, the lack of it) within Defence has been a recurring theme of successive reviews dating back to the Tange era. More interesting in this case is the auditor’s approach of drilling down into the implementation of previous measures intended to improve accountability.

One of the major Kinnaird recommendations from 2003, for example, was that ‘Capability Managers should have the authority and responsibility to report to government on the development of defence capability at all stages of the capability cycle’. Ten years on, the reporting that’s happening still doesn’t satisfy the ANAO, although Defence has agreed with the auditor’s recommendations for a more thorough reporting scheme. As a small insight into the Byzantine world of Defence committees, ANAO informs us that the Capability Development Reform Stream Governance Committee handed over reform activities to the Capability Development and Materiel Reform Committee in 2012. (Judean People’s Front, anyone?)

The ANAO concludes, perhaps a little wistfully:

In September 2013, Defence further advised that [the] Recommendation had been ‘closed by process’ (but not outcome) at a meeting of its CDMRC on 27 August 2013. However, there is no evidence of the envisaged reports having yet been produced.

Reporting on progress with reform

There’s naturally been a great deal of interest in the progress of capability development and acquisition reform, and the ANAO has looked closely at DMO’s reporting on implementation of the Mortimer Review recommendations—both internally and through its evidence to parliamentary committees. As you might expect, the fun starts when the auditors compare DMO’s internal and external reporting. The following self-explanatory quote captures the thrust of the auditor’s forensics:

Defence’s August 2011 response to the Senate Committee, in which it said that it had ‘fully implemented’ 29 Mortimer recommendations, provided limited information and had the potential to give an impression of greater progress than had actually been achieved.

Once again, the issue hinges on the notions of ‘process’ versus ‘outcomes’. The Senate Committee was told of the strides being made from a process perspective, but not of the less favourable situation prevailing in terms of actual outcomes. Defence’s less-than-complete disclosure of progress wouldn’t have come as a surprise to the ANAO. In an earlier report on Defence’s implementation of audit recommendations, the ANAO exposed a worrying gap between claimed and actual progress.

But to a point that’s not surprising. DMO’s closure framework for the Mortimer recommendations is a two-step approach. First, the recommendation is implemented by introducing the appropriate process. Once the new or amended process is in place, the recommendation is considered ‘closed by process’. The next step is to gather information to ensure the process is producing the desired effect—and only when that’s available is the recommendation ‘closed by outcome’. This is one of the difficulties of implementing and then measuring reform in a process that can be many years from end to end.

One area where DMO tries to measure and report outcomes is the extent to which projects are delivered on schedule—the message being that things have improved as a result of recent reforms. Yet here again the ANAO finds fault, citing a range of issues with the metrics and benchmarks used. Fair enough, that’s their job. But the discussion misses the larger picture. To start with, it would be surprising if schedule performance wasn’t improving, given the recent shift in favour of off-the-shelf purchases. More importantly, post-approval schedule performance is a poor measure of the effectiveness of reform in DMO. In most instances, schedule performance is much more a measure of (1) industry’s performance and (2) the inherent riskiness of projects than it is a diagnostic of DMO’s performance.


We hope that the outcome of this audit will be positive. But it’s always possible that the result of ‘reform’ will be the addition of even more layers of review and reporting to an already process-heavy system. That would make it even harder for the no-doubt dedicated staff of the Capability Development Group to focus on outcomes; more process and documentation is the last thing they need. In fact, where we are today in many ways reflects the response to past reviews of Defence decision making and capability development—which has been to add even more complexity to the process and more stakeholders to the committees (thus further diffusing accountability).

We note that the Capability Development Group in Defence is now well into its own Capability Development Improvement Program, which began during the ANAO audit and aims to address many of the points raised, as well as some self-identified problems. Between the audit recommendations and Defence’s own initiatives, hopefully we’ll begin to see better cost and schedule estimates, and better capability outcomes—which is what we really care about. To the extent that we can with limited public data, we’ll be watching the metrics for any such improvements.

Andrew Davies is senior analyst for defence capability at ASPI and executive editor of The Strategist. Mark Thomson is senior analyst for defence economics at ASPI. Image courtesy of Flickr user Red~Cyan.