Tuesday, August 13, 2013

Evidence of Testing

For K. J. Ross & Associates, it was with some relief and vindication that the Commissioner's report from the Queensland Health Payroll System Commission of Inquiry was publicly released last week.

The Queensland Health (QH) Payroll System was a SAP/Workbrain system built to calculate pay for QH's 78,000 staff.  After go-live in March 2010, the system did not perform adequately, with terrible consequences for the employees of QH and equally serious financial consequences for the State.

"The replacement of the QH payroll system must take a place in the front rank of failures in public administration in this country. It may be the worst." (2.15, p. 12)
"It is estimated that it will cost about $1.2B over the next eight years." (2.14, p. 12)
Since the completion of User Acceptance Testing in January 2010, K. J. Ross & Associates has maintained confidentiality about its role in the payroll system implementation under its contractual terms.  At times this has been a source of significant frustration: we have a record of involvement in a failed project, while confident that we performed our role diligently and professionally.  Often, when we explain that we are a software testing company and describe what we do, members of the public say, "you should have tested that payroll system".  While it is easy to reply that we did test the system and recommended that it not be accepted, we are sure there is scepticism in their minds.

It is a relief now that there is a public record of our role:
"Standards which had been preset to ensure that the system when delivered would function adequately were lessened or avoided so as to permit implementation." (1.24, p. 215)
"The risk of doing so was clear and was made explicit by KJ Ross and Associates Pty Ltd, engaged to conduct User Acceptance Testing." (1.25, p. 215)
The inquiry made significant note of K. J. Ross's witness, UAT Test Manager Mr Brett Cowan:
"Mr Cowan conducted the exercise he was retained to carry out competently and professionally. Moreover, his approach to that task stands as a rare example in this Project of a cogent warning provided to the parties of problems which clearly existed, but which both parties had to some extent been reluctant to accept and to resolve. Had the warnings he gave in his Final UAT Report been heeded, then the system ought not to have gone live when it did and many of the problems experienced by the system would have been avoided." (5.61, p. 123)
The analysis of testing features prominently, occupying over 14 of the report's 225 pages.  It is also a story of many other project failures, within which the testing sits.  For testing practitioners (and project managers) it is well worth reading.

There is too much detail to give a short summary here.  However, we are currently preparing an article that presents the testing findings in a form suitable for educating and informing testing practitioners.  The goal is to identify what testing obligations must be fulfilled in a legal sense, as this was the position examined during the inquiry.

The entrails of this failure need to be picked apart; otherwise it will be a recurring theme in IT system implementations.  We are also planning a series of panel sessions involving participants in the inquiry to encourage discussion and debate.