John Hancock


By John Hancock

The editors of this publication asked me to begin writing software-related articles for The Journal, and I’ve been agonizing over the subject. My first submission was politely rejected, so I’ve been back at the drawing (or writing) board. Writing an article is the easy part; deciding on the subject and tone for a journal publication is the difficult thing. I make no guarantees, but I’ll try to keep this from being too terribly dry or technical. I’m writing about the way AQUA handles the Academy’s quality performance standards.

If you read my bio, you’ll know that much of my military career was spent as an aviation weather forecaster for the U.S. Air Force. Weather people are very much into data collection and analysis. They’re also heavily into quality control, and, for military aviation purposes, they measure specific mission-critical values and, with enormous scrutiny, perform in-depth quality control on forecasts after the fact. When I began working for Priority Dispatch Corp. (PDC), I saw the quality control program, which measures compliance with standards, but I didn’t look into it in any detail. I was too busy learning ProQA Legacy, Paramount, FairCom server, XLerator server, etc. But I do remember wondering what we could be measuring and how those measurements could yield meaningful information.

Eventually, under the mentorship of quite possibly the most knowledgeable AQUA/performance standards person in the known universe, my ignorance was turned around. I was astounded not only by the depth and fairness of the performance standards but also by the sophistication of the AQUA software and the degree of quality improvement it makes possible.

I compared my known world of forecast verification with the performance standards (9a/4a/4b). Their concepts of measuring deviations, on a scale from critical to insignificant, are extremely similar, and those deviations carry weight. Aircrews rely heavily on target weather forecasts; they must see it to hit it. Blow the forecast, and the consequences for aircrews are critical and weigh very heavily on the forecaster’s skill score. It’s non-compliant. Similarly, first responders (fire/medical/police) expect the correct address to rescue a trapped family or save an individual in cardiac arrest. If the calltaker fails to obtain or verify the correct address, that’s terribly critical and carries the same comparative weight on overall call compliance. It’s non-compliant.

Once I had a fairly decent grasp and appreciation of the performance standards, I learned more about AQUA. I was impressed. The software accounts for every necessary facet of the performance standards, including the complicated matrix that determines overall compliance based on the number and types of deviations. The reports are something else to rave about.

The quality improvement suite of reports can show a supervisor how to drill down from overall agency compliance to teams and to individuals, and which areas might need further training at each of those levels. The ACE reports put me in awe. I spent about a week agonizing over the math, with the assistance of an AQUA expert, manually validating the calculations AQUA spits out. If you’re familiar with the ACE reports, you’ll understand when I say it was a painful endeavor; however, it gave me a real appreciation for the programming involved. It was flawless and in some instances even smarter than we were (not a difficult achievement). Instances where we thought it might be miscalculating were in fact our own misunderstanding and misapplication of the performance standards. We were suitably humbled when we figured out the error of our ways.

As with all things, there are bound to be instances of weakness. I’d be fibbing if I stated that the entire process is perfect and totally objective. There’s always room for improvement, which brings up yet another impressive part of the process: We’re always working to make it better. Needless to say, I’m now a strong disciple of the performance standards/AQUA relationship.