432 ◾ Appendix H
For summary-level metrics, the scores can then simply be averaged in each section and overall for a grand total score. The criteria and the sections may optionally be weighted first because some controls are more important than others—we leave this as an exercise for the reader.
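As a starting point for that exercise, the averaging and optional weighting might be sketched as follows. The section names, scores, and weights are purely illustrative assumptions, not values from any real assessment:

```python
# Hypothetical sketch of the summary-level scoring described above.
# Each section holds per-criterion maturity scores (0-100%); sections
# may optionally carry weights reflecting the importance of controls.

def section_score(scores):
    """Average the criterion scores within one section."""
    return sum(scores) / len(scores)

def overall_score(sections, weights=None):
    """Weighted average of section scores; equal weights by default."""
    if weights is None:
        weights = {name: 1.0 for name in sections}
    total_weight = sum(weights[name] for name in sections)
    return sum(section_score(scores) * weights[name]
               for name, scores in sections.items()) / total_weight

# Example with made-up sections and scores:
sections = {
    "Access control": [80, 60, 100],   # section average: 80.0
    "Physical security": [40, 60],     # section average: 50.0
}
print(overall_score(sections))         # equal weights: 65.0
print(overall_score(sections, {"Access control": 2.0,
                               "Physical security": 1.0}))  # weighted: 70.0
```

Weighting a section more heavily pulls the grand total toward that section's average, which is the intended effect when certain controls matter more to your risk profile.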
The individual ratings for each row in the tables, along with your notes and perhaps the evidence you gathered, may prove useful for information security professionals tasked by management with improving the scores.
As well as using the maturity scale method to score small organizations or individual departments and facilities directly, we have used a more detailed version of the matrix to assess large organizations' compliance with the ISO27k standards. The method involves a team of qualified IT auditors assessing, scoring, comparing, and contrasting business units using common criteria similar to those shown here plus an accompanying ISO27k audit checklist. It works extremely well and has proved popular with management.
By the way, please do not assume that 100% is the target or ideal score in every case or, for that matter, that 0% is necessarily an outright fail. Risk analysis is an integral part of the ISO27k approach, and your risks (and, hence, the appropriate controls) are not the same as everyone else's. These are entirely generic scoring scales. Some controls might not be appropriate in your organization, and others might not go far enough.
Tip: If you are blessed with a progressive management, the scores lend themselves to the publication of corporate league tables that motivate underperforming business units to review their approach to information security and encourage the transfer of good practices from their better peers. Be aware, however, that bad scores can generate serious resentment, so be careful if you take this approach—you might, for example, offer underperforming business units a grace period to get their act together before reassessing them and publishing the numbers.