vested interests. Data management is the primary element for correct deployment
of statistical techniques, and ensuring transparent data collection is an important
characteristic of TQM. Data is the raw material on which statistics and statistical
techniques are built, and hence top management's active interest and superintendence
are required to ensure that there is no laxity or manipulation in data
management. Data generated by properly implemented statistical techniques gives
clues and guidance to anticipate and correct non-conformity and reduce variability
in the system.
Data is used in all areas of statistics, viz. descriptive statistics, inductive statistics
and probability. Data is also the basis on which corrective and preventive actions
are taken and reviewed, using measurement and benchmarking. Process control,
variation control, Six Sigma initiatives and even Balanced Scorecards are based
on data analysis, and the success or failure of such initiatives is also indicated by
data. Customer satisfaction or delight, too, is gauged through data. Data is
fundamental to the TQM axiom that what cannot be measured cannot be improved.
Appropriateness and accuracy of raw data are crucial, as data is processed
through statistical steps until useful information is generated. Unfortunately, there
is often a lack of discipline in data planning, collection and analysis. The TQM
philosophy says, 'Speak with data and act on data'. Data reflects the real state of
affairs, gives warning through trends, and throws up opportunities for improvement.
Whatever is done or not done needs to be captured in time and analysed for
continuous improvement. Accomplishment is only authenticated by data.
Top management has to lay a
lot of stress on the need for honest and efficient data management and a work
culture where data is not something alien or fearsome but a valuable input for
decision-making and improvement. Top management should take extra care to
train and motivate people in data collection and relevant statistical techniques.
Management should also institutionalise systems and procedures to ensure
calibration, training and checking of measuring instruments and sensors to avoid
measurement errors.

The frequency distribution graph is a fundamental tool: it gives a representative
picture of the nature and extent of total variation, and it can be used to study the
variation again after improvements. Walter A. Shewhart, the statistician and quality
philosopher of the 1920s, observed that if measurements from a process or machine
are graphed, they often form a bell-shaped frequency distribution, which has been
likened to a London bobby's hat. This distribution curve is also called the normal
or Gaussian distribution. The 'central limit theorem' explains why this shape recurs:
if a distribution of measurements is graphed from a chance (constant cause) system,
it does not matter whether the shape of the distribution of individual values (the
population) is triangular or trapezoidal; the averages of different sized samples
will have a similar central tendency and will be distributed in a bell shape. This
bell-shaped distribution occurs frequently in business and nature. A normal curve
has three important characteristics.
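The following minimal sketch, in Python with NumPy (neither appears in the original text; they are assumed here purely for illustration, and all names in it are hypothetical), demonstrates the theorem: samples are drawn from a deliberately non-normal, triangular population, and the sample averages nonetheless pile up in a bell shape.

```python
import numpy as np

# Illustration of the central limit theorem described above.
rng = np.random.default_rng(seed=42)

# A population of individual values with a deliberately non-normal
# (triangular) shape, as in the text's example.
population = rng.triangular(left=0.0, mode=2.0, right=10.0, size=100_000)

# Draw many samples from this population and record each sample's average.
sample_size = 30
n_samples = 5_000
sample_means = np.array([
    rng.choice(population, size=sample_size).mean()
    for _ in range(n_samples)
])

# A crude text histogram: despite the triangular population, the sample
# averages form a bell-shaped frequency distribution around its mean.
counts, edges = np.histogram(sample_means, bins=15)
for count, left in zip(counts, edges[:-1]):
    print(f"{left:6.2f} | {'#' * (count // 20)}")
```

Graphing the raw population the same way would show its triangular shape, underlining that the bell shape belongs to the sample averages, not to the individual values.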