Skewed right distribution
A right-skewed distribution has most of its measured points on the left side of the
distribution and fewer on the right, indicating that most data points lie in the lower
range of the X axis.
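The mean–median relationship gives a quick numerical check for this shape. Below is a minimal Python sketch (the exponential data is an illustrative assumption, not from the text): in a right-skewed sample the long tail on the right pulls the mean above the median.

```python
# Illustrative sketch: in a right-skewed sample most values sit at the low
# end of the axis, and the long right tail pulls the mean above the median.
import random
import statistics

random.seed(1)
# Exponentially distributed values are a classic right-skewed example
# (an assumption made here for illustration).
data = [random.expovariate(1.0) for _ in range(10_000)]

mean = statistics.mean(data)
median = statistics.median(data)
print(mean > median)  # True for a right-skewed sample like this one
```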
A random distribution has no recognizable pattern and does not reveal much useful information.
The numbers resulting from the computation of 'mean' and 'standard deviation'
from sample data are called 'statistics'; in other words, a statistic is a measure that
describes the characteristics of a sample. Statistics is the science that deals with the
collection, collation, classification and analysis of the data or information contained
in a sample, and is used to make inferences about population parameters that are unknown.
Statistics can be subdivided into three broad categories (see Table 7.6). Of the
three types of statistics, 'descriptive statistics' is the study of organizing, summarizing
and displaying data; 'statistical inference' derives information about a population
from the characteristics of a sample; and 'probability' reaches conclusions about a
sample by studying the population.
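As a concrete illustration of descriptive statistics, the following sketch computes the 'mean' and 'standard deviation' statistics from a small hypothetical sample (the measurement values are invented for illustration):

```python
# A minimal sketch of descriptive statistics: computing the 'statistics'
# (mean and standard deviation) from hypothetical sample data.
import statistics

# e.g. eight measured shaft diameters in mm (illustrative values)
sample = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]

mean = statistics.mean(sample)
stdev = statistics.stdev(sample)  # sample standard deviation (n - 1 denominator)

print(f"mean = {mean:.3f}")
print(f"standard deviation = {stdev:.3f}")
```

These two numbers summarize the central tendency and spread of the sample, which is exactly what 'descriptive statistics' in Table 7.6 refers to.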
RELEVANCE OF DATA AND DATA COLLECTION
In TQM philosophy, generating accurate data and using it for continual improvement
is sacrosanct. It is a pity that dishonest managers manipulate data to serve their
Table 7.6: Statistics used in TQM

THREE AREAS OF STATISTICS NORMALLY USED IN TQM

Descriptive Statistics: Describes the characteristics of a product or process based on data available about it. Organizes, presents and displays data in meaningful patterns for study through:
- Arrays
- Frequency distributions
- Frequency polygons
- Pareto diagrams
- C-E diagrams, etc.

Inductive Statistics: Draws conclusions on the population based on knowledge of a sample. The sample has to be representative of the population. Marketing uses statistical inference to find the right sample size to conduct market surveys on a new product.

Probability: Study of a sample based on knowledge of the population (the reverse of statistical inference). Makes statements about the likelihood of a sample having certain features based on information about the population.
vested interests. Data management is the primary element for correct deployment
of statistical techniques, and ensuring transparent data collection is an important
characteristic of TQM. Data is the primary element on which statistics and statistical
techniques are built; hence top management's active interest and superintendence
are required to ensure that there is no laxity or manipulation in data management.
Data generated by properly implemented statistical techniques gives clues and
guidance to anticipate and correct non-conformity and to reduce variability
in the system.
Data is used in all areas of statistics, viz. descriptive, inductive and probability.
Data is also the basis on which corrective and preventive actions are taken and
reviewed, using measurement and benchmarking. Process control, variation control,
Six Sigma initiatives and even Balanced Scorecards are based on data analysis,
and the success or failure of such initiatives is also indicated by data. Customer
satisfaction or delight is indicated by data. Data is fundamental to the TQM axiom
that what cannot be measured cannot be improved.
Appropriateness and accuracy of raw data are crucial, as data is processed
through statistical steps until useful information is generated. Unfortunately, there
is a lot of indiscipline in data planning, collection and analysis. TQM philosophy
says, 'Speak with data and act on data'. Data reflects the real state of affairs, gives
warning through trends, and throws up opportunities for improvement. Whatever
is done or not done needs to be captured in time and analysed for continuous
improvement.
Accomplishment is only authenticated by data. Top management has to lay a
lot of stress on the need for honest and efficient data management and a work
culture where data is not something alien or fearsome but a valuable input for
decision-making and improvement. Top management should take extra care to
train and motivate people in data collection and relevant statistical techniques.
Management should also institutionalise systems and procedures to ensure
calibration, training and checking of measuring instruments and sensors to avoid
measurement errors.
The frequency distribution graph is a fundamental tool that gives a representative
picture of the nature and extent of total variation, and it allows the variation to be
studied again after improvements. The 'central limit theorem', applied to industrial
quality control by Walter A. Shewhart, the statistician and quality philosopher of
the 1920s, establishes that if measurements from a process or machine are graphed,
they often form a bell-shaped frequency distribution, which has been likened to a
London bobby's hat. This distribution curve is also called the normal or Gaussian
distribution. If a distribution of measurements is graphed from a chance (constant
cause) system, it does not matter whether the shape of the distribution of individual
values (the population) is triangular or trapezoidal: the averages of different-sized
samples will have a similar central tendency and will be distributed in a bell shape.
This bell-shaped distribution occurs frequently in business and nature. A normal
curve has three important characteristics.
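The claim above can be checked numerically. The following hedged sketch draws samples from a deliberately non-bell-shaped (uniform) population and shows that the sample averages still cluster tightly around the population mean, as the central limit theorem predicts; the sample sizes and seed are arbitrary illustrative choices:

```python
# Sketch of the central limit theorem: individual values come from a flat
# (uniform) population, yet the averages of repeated samples cluster in a
# bell shape around the population mean of 0.5.
import random
import statistics

random.seed(42)

# 2000 samples of 30 draws each from a uniform, non-normal population
sample_means = [
    statistics.mean(random.uniform(0.0, 1.0) for _ in range(30))
    for _ in range(2000)
]

# The averages centre on 0.5 with a much smaller spread than the
# individual values (whose standard deviation is about 0.289).
print(round(statistics.mean(sample_means), 2))
print(round(statistics.stdev(sample_means), 3))
```

Plotting `sample_means` as a histogram would reproduce the bell-shaped frequency distribution described in the text, even though the underlying population is not bell-shaped at all.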