Page 35 - Methodology for inter-operator and cross-border P2P money transfers
16 GENERAL CONSIDERATIONS ABOUT ERRORS IN MEASUREMENTS
Before entering into specific considerations, the usage and definition of the term ‘Error’ needs to be clarified, as this term is used in several conceptual contexts.

‘Error’ can mean:

I) A statistical error in the sense of an error margin. If a quantity is calculated from a limited number of data samples from a system which exhibits, from the user’s viewpoint, somewhat random behaviour (such as a failure probability), this quantity will not describe the respective property of the system exactly but only within a given margin. This margin can be calculated based on statistical formulae; respective information can be found in ITU-T Recommendation E.840 or E.804. In short, the only way to reduce this error margin is by increasing the number of samples taken.

II) Errors caused by incorrect reading or transmission of readings, i.e., “human error” in the data-collection process. ITU-T Recommendation P.1502 deals extensively with such errors in the context of measurements on DFS. Avoiding such errors requires careful execution of testing and data-collection steps. The check lists and procedures described in the present document are a tool to provide robustness of the measurement process and to reduce the probability of such errors. However, there is always a trade-off between the effort for such checking procedures and the impact of actual undetected errors. In general, single errors will decrease the accuracy of measurements. As far as such errors are effectively random in nature, increasing the number of samples is also a means to reduce their impact on output data quality. Applying cross-checks and “logical tests” is also a way to reduce the probability that such errors go undetected.

For instance, checking the number of samples against the testing time, by estimating expected sample counts and comparing them to the ones actually received, would be a simple first-level way of data quality assurance.

The way the TAL and DAL lists are constructed, and the usage of field test logs, are also expressions of this strategy. It needs to be mentioned, however, that applying a pre-defined recipe alone is not sufficient. Considering the concrete situation in the field, and applying respective checking steps, are equally important parts of an overall error-reduction strategy.

III) Errors caused by operating errors, i.e., a special type of ‘human error’ but with an impact on more than one data point. Examples would be insufficient power supply (low-battery condition), which can cause untypical device behaviour; overheating of devices due to insufficient air flow or exposure to heat sources; or forgetting to activate functions on the devices. The log templates and associated regular checking procedures are designed to provide protection against such errors. Again, these measures need to be complemented by assessment of concrete field situations, and by respective judgment and definition of additional measures based on actual circumstances.

IV) Errors in the implementation of data processing. The way to reduce this risk is to run tests of the algorithms (e.g., SQL queries) with a limited number of data points and to compute reference values manually (typically in a spreadsheet application). Even if pre-defined processing algorithms are provided (e.g., as a set of SQL statements used in previous measurement campaigns), it is advisable to apply such tests unless it is assured that the processing environment is exactly the same.
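The statistical error margin described in item I can be sketched numerically. The following is a minimal illustration using the standard normal-approximation confidence interval for a proportion; the exact formulae recommended in E.840/E.804 may differ, and the sample counts used here are purely illustrative.

```python
import math

def failure_rate_margin(failures: int, samples: int, z: float = 1.96):
    """Estimate a failure probability and its error margin using the
    normal approximation (z = 1.96 corresponds to ~95% confidence)."""
    p = failures / samples
    margin = z * math.sqrt(p * (1 - p) / samples)
    return p, margin

# The margin shrinks only with more samples: quadrupling the sample
# count halves the error margin for the same observed failure rate.
p1, m1 = failure_rate_margin(20, 400)     # 5% failures, 400 samples
p2, m2 = failure_rate_margin(80, 1600)    # 5% failures, 1600 samples
print(f"n=400:  p={p1:.3f} +/- {m1:.3f}")
print(f"n=1600: p={p2:.3f} +/- {m2:.3f}")
```

This illustrates the statement in item I that the only way to reduce the statistical error margin is to increase the number of samples taken.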
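The first-level data quality check mentioned above (comparing expected sample counts against the testing time) can be sketched as follows. The cycle duration and the 10% tolerance are illustrative assumptions, not values prescribed by this document.

```python
def expected_samples(test_minutes: float, minutes_per_sample: float) -> int:
    """Expected sample count from the total testing time and the
    nominal duration of one test cycle (illustrative values)."""
    return int(test_minutes // minutes_per_sample)

def sample_count_plausible(received: int, expected: int,
                           tolerance: float = 0.1) -> bool:
    """Flag a session when the received sample count deviates from
    the expectation by more than the given relative tolerance."""
    return abs(received - expected) <= tolerance * expected

exp = expected_samples(480, 4)            # an 8-hour session, ~4 min/cycle
print(exp)                                # 120 expected samples
print(sample_count_plausible(112, exp))   # within 10%: plausible
print(sample_count_plausible(80, exp))    # large shortfall: investigate
```

A large deviation does not prove an error, but it is a cheap trigger for the closer inspection of logs that the document recommends.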
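The test of data-processing algorithms suggested in item IV can be sketched as a miniature example: run the SQL used in a campaign on a small, hand-checkable data set and compare the result against a reference value computed independently (e.g., in a spreadsheet). The table and column names here are illustrative, not taken from any actual campaign.

```python
import sqlite3

# Small in-memory data set whose correct result can be computed by hand.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transfers (id INTEGER, success INTEGER, duration_s REAL)")
rows = [(1, 1, 12.0), (2, 0, 30.0), (3, 1, 8.0), (4, 1, 10.0)]
conn.executemany("INSERT INTO transfers VALUES (?, ?, ?)", rows)

# The processing query under test: success ratio of transfers.
(sql_ratio,) = conn.execute("SELECT AVG(success) FROM transfers").fetchone()

# Reference value computed manually from the same four data points.
manual_ratio = 3 / 4

assert abs(sql_ratio - manual_ratio) < 1e-9, "processing mismatch"
print(f"success ratio: {sql_ratio:.2f}")
```

Such a check costs little and, as item IV notes, remains worthwhile even for pre-defined SQL statements whenever the processing environment is not demonstrably identical to the one used before.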