17 DATA VALIDATION AND PROCESSING
17.1 Overview
Figure 13 shows a schematic description of the structures and processes for post-processing (see also section
11.2 for a description of the data entities).
Figure 13 – Symbolic overview of data objects and processing
The steps for data cleansing and processing are:
• Complete and validate the TAL; make sure column headers are consistent with database requirements (see also 11.6).
• Complete and validate the DAL against field test logs; make sure column headers are consistent with database requirements (see also 11.7 for reference).
• Cross-validate DAL and TAL; make sure that team names are consistent.
• Cross-validate MSW and ObsTool data; make sure that data source IDs can be resolved to team names and scenario names (see the sketch after this list).
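As an illustration, a minimal sketch of such consistency checks is given below. It assumes that the TAL and DAL are exported as CSV files; the file names and column names used here are hypothetical and not prescribed by this methodology.

```python
# Hypothetical consistency checks for TAL/DAL exports (assumed CSV format).
# File names and column names are illustrative assumptions only.
import csv

REQUIRED_TAL_COLUMNS = {"team_name", "scenario_name", "start_time", "end_time"}  # assumed
REQUIRED_DAL_COLUMNS = {"team_name", "device_id", "data_source_id"}              # assumed


def load_rows(path):
    """Read a CSV export into a list of dictionaries (one per row)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))


def check_columns(rows, required, label):
    """Report column headers that are missing with respect to the database requirements."""
    present = set(rows[0].keys()) if rows else set()
    missing = required - present
    if missing:
        print(f"{label}: missing columns {sorted(missing)}")


def cross_validate_teams(tal_rows, dal_rows):
    """Report team names that appear in the DAL but cannot be resolved in the TAL."""
    tal_teams = {r["team_name"] for r in tal_rows}
    for team in {r["team_name"] for r in dal_rows} - tal_teams:
        print(f"DAL team '{team}' has no entry in the TAL")


tal = load_rows("tal.csv")  # assumed file name
dal = load_rows("dal.csv")  # assumed file name
check_columns(tal, REQUIRED_TAL_COLUMNS, "TAL")
check_columns(dal, REQUIRED_DAL_COLUMNS, "DAL")
cross_validate_teams(tal, dal)
```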
ing device settings. For that purpose, the back-
ground-testing tool should provide the original
17�2 Plausibility and Validity checks NMEA sentences or an adequate equivalent.
• Cross-checking the session/location logs with the
17.2.1 Basics information in the TAL/DAL.
• General yield of data: Check if the number of data items from the MSW and the network performance test roughly corresponds to the overall testing time.
• Cross-checking with GPS data: If the location permits, GPS data can be an important source of information for cross-validation. For instance, the GPS data yield (data points/hour) should correspond to the overall testing time. Also, time information in GPS can provide information for cross-checking device settings. For that purpose, the background-testing tool should provide the original NMEA sentences or an adequate equivalent.
• Cross-checking the session/location logs with the information in the TAL/DAL (see the sketch after this list).
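The checks above lend themselves to simple automation. The following sketch illustrates a data-yield check and a device-clock cross-check against the UTC time carried in an NMEA RMC sentence; the thresholds, expected rates and timestamp handling are assumptions chosen for the example, not values prescribed by this methodology.

```python
# Illustrative plausibility checks; thresholds and expected rates are assumptions.
from datetime import datetime, timezone


def check_data_yield(n_items: int, testing_hours: float,
                     expected_per_hour: float, tolerance: float = 0.3) -> bool:
    """Flag the harvest if the observed yield deviates from the expected rate."""
    observed = n_items / testing_hours
    ok = abs(observed - expected_per_hour) <= tolerance * expected_per_hour
    if not ok:
        print(f"Yield {observed:.1f} items/h vs. expected {expected_per_hour:.1f} items/h")
    return ok


def utc_from_rmc(sentence: str, year_base: int = 2000) -> datetime:
    """Extract the UTC timestamp from a $GPRMC sentence (field 1 = time, field 9 = date)."""
    fields = sentence.split(",")
    hhmmss, ddmmyy = fields[1].split(".")[0], fields[9]
    return datetime(year_base + int(ddmmyy[4:6]), int(ddmmyy[2:4]), int(ddmmyy[0:2]),
                    int(hhmmss[0:2]), int(hhmmss[2:4]), int(hhmmss[4:6]),
                    tzinfo=timezone.utc)


def check_device_clock(device_time: datetime, rmc_sentence: str,
                       max_offset_s: float = 5.0) -> bool:
    """Compare a device timestamp with GPS time to detect wrong device time settings."""
    offset = abs((device_time - utc_from_rmc(rmc_sentence)).total_seconds())
    if offset > max_offset_s:
        print(f"Device clock off by {offset:.0f} s")
    return offset <= max_offset_s


# Example: a 7-hour field day with 412 logged items, expecting roughly 60 items per hour,
# and a device timestamp compared against a (synthetic) RMC sentence.
check_data_yield(412, 7.0, expected_per_hour=60.0)
check_device_clock(datetime(2024, 5, 6, 12, 35, 21, tzinfo=timezone.utc),
                   "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,060524,003.1,W*6A")
```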