18 BACKGROUND TESTING OF MOBILE NETWORKS
18.1 General considerations

The considerations and recommendations on mobile network background testing provided in P.1502 fully apply, as running such tests on the local network does not depend on the test scenario. The following is therefore meant as an additional perspective on the matter.

As outlined in P.1502, the effect of mobile network performance on overall DFS QoS depends on the performance of, and the interplay between, the DFS infrastructure and the network. Only if the DFS infrastructure works very well, i.e., if processing times are consistently short, can mobile network performance become the defining or limiting factor in overall DFS QoS. With slow or strongly fluctuating DFS performance, the influence of the mobile network may not be visible in the output data at all.

Also, whether mobile network performance is of interest at all depends on the overall scope and goals of a campaign. Therefore, the effort invested in testing mobile network performance (e.g., whether devices are allocated for testing, or the budgeted cost of mobile data plans) is typically decided case by case. For instance, if tests are done stationary, spot testing may be sufficient to assess mobile network coverage quality, instead of running data-intensive testing all the time.

18.2 Testing Tools

Generally, all applicable network performance testing tools can be used. For practical reasons, tools which come as an Android app running on “out of the box” mobile phones may be preferable from a cost point of view.

18.3 KPI

The following set of use cases and KPI provides a good overview at reasonable effort. It does not include SMS or USSD, for the following reasons:

• SMS is not a primary transport service for DFS-related information. It is only used to transfer notification SMS to the B party (which is irrelevant in asynchronous mode anyway), and for the A side SMS acts only as a secondary indicator.
• As can be seen in the respective reports (see References), USSD performance has been shown not to be highly correlated with DFS performance. Also, in a multi-network, multi-country campaign, it will be hard to find USSD codes which work for all involved networks. In summary, the effort of including USSD in such campaigns should be carefully considered from a cost-to-value point of view.

The definition of valid TA excludes transactions which were taken via Wi-Fi, were interrupted by the user (“user break”), or are otherwise masked out. Also, joining with the TAL provides an effective time-windowing which excludes TA taken outside the date range of the respective scenario. Since measurements were taken stationary (in the same location), there is, however, no time windowing with respect to MSW time ranges.
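As an illustration only, the following Python/pandas sketch shows one way such a validity filter and TAL join could be expressed. The table layouts and all column names (scenario_id, bearer, user_break, masked_out, start_date, end_date) are assumptions made for this example and are not prescribed by the methodology.

import pandas as pd

# Hypothetical layouts for the transaction records (TA) and the TAL table;
# every column name below is an illustrative assumption.
ta = pd.DataFrame({
    "scenario_id": ["S1", "S1", "S1", "S2"],
    "timestamp": pd.to_datetime(["2023-05-02", "2023-05-03", "2023-06-01", "2023-05-02"]),
    "bearer": ["mobile", "wifi", "mobile", "mobile"],
    "user_break": [False, False, True, False],
    "masked_out": [False, False, False, False],
})
tal = pd.DataFrame({
    "scenario_id": ["S1", "S2"],
    "start_date": pd.to_datetime(["2023-05-01", "2023-05-01"]),
    "end_date": pd.to_datetime(["2023-05-31", "2023-05-31"]),
})

# Validity filter: drop TA taken via Wi-Fi, interrupted by the user, or masked out.
valid = ta[(ta["bearer"] != "wifi") & ~ta["user_break"] & ~ta["masked_out"]]

# Join with the TAL and keep only TA inside the scenario's date range
# (the effective time-windowing described above).
valid = valid.merge(tal, on="scenario_id")
in_window = (valid["timestamp"] >= valid["start_date"]) & (valid["timestamp"] <= valid["end_date"])
valid = valid[in_window]

print(valid[["scenario_id", "timestamp", "bearer"]])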
In contrast to standard MDR averaging, MDR values are taken over all TA, including unsuccessful ones. This avoids the bias towards higher expected values which occurs when timed-out transactions are excluded from averaging.

ST values are calculated over successful TA only, to avoid inconsistencies caused by clipping. When interpreting the data, success rates therefore need to be considered along with ST values.
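These two averaging rules can be illustrated with a minimal sketch under the same assumptions as above; the sample values and column names are hypothetical, and only the aggregation logic reflects the rules described here.

import pandas as pd

# Illustrative per-transaction results after the validity filtering above;
# "mdr_kbit_s" and "st_s" are assumed column names, not defined by this report.
results = pd.DataFrame({
    "success": [True, True, False, True],          # False = e.g. timed out
    "mdr_kbit_s": [2400.0, 1800.0, 120.0, 2100.0],
    "st_s": [4.2, 5.1, None, 4.8],                 # no usable session time for the failed TA
})

# MDR: averaged over ALL valid TA, including unsuccessful ones, so that
# timed-out transfers do not bias the mean towards higher values.
mdr = results["mdr_kbit_s"].mean()

# ST: averaged over successful TA only (avoiding clipping artefacts);
# the success rate is reported alongside it.
st = results.loc[results["success"], "st_s"].mean()
success_rate = results["success"].mean()

print(f"MDR = {mdr:.0f} kbit/s, ST = {st:.1f} s, success rate = {success_rate:.0%}")

Keeping the timed-out transaction in the MDR average pulls the mean down, whereas excluding it would report only the better-performing transfers.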
When setting up a scenario for network testing, it also needs to be considered where the respective content is hosted. The effort to be taken is, again, a matter of the scope and purpose of the measurement. If the network KPI shall only have an indicative or secondary character, a simple approach, i.e., hosting all content on the same server (located in one of the participating countries, or elsewhere), can be taken. If precision measurements are intended, multiple server locations with high supported bandwidth may be required, possibly accompanied by calibration and validation testing.