5. CONCLUSION

After a thorough analysis of the feasibility of applying a full NLP scheme to encrypted traffic classification, we point out that the byte data of raw traffic packets can be transformed into character strings through proper tokenization. Based on this, we propose a new method named PERT to encode the encrypted traffic data and to serve as an automatic traffic feature extractor. In addition, we discuss the pre-training strategy of dynamic word embedding under the condition of flow-level encrypted traffic classification. According to a series of experiments on the public ISCX data set and Android HTTPS traffic, our proposed classification framework provides significantly better results than current DL-based methods and traditional ML-based methods.
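To make the tokenization step concrete, the sketch below shows one way the raw bytes of a packet payload could be rendered as character-string tokens for an NLP-style model. The hex encoding, the two-byte grouping, and the helper name bytes_to_tokens are illustrative assumptions for this sketch and not the exact tokenizer used by PERT.

# Illustrative sketch only: one possible byte-to-token conversion for
# NLP-style traffic models. The two-byte (bigram) grouping is an assumption,
# not a confirmed detail of PERT's tokenizer.

def bytes_to_tokens(payload: bytes, group_bytes: int = 2) -> list[str]:
    """Hex-encode the payload and split it into groups of `group_bytes` bytes,
    each rendered as a short character string (2 * group_bytes hex chars)."""
    hex_string = payload.hex()       # e.g. b'\x16\x03\x01' -> '160301'
    step = group_bytes * 2           # two hex characters per byte
    return [hex_string[i:i + step] for i in range(0, len(hex_string), step)]

if __name__ == "__main__":
    sample = bytes.fromhex("1603010200010001fc0303")  # leading bytes of a TLS record
    print(bytes_to_tokens(sample))
    # ['1603', '0102', '0001', '0001', 'fc03', '03']

The resulting token sequence can then be fed to a standard word-embedding or pre-training pipeline in place of natural-language text, which is the premise of the proposed framework.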