LENG Y L,ZOU X Y. Network abnormal traffic detection model based on CNN-BiBASRU-AT[J]. Microelectronics & Computer,2024,41(1):93-99. doi: 10.19304/J.ISSN1000-7180.2022.0853


Network abnormal traffic detection model based on CNN-BiBASRU-AT


Abstract: To address the low accuracy of current network abnormal traffic identification, the limited feature extraction ability of basic deep learning models, and the low training efficiency of recurrent neural networks, a network abnormal traffic detection model based on a Convolutional Neural Network (CNN), a Bidirectional Built-in Attention Simple Recurrent Unit (BiBASRU) and soft attention (AT) is proposed. A deep one-dimensional convolution module extracts local feature representations of the traffic, reducing the dimensionality of the high-dimensional traffic features while learning salient classification features, which strengthens the model's feature representation ability. At the same time, a Built-in self-Attention Simple Recurrent Unit (BASRU) is constructed to capture both long-range temporal information in the traffic and the interdependencies among its internal features, further mining the high-dimensional structural information within the traffic features. A soft attention mechanism then identifies the features that most influence the classification result and assigns them higher weights, preventing irrelevant information from interfering with the classification. Finally, a linear layer outputs the class probability distribution, which is normalized by the Softmax function, and the label with the maximum probability is taken as the traffic identification result. Experimental results on UNSW-NB15, a public benchmark dataset for multi-class network abnormal traffic, show that the model achieves an F1 score of 92.81%, higher than the other advanced deep learning models used for comparison, and that the feature capture ability and training efficiency of the BASRU are superior to those of traditional recurrent neural networks, demonstrating the feasibility and effectiveness of the model.
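As a rough illustration of the pipeline described in the abstract (a deep 1-D convolution module for local feature extraction and dimensionality reduction, a bidirectional recurrent layer for long-range dependencies, soft attention for weighting salient positions, and a linear classifier with Softmax), a minimal PyTorch sketch follows. It is not the authors' implementation: a standard bidirectional GRU stands in for the BiBASRU unit, which is not available in common libraries, and the feature count, class count and layer sizes are assumptions chosen to roughly match UNSW-NB15's 42-dimensional flow records and 10 traffic classes.

# Minimal PyTorch sketch of the CNN + bidirectional recurrent + soft-attention
# pipeline described in the abstract. Illustrative approximation only: a standard
# bidirectional GRU replaces the BiBASRU unit, and all sizes are assumptions.
import torch
import torch.nn as nn


class CNNBiRNNAT(nn.Module):
    def __init__(self, num_features=42, num_classes=10, hidden=64):
        super().__init__()
        # Deep 1-D convolution module: extracts local patterns from the flow
        # feature vector and halves its length via pooling.
        self.conv = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bidirectional recurrent layer (stand-in for BiBASRU): captures
        # long-range dependencies along the convolved feature sequence.
        self.rnn = nn.GRU(64, hidden, batch_first=True, bidirectional=True)
        # Soft attention: scores each position and pools the sequence into one
        # vector, so salient positions receive higher weight.
        self.attn = nn.Linear(2 * hidden, 1)
        # Linear classifier producing per-class logits (Softmax is applied by
        # the loss function or at inference time).
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                       # x: (batch, num_features)
        x = x.unsqueeze(1)                      # (batch, 1, num_features)
        x = self.conv(x)                        # (batch, 64, num_features // 2)
        x = x.transpose(1, 2)                   # (batch, seq, 64)
        h, _ = self.rnn(x)                      # (batch, seq, 2 * hidden)
        w = torch.softmax(self.attn(h), dim=1)  # (batch, seq, 1) attention weights
        ctx = (w * h).sum(dim=1)                # (batch, 2 * hidden) weighted pooling
        return self.fc(ctx)                     # (batch, num_classes) logits


if __name__ == "__main__":
    model = CNNBiRNNAT()
    flows = torch.randn(8, 42)                  # dummy batch of 8 flow records
    print(model(flows).shape)                   # torch.Size([8, 10])

In this sketch the attention-pooled representation is fed directly to the linear layer; the predicted label is obtained by taking the arg-max of the Softmax-normalized logits, as described in the abstract.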

     
