Netinfo Security ›› 2024, Vol. 24 ›› Issue (3): 427-437. doi: 10.3969/j.issn.1671-1122.2024.03.008


SHDoS Attack Detection Research Based on Attention-GRU

JIANG Kui1, LU Lufan2, SU Yaoyang2, NIE Wei2

  1. Information Center, Shenzhen University, Shenzhen 518060, China
    2. College of Electronics and Information Engineering, Shenzhen University, Shenzhen 518060, China
  • Received: 2023-09-17 Online: 2024-03-10 Published: 2024-04-03
  • Contact: JIANG Kui E-mail: jiangkui@szu.edu.cn

Abstract:

To address the failure of threshold-based detection schemes against SHDoS attacks that vary their request frequency, a deep learning model based on attention-GRU was proposed. The model first applied an improved Borderline-SMOTE to balance the data, then introduced a self-attention mechanism to build a two-layer GRU classification network, trained it on the preprocessed data, and finally used it to detect SHDoS attack traffic. Experiments on the CICIDS2018 dataset and a self-built SHDoS dataset show that the model achieves accuracy of 98.73% and 97.64% and recall of 96.57% and 96.27%, respectively. The model with the self-attention mechanism shows a significant improvement over the same model without it, and it also outperforms models that use SMOTE or the original Borderline-SMOTE for data preprocessing.
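The following is a minimal sketch, not the authors' code, of the pipeline the abstract describes: Borderline-SMOTE balancing followed by a two-layer GRU classifier with self-attention over the GRU hidden states. The feature dimensions, sequence length, attention heads, and hyperparameters are illustrative assumptions, and the standard Borderline-SMOTE from imbalanced-learn stands in for the paper's improved variant.

```python
# Sketch of the attention-GRU detection pipeline described in the abstract.
# Assumptions: per-flow features reshaped into short sequences; standard
# BorderlineSMOTE approximates the paper's improved oversampling step.
import numpy as np
import torch
import torch.nn as nn
from imblearn.over_sampling import BorderlineSMOTE


class AttentionGRU(nn.Module):
    def __init__(self, n_features, hidden=64, n_classes=2):
        super().__init__()
        # Two stacked GRU layers over the flow-feature sequence
        self.gru = nn.GRU(n_features, hidden, num_layers=2, batch_first=True)
        # Self-attention (queries = keys = values = GRU outputs)
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):              # x: (batch, seq_len, n_features)
        h, _ = self.gru(x)             # (batch, seq_len, hidden)
        a, _ = self.attn(h, h, h)      # attended hidden states
        return self.fc(a.mean(dim=1))  # pool over time, then classify


# Toy usage with random data standing in for preprocessed traffic features.
seq_len, n_features = 10, 16
X = np.random.rand(500, seq_len * n_features).astype(np.float32)
y = np.array([0] * 450 + [1] * 50)     # imbalanced benign/attack labels
X_bal, y_bal = BorderlineSMOTE(random_state=0).fit_resample(X, y)

model = AttentionGRU(n_features)
inputs = torch.from_numpy(X_bal.astype(np.float32)).view(-1, seq_len, n_features)
logits = model(inputs)
print(logits.shape)                    # (n_balanced_samples, 2)
```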

Key words: SHDoS attack, Borderline-SMOTE oversampling algorithm, self-attention mechanism, gated recurrent unit
