Netinfo Security ›› 2014, Vol. 14 ›› Issue (10): 38-43. doi: 10.3969/j.issn.1671-1122.2014.10.007

• Technical Research •

Research and Implementation of Web Vulnerability Detection Technology Based on Rule Base and Web Crawler

DU Lei, XIN Yang

  1. Information Security Center, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Received: 2014-08-26  Online: 2014-10-01  Published: 2015-08-17
  • About the authors:

    DU Lei (1988-), male, from Shandong, M.S. candidate; research interests: computer networks and network security. XIN Yang (1977-), male, from Shandong, associate professor, Ph.D.; research interests: mobile communication security and computer security.

  • Supported by:
    the National Natural Science Foundation of China [61121061, 61161140320] and the Fundamental Research Funds for the Central Universities [2012RC0215, 2012RC0216]

Abstract:

Web applications provide services over the HTTP or HTTPS protocol and have become one of the mainstream forms of software development, but the security vulnerabilities they contain, such as SQL injection and XSS, are increasingly being exposed and cause significant economic losses. To address Web site security, this paper studies common Web vulnerabilities such as SQL injection and XSS and proposes a new detection method: a technique that detects SQL injection and XSS with a Web crawler driven by a vulnerability rule base. The crawler traverses the site over HTTP, following URL links to collect page information. For each collected link, the rules in the rule base are read one by one and used to construct URLs capable of revealing vulnerabilities, and GET and POST requests are issued automatically against the resulting URLs; this step repeats until every rule in the rule base has been read and applied. The crawler and regular expressions then continue to extract page information, and the whole process is repeated, thereby detecting SQL injection and XSS vulnerabilities. The method enriches the means of Web vulnerability detection, increases the number of pages examined, and covers both HTTP GET and HTTP POST requests. Experiments verify the feasibility of using this technique for Web site security testing: it can accurately determine whether a site contains SQL injection or XSS vulnerabilities.
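The detection loop described above can be summarized in a short sketch. The following is a minimal illustration of the technique the abstract describes, not the authors' implementation: the rule format, the error-signature regexes, the link-extraction pattern, and the seed URL are all assumptions made for this example, and a production scanner would also parse HTML forms to build realistic POST requests.

# Minimal sketch of the crawler-plus-rule-base approach from the abstract.
# All rule contents, signatures, and URLs below are hypothetical.
import re
import urllib.parse
import urllib.request

# Hypothetical rule base: each rule pairs a payload with a regex whose match
# in the response is taken as evidence of the vulnerability.
RULES = [
    # SQL injection: a stray quote often surfaces a database error message.
    {"type": "sqli", "payload": "'",
     "signature": re.compile(r"SQL syntax|ODBC|mysql_fetch", re.I)},
    # Reflected XSS: the probe string is echoed back unencoded.
    {"type": "xss", "payload": "<script>alert(1)</script>",
     "signature": re.compile(re.escape("<script>alert(1)</script>"))},
]

# Regex-based link extraction, as in the paper; absolute links only for brevity.
LINK_RE = re.compile(r'href="(http[^"]+)"', re.I)

def fetch(url, data=None):
    """Issue a GET (data=None) or POST request and return the response body."""
    try:
        req = urllib.request.Request(url, data=data.encode() if data else None)
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.read().decode(errors="replace")
    except Exception:
        return ""

def test_url(url):
    """Inject each rule's payload into every query parameter, via GET and POST."""
    parts = urllib.parse.urlsplit(url)
    params = urllib.parse.parse_qsl(parts.query)
    findings = []
    for rule in RULES:
        for i, (name, value) in enumerate(params):
            tampered = params[:i] + [(name, value + rule["payload"])] + params[i + 1:]
            query = urllib.parse.urlencode(tampered)
            get_url = urllib.parse.urlunsplit(parts._replace(query=query))
            post_url = urllib.parse.urlunsplit(parts._replace(query=""))
            for body in (fetch(get_url), fetch(post_url, data=query)):
                if rule["signature"].search(body):
                    findings.append((rule["type"], url, name))
    return findings

def crawl(seed, max_pages=50):
    """Breadth-first crawl; test every discovered link against the rule base."""
    seen, frontier, findings = set(), [seed], []
    while frontier and len(seen) < max_pages:
        url = frontier.pop(0)
        if url in seen:
            continue
        seen.add(url)
        page = fetch(url)
        findings += test_url(url)
        frontier += [u for u in LINK_RE.findall(page) if u not in seen]
    return findings

if __name__ == "__main__":
    for vuln, url, param in crawl("http://testsite.example/"):
        print(f"possible {vuln} in parameter '{param}' at {url}")

Keeping the crawler and the rule base separate, as the paper proposes, means new vulnerability classes can be added by appending payload/signature pairs without touching the traversal logic.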

Key words: Web crawler, SQL injection, XSS vulnerability, rule base

CLC number: