Yet another great Chinese "invention": ignoring robots.txt to monitor forums

Ignoring robots.txt and disguising a crawler as a browser in order to scrape sites and monitor what people say has long been an open secret in China. Still, I was fairly stunned that a bona fide Chinese Ph.D. from U of Arizona went and "invented" this by writing it up as a paper.
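To be clear about what the "invention" amounts to: the crawler simply never consults robots.txt and sends browser-looking headers so the forum cannot tell it from an ordinary visitor. Here is a minimal sketch in Python using the requests library; the target URL and the User-Agent string are placeholders, not anything taken from the paper.

```python
# Minimal sketch of the technique in question: never read robots.txt,
# and send browser-like headers. URL and headers are illustrative only.
import requests

BROWSER_HEADERS = {
    # Any current desktop-browser User-Agent string will do.
    "User-Agent": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) "
                   "Chrome/120.0 Safari/537.36"),
    "Accept-Language": "en-US,en;q=0.9",
}

def fetch(url: str) -> str:
    """Fetch a page as if a human browser had requested it.

    Note what is missing: there is no call to urllib.robotparser and no
    check of robots.txt at all -- the crawler just takes the page.
    """
    resp = requests.get(url, headers=BROWSER_HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    print(fetch("https://forum.example.com/board/1")[:200])
```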

Paper link
http://dx.doi.org/10.1002/asi.21323

Paper overview (poster)
https://docs.google.com/viewer?url=http://ai.arizona.edu/research/terror/forum_poster.pdf

Paper abstract

The unprecedented growth of the Internet has given rise to the Dark Web, the problematic facet of the Web associated with cybercrime, hate, and extremism. Despite the need for tools to collect and analyze Dark Web forums, the covert nature of this part of the Internet makes traditional Web crawling techniques insufficient for capturing such content. In this study, we propose a novel crawling system designed to collect Dark Web forum content. The system uses a human-assisted accessibility approach to gain access to Dark Web forums. Several URL ordering features and techniques enable efficient extraction of forum postings. The system also includes an incremental crawler coupled with a recall-improvement mechanism intended to facilitate enhanced retrieval and updating of collected content. Experiments conducted to evaluate the effectiveness of the human-assisted accessibility approach and the recall-improvement-based, incremental-update procedure yielded favorable results. The human-assisted approach significantly improved access to Dark Web forums while the incremental crawler with recall improvement also outperformed standard periodic- and incremental-update approaches. Using the system, we were able to collect over 100 Dark Web forums from three regions. A case study encompassing link and content analysis of collected forums was used to illustrate the value and importance of gathering and analyzing content from such online communities.

robots.txt and user registration were only ever meant to keep honest people honest; they do nothing against bad actors. For legitimate academic research to abuse them like this, I have only one word: shameless. Against rogue crawlers from rogue institutions like this, the only real defense is an identification system in the spirit of nmap, one based on behavior rather than on signatures.
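To make the nmap comparison concrete: instead of trusting whatever User-Agent a client claims, you score each client on how it actually behaves. Below is a rough sketch, assuming you already have access-log entries grouped per IP as (timestamp, path) pairs; every threshold is an illustrative guess, not a tuned value.

```python
# Rough sketch of behaviour-based (rather than User-Agent/signature-based)
# crawler detection. Thresholds and the log format are assumptions.
import statistics

MIN_REQUESTS = 20          # need enough samples per client to judge
MAX_MEAN_INTERVAL = 2.0    # seconds; humans rarely sustain this pace
MAX_INTERVAL_STDEV = 0.5   # near-constant spacing suggests automation
MAX_DISTINCT_PATHS = 200   # breadth of a crawl, not a reading session

def classify(requests_by_ip):
    """requests_by_ip: {ip: [(timestamp_seconds, path), ...]} sorted by time.

    Returns the set of IPs whose request *behaviour* looks automated,
    regardless of what User-Agent they claim to be.
    """
    bots = set()
    for ip, reqs in requests_by_ip.items():
        if len(reqs) < MIN_REQUESTS:
            continue
        times = [t for t, _ in reqs]
        paths = {p for _, p in reqs}
        gaps = [b - a for a, b in zip(times, times[1:])]
        if (statistics.mean(gaps) < MAX_MEAN_INTERVAL
                or statistics.pstdev(gaps) < MAX_INTERVAL_STDEV
                or len(paths) > MAX_DISTINCT_PATHS):
            bots.add(ip)
    return bots
```

The point of this kind of check is that spoofed headers buy the crawler nothing: machine-regular timing and crawl-like breadth give it away anyway.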

From New Scientist via Reddit
