Description

This plugin searches for the robots.txt file and parses it. This file is used as an ACL that defines which URLs a search engine may access. By parsing this file, you can get more information about the target web application.
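To illustrate the idea, here is a minimal sketch (not the plugin's actual code) of how a crawler might extract the paths listed in a robots.txt body. The `parse_robots_txt` helper name and the sample body are assumptions for the example:

```python
def parse_robots_txt(body):
    """Return the URL paths listed in Allow/Disallow directives."""
    paths = []
    for line in body.splitlines():
        line = line.split('#', 1)[0].strip()  # drop comments and whitespace
        if ':' not in line:
            continue
        directive, _, value = line.partition(':')
        if directive.strip().lower() in ('allow', 'disallow'):
            value = value.strip()
            if value:
                paths.append(value)
    return paths

# Hypothetical robots.txt content for demonstration
example = """\
User-agent: *
Disallow: /admin/
Disallow: /backup/
Allow: /public/
"""
print(parse_robots_txt(example))  # ['/admin/', '/backup/', '/public/']
```

Each extracted path can then be turned into a full URL and queued for crawling; paths a site asks robots not to visit are often the most interesting ones to a security scanner.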

Plugin type

Crawl

Options

This plugin doesn’t have any user-configurable options.

Source

For more information about this plugin and its associated tests, the source code shows exactly what’s under the hood:
Plugin source code
Unittest source code

Dependencies

This plugin has no dependencies.