Automater is a URL/Domain, IP Address, and MD5 Hash OSINT tool aimed at making the analysis process easier for intrusion analysts. Given a target (URL, IP, or hash) or a file full of targets, Automater will return relevant results from sources like the following: IPvoid.com, Robtex.com, Fortiguard.com, unshorten.me, Urlvoid.com, Labs.alienvault.com, ThreatExpert, VxVault, and VirusTotal.
By default, if Automater does not find data available, it will not submit the target to that site to get data. If you would like Automater to use an HTTP POST to send target data to a source like IPVoid or URLVoid, use the --p option.
There are also new output methods: -o will output to a file in the same format that is printed to screen, -c will output a CSV file, and -w will output an HTML file.
It does take Automater a little longer to run than it used to. That is because a delay of 2 seconds between requests was implemented to ensure sources don't get overloaded. You can modify this delay with the -d option.
usage: Automater.py [-h] [-o OUTPUT] [-w WEB] [-c CSV] [-d DELAY] [-s SOURCE] [--p] target

IP, URL, and Hash Passive Analysis tool

positional arguments:
  target                List one IP Address, URL, or Hash to query, or pass
                        the filename of a file containing IP Addresses, URLs,
                        or Hashes to query, each separated by a newline.

optional arguments:
  -h, --help            show this help message and exit
  -o OUTPUT, --output OUTPUT
                        This option will output the results to a file.
  -w WEB, --web WEB     This option will output the results to an HTML file.
  -c CSV, --csv CSV     This option will output the results to a CSV file.
  -d DELAY, --delay DELAY
                        This will change the delay to the inputted seconds.
                        Default is 2.
  -s SOURCE, --source SOURCE
                        This option will only run the target against a
                        specific source engine to pull associated domains.
                        Options are defined in the name attribute of the site
                        element in the XML configuration file.
  --p                   This option tells the program to post information to
                        sites that allow posting. By default the program will
                        NOT post to sites that require a post.
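To make the options above concrete, here are a few example invocations. This is only a sketch: the target values, filenames, and the source name passed to -s are placeholders, and the source name must match a name attribute defined in your sites.xml.

```shell
# Query a single IP against all sources (default 2-second delay between requests)
python Automater.py 192.0.2.10

# Query a list of targets from a file, writing results to a CSV and an HTML report
python Automater.py targets.txt -c results.csv -w results.html

# Slow the delay to 5 seconds and allow HTTP POST submissions to sources that accept them
python Automater.py 192.0.2.10 -d 5 --p

# Run against a single source engine only (hypothetical source name; check sites.xml)
python Automater.py example.com -s robtex
```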
Automater is now very easily extensible, even for those who are not familiar with Python. All the sources that are queried, and what they are queried for, are contained in sites.xml. This file must be in the same directory as Automater.py and all the other .py files that Automater ships with.
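As a rough sketch of what a source entry might look like: the site element and its name attribute come from the help text above, but the child element names here are assumptions for illustration only; check the sites.xml that ships with Automater for the real structure.

```xml
<!-- Hypothetical sites.xml entry; child element names are illustrative, not the real schema -->
<site name="robtex">
  <!-- URL queried for the target, and a pattern to pull results from the response -->
  <url>https://www.robtex.com/</url>
  <regex>...</regex>
</site>
```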
You can download Automater here:
Or read more here.