CHAPTER 7
Directory Hunting

Inevitably, on your quest to find information about a website, you will need to look for hidden (i.e., forgotten) treasures. The best way to do this is to look for random directories. If you have not tried this yet, you will be amazed at how much information people simply forget about: webshells, phpMyAdmin pages, folders with directory browsing and/or full read/write permissions, private files, and much more. There are several ways to accomplish this. One is bruteforcing the target to find working directories; the other is analyzing crawl data. This chapter will look at both.
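The bruteforcing approach can be sketched in a few lines: probe candidate paths from a wordlist and keep any that do not come back as 404 (a 403 or 401 still confirms the path exists). The function name and wordlist below are illustrative, not part of any particular tool.

```python
# Minimal directory brute-forcer sketch (illustrative; not Dirhunt).
import urllib.request
import urllib.error

def find_directories(base_url, wordlist):
    """Return (url, status) pairs for wordlist paths that do not answer 404."""
    found = []
    for word in wordlist:
        url = f"{base_url.rstrip('/')}/{word}/"
        req = urllib.request.Request(url, method="HEAD")  # HEAD: no body needed
        try:
            with urllib.request.urlopen(req, timeout=5) as resp:
                found.append((url, resp.status))
        except urllib.error.HTTPError as err:
            if err.code != 404:  # 401/403 etc. still reveal the path exists
                found.append((url, err.code))
        except urllib.error.URLError:
            pass  # host unreachable or timed out; skip this candidate
    return found
```

In practice you would feed this a large wordlist (such as those shipped with common discovery tools) and add concurrency, but the core loop is just this: request, classify the status code, record the hit.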

Dirhunt

Dirhunt is a type of web crawler designed to search and analyze directories and folders within a web application. Dirhunt is not really a scraper, and it does not use bruteforce to find folders. Instead, Dirhunt checks a number of sources to find interesting files or folders, including Google and VirusTotal. Dirhunt is also designed to detect false 404 errors to minimize the number of false positives in your results.

The parameters of Dirhunt include:

 -t, --threads INTEGER              Number of threads to use
 -x, --exclude-flags TEXT           Exclude results with these flags
 -i, --include-flags TEXT           Only include results with these flags
 -e, --interesting-extensions TEXT  Look for files with these extensions
 -f, --interesting-files TEXT       Treat files with these names as interesting
 --stdout-flags TEXT ...
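
Combining the options above, a typical run might look like the following; the target URL and the flag values are placeholders, so substitute your own.

```shell
# Illustrative Dirhunt invocation: crawl with 10 threads, treat .bak/.sql/.zip
# files as interesting, watch for files named "backup", and exclude results
# flagged as not found (example.com is a placeholder target).
dirhunt https://example.com -t 10 -e bak,sql,zip -f backup -x not_found
```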
