Bug Hunter's Methodology v4 Notes
Youtube Link: https://www.youtube.com/watch?v=p4JgIu1mceI
1. Project Tracking
Xmind - for creating mind maps that structure the information better.
With XMind we can create awesome mind maps to chart how a root domain branches out into subdomains, ASNs, and maybe some JSON API connections. We can basically use it as a map for all the data we will harvest using the recon techniques below.
Note: we need to check these items first to see whether they are in scope for the project we are working on.
2. Acquisitions
When starting recon, it is best to begin by understanding the company and what it does. Gathering information on acquisitions is also very important.
Crunchbase [https://www.crunchbase.com]
This is a good place to start because we can get various information about companies, but more importantly information on acquisitions. Of course this information can also be gathered by googling, Wikipedia, etc.
This matters because, from this angle, we can recon old subdomains that may or may not still exist, or that are spread somewhere within the new subdomain tree of the target.
Example
Twitch has a wide-scope domain on HackerOne; we could try enumerating domains of Curse, a relatively recent acquisition, as those can easily fall within the wildcard scope they offer.
3. ASN (Autonomous System Numbers) Enumeration
Definition: An Autonomous System (AS) is a group of one or more IP prefixes (lists of IP addresses accessible on a network) run by one or more network operators that maintain a single, clearly-defined routing policy. Network operators need Autonomous System Numbers (ASNs) to control routing within their networks and to exchange routing information with other Internet Service Providers (ISPs).
Bigger companies and corporations are assigned ASNs globally, which are unique identifiers for larger infrastructures; a company can have one to many ASNs depending on the scale of its operations and infrastructure.
3.1 Web Tool - Hurricane Electric Services
A good tool for ASN enumeration is Hurricane Electric's BGP toolkit.
Link: https://bgp.he.net
Searches should be done by the name of the organization instead of a domain. Keep in mind that this is a starting place for ASNs, registrar information, and top-level IPv4 and IPv6 ranges.
ASN enumeration can also be done from the terminal, for automation purposes and, of course, for exporting data on the fly.
3.2 CMD Line Tools
Metabigor fetches and combines data from bgp.he.net and ASN Lookup to scrape more information about ASNs and IPs.
ASN Lookup does more or less the same thing as Hurricane Electric's service, only it uses a different dataset, based on maxmind.com.
NOTE - this happened to Jason Haddix when trying to automate with these tools:
These command line tools have a downside, however: because of the way they scrape data, we can accidentally pick up data on other companies with similar names. If, for example, we are targeting Tesla, we might get ASNs or IPs from organizations unrelated to the target company.
If you start manually with the BGP site, it is easier to automate afterwards, because with filtering you can get more accurate data, eliminating dupes and getting the correct IP ranges per ASN.
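The filtering step above can be sketched in a few lines. Everything here is illustrative: the sample rows and the `filter_asn_rows` helper are made up for this note, not part of any of the tools mentioned.

```python
def filter_asn_rows(rows, keyword):
    """Keep rows whose org name contains the target keyword (case-insensitive),
    dropping duplicate prefixes. Rows are (asn, org, prefix) tuples."""
    seen = set()
    out = []
    for asn, org, prefix in rows:
        if keyword.lower() not in org.lower():
            continue  # e.g. drops similarly named but unrelated companies
        if prefix in seen:
            continue  # eliminate dupes scraped from multiple sources
        seen.add(prefix)
        out.append((asn, org, prefix))
    return out

# Made-up scraped data: a dupe plus an unrelated company with a similar name.
rows = [
    ("AS394161", "Tesla Motors, Inc.", "8.45.124.0/24"),
    ("AS394161", "Tesla Motors, Inc.", "8.45.124.0/24"),
    ("AS65001", "Tesla Exploration Ltd", "205.142.56.0/21"),
]
print(filter_asn_rows(rows, "Tesla Motors"))
# [('AS394161', 'Tesla Motors, Inc.', '8.45.124.0/24')]
```

Using a fuller keyword like "Tesla Motors" instead of just "Tesla" is what keeps the lookalike organizations out.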
3.3 Amass
To discover more seed domains based on the ASNs, we can use Amass with a port scanner to return any root domains found in SSL certificates and so on.
Example:
amass intel -asn 46489
4. Reverse WHOIS
4.1 Whoxy
This is a neat tool for finding more information about the organization.
NOTE: we need to be careful with the fidelity of the data: it gives some historical background of what happened in the past, but sometimes it returns only a restricted subset of data. [MEDIUM FIDELITY DATA]
Example of F-ups:
You can, for example, get domains that are no longer related to the business, or were only related in the past, and find yourself somewhere that is clearly out of scope.
5.1 DOMLink
This tool recursively queries everything found through the Whoxy WHOIS API. It starts by querying the target's records, analyzes the data, then looks for other records, and so on.
It targets domains, including the emails in the records, which is pretty neat.
6. Ad/Analytics Relationships
1. BuiltWith
Link: https://builtwith.com
Aside from the fact that this tool offers information on the technologies a web app is built with, there is a tab called "Relationship Profile" where we can recon more data about possible domains and subdomains related to the target organization by checking the Google Ads IDs or other ad-tracker IDs they might share.
https://builtwith.com/relationships/name-of-the-company-recon
Builtwith also has a chrome/FF extension similar to Wappalyzer
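The relationship-profile idea boils down to: domains that share the same ads/analytics tracker ID probably belong to the same organization. A minimal sketch of that inversion (the tracker IDs and the `group_by_tracker` helper are invented for illustration):

```python
from collections import defaultdict

def group_by_tracker(domain_trackers):
    """Invert a domain -> tracker-IDs mapping so each tracker ID lists the
    domains using it; IDs shared by several domains hint at common ownership."""
    groups = defaultdict(set)
    for domain, trackers in domain_trackers.items():
        for tracker in trackers:
            groups[tracker].add(domain)
    # Only tracker IDs seen on more than one domain are interesting.
    return {t: sorted(d) for t, d in groups.items() if len(d) > 1}

# Made-up observations; real data would come from page source / BuiltWith.
observed = {
    "twitch.tv": {"UA-11111-1", "GTM-ABCDE"},
    "curse.com": {"UA-11111-1"},        # shares an analytics ID with twitch.tv
    "unrelated.com": {"UA-99999-9"},
}
print(group_by_tracker(observed))
# {'UA-11111-1': ['curse.com', 'twitch.tv']}
```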
2. Google-Fu
Google:
Copyright text - copy it from the main website
Terms of Service
Privacy policy
Scrolling down and looking through the pages we can get access to some more seed domains.
7. Shodan
Shodan is a tool that continuously spiders internet infrastructure. It captures response data, cert data, and stack-profiling data. It requires registration, but an API key usually goes cheap if you catch a sale.
Link: https://shodan.io
1. Linked & JS Discovery
Burp Suite Pro
Turn off passive scanning
Set forms auto to submit (if you are feeling frisky)
Set scope to advanced control and use "keyword" of target name (not a normal FQDN - like site address)
Browse the main site, and while doing that Burp's proxy will spider all the hosts recursively.
Check the Bug Hunter's Methodology video for more in-depth details.
GoSpider
This is a good, well-rounded spider that also filters the type of results it gets, parses domains/subdomains/URLs/JavaScript, and, which is awesome, can export data to text files.
Note: The problem with GoSpider is that it doesn't do this recursively; in that case we have to code a script that adds recursion on our own.
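A recursion wrapper of the kind mentioned in the note could look like this sketch; `spider_once` is a stand-in for a single one-shot spider run (e.g. shelling out to GoSpider for one host), not a real API:

```python
def crawl_recursive(seed, spider_once, max_depth=2):
    """BFS wrapper: re-feed every newly discovered host back into the
    one-shot spider until max_depth rounds are done. spider_once(host)
    must return an iterable of hosts found on that page."""
    seen = {seed}
    frontier = [seed]
    for _ in range(max_depth):
        next_frontier = []
        for host in frontier:
            for found in spider_once(host):
                if found not in seen:
                    seen.add(found)
                    next_frontier.append(found)
        frontier = next_frontier
    return seen

# Usage with a stubbed spider (a real run would invoke the actual tool):
fake_links = {"a.example.com": ["b.example.com"], "b.example.com": ["c.example.com"]}
print(crawl_recursive("a.example.com", lambda h: fake_links.get(h, [])))
```

Bounding the depth matters: without it, a wide scope can keep the crawl running indefinitely.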
hakrawler
This does almost the same thing as GoSpider, and although hakrawler has recursion coded in, the problem some people pointed out is that it can be buggy at times, so part of the data might be left out or just be less accurate.
2. Subdomain Enumeration
1. Subdomainizer
This tool takes a page and, when run, does the following:
Find subdomains referenced in JS files
Find cloud services referenced in JS files
Also uses the Shannon Entropy (TODO - try to expand and understand that a bit better) to find potentially sensitive items in JS files
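For reference, Shannon entropy is H = -Σ pᵢ · log₂(pᵢ), where pᵢ is the frequency of each character in the string; random-looking strings (keys, tokens) score high in bits per character, while repetitive text scores low. A tiny sketch (the sample token below is made up):

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Bits of entropy per character: H = -sum(p_i * log2(p_i))."""
    counts = Counter(s)
    n = len(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))              # 0.0 - fully repetitive
print(shannon_entropy("AIzaSyD9kz4W8xQ2vLmN1"))  # high, key-like string
```

Tools flag strings in JS files whose entropy exceeds some threshold as potentially sensitive (API keys, tokens).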
Note: If we're just interested in getting only subdomain data for specific types of bug bounty, we might want to look at subscraper by Cillian-Collins, because it has a better recursion method coded in.
DOME - not from bug bounty methodology
Link: https://github.com/v4d1/Dome
Dome is a good subdomain enumeration tool that deserves a mention here. It doesn't have recursive enumeration yet, but it uses a tremendous number of OSINT sources to crawl subdomains for a target. Very useful for wide or wildcard scopes in bug bounty. Worth checking out.
8. Subdomain Scraping
Subdomain scraping can be done through open source intelligence means such as:
Infrastructure sources:
Censys
Robtex!
WaybackMachine
DNSDB Search
dnsdumpster.com
ptrarchive.com
Search sources:
Yahoo!
Google
Baidu
bing
Ask
dogpile
duckduckgo
Certificate sources:
crt.sh
CertDB
sslmate certspotter
Security sources:
Hacker Target
Security Trails
virustotal
F-Secure Riddler
ThreatCrowd
ThreatMiner
Subdomain Scraping Example - Google
- Google: site:twitch.tv -www.twitch.tv
- Google: site:twitch.tv -www.twitch.tv -watch.twitch.tv
- Google: site:twitch.tv -www.twitch.tv -watch.twitch.tv -dev.twitch.tv
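Building those progressively longer queries can be scripted as you discover subdomains; `subdomain_dork` is a hypothetical helper written for this note, not part of any tool above:

```python
def subdomain_dork(domain, known_subdomains):
    """Build the iterative Google query: site:domain minus every subdomain
    already discovered, so new ones surface on the first results page."""
    parts = [f"site:{domain}"] + [f"-{sub}.{domain}" for sub in known_subdomains]
    return " ".join(parts)

print(subdomain_dork("twitch.tv", ["www", "watch", "dev"]))
# site:twitch.tv -www.twitch.tv -watch.twitch.tv -dev.twitch.tv
```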
Amass
Amass has the most sources, and you can also see which sources it uses. It has different modes for doing recon, as you can integrate different types of scanning, including even bruteforce, and so on and so forth.
The returned data also pairs the subdomains with the ASNs they came out of. That helps a lot when branching our XMind mind map, and it gives a better overview, e.g. whether something is a cloud service or on-premises.
Subfinder
Subfinder is now maintained by Project Discovery, an organization that is retooling some old tools and maintaining them. They will also launch their own analytics product for recon, similar to Burp in some ways, probably towards the end of 2023; for now the tools can be found open source on their website and GitHub.
Website: https://projectdiscovery.io
Github Enumeration - github-subdomains.py
This tool is awesome for GitHub enumeration: through the GitHub API it can basically crawl any subdomain data we can find linked to GitHub resources.
NOTE: It is more of a niche tool, and a good tip is to run 5-6 iterations of the script, with a 6-second sleep between the first 4 and a 10-second sleep between the last 2, to get consistency. This is because the tool is rate-limited and retrieves somewhat random results, probably due to the combination of the script and the GitHub API restrictions.
The advantage of GitHub enumeration is that you can get subdomains you might never find anywhere else, as these are subdomains more catered to the development team or to people who worked in the past for the bug bounty target we are hunting.
In addition, the tool incorporates multiple other options and scripts that offer dorks and ways to get e-mails. However, since the GitHub API is whacky, keep in mind the note above. :)
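The run-it-several-times-and-merge workaround from the note can be scripted; `run_once` here is a stand-in for one invocation of github-subdomains.py, and this simplified sketch uses a single sleep value instead of the 6s/10s split:

```python
import time

def aggregate_runs(run_once, iterations=6, sleep_seconds=0):
    """Union the results of several runs of a flaky, rate-limited tool.
    run_once() returns the collection of subdomains one invocation produced."""
    merged = set()
    for i in range(iterations):
        merged |= set(run_once())
        if i < iterations - 1:
            time.sleep(sleep_seconds)  # back off between API-heavy runs
    return sorted(merged)

# Stubbed runs returning different partial results each time:
results = iter([{"a.x.com"}, {"b.x.com"}, {"a.x.com", "c.x.com"}])
print(aggregate_runs(lambda: next(results), iterations=3))
# ['a.x.com', 'b.x.com', 'c.x.com']
```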
shosubgo
This is a terminal-based tool for gathering subdomains from Shodan; you need a Shodan API key for it to work. It's great to use along with the GitHub tool, as it gives you an edge in the subdomain enumeration game over other bug bounty hunters.
Cloud Ranges
Cloud Ranges is an enumeration approach that looks only at cloud-related infrastructure, checking SSL certs on port 443. It's a bit more of a hassle to set up, but it can provide some awesome cloud data we wouldn't find anywhere else. It should also be run weekly, as these resources can change weekly.
Note: You could basically wrap the tool in better ways using a specific template. There is also a service that runs non-stop weekly; you can query it from time to time to find new data on your targets.
9. Subdomain Bruteforce
Subdomain bruting is a technique for bruteforcing subdomains based on a wordlist we provide, to find subdomains that other people or sources haven't found yet.
Amass
Amass implements this inside its own tooling, using a built-in list, and has 8 DNS resolvers that assist with the process:
For default Amass bruting we can use the following command:
amass enum -brute -d twitch.tv -src
For custom Amass bruting, we can add supporting DNS resolvers and specific lists to help with the process; the command would be:
amass enum -brute -d twitch.tv -rf resolvers.txt -w bruteforce.list
shuffleDNS
This is another tool very similar to what Amass is doing through bruteforcing. Maintained by ProjectDiscovery team.
This is a wrapper based on massdns core (the original DNS bruteforcing method).
NOTE
Between Amass and shuffleDNS it is mostly a matter of preference, to be honest, based on Jhaddix's description, as they do a similar thing. More important than the tool of choice is which wordlists we are actually using.
Subdomain Bruting Lists
Jhaddix has a massive file called all.txt for subdomain bruteforcing: a collection of lists compiled over 7 years of DNS bruteforcing, scraped from all the tools that use different wordlists.
Link to all.txt: https://gist.github.com/jhaddix/86a06c5dc309d08580a018c66354a056
To create customized wordlists, see TomNomNom's talk "Who, What, Where, When": https://www.youtube.com/watch?v=W4_QCSIujQ4
Another source of lists, which takes data from other places like Reddit and stack-specific subdomains (Rails/NodeJS etc.):
Link: https://github.com/assetnote/commonspeak2
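Compiling a combined brute list in the all.txt spirit is essentially a normalized union of many source lists; a minimal sketch (illustrative only, not how all.txt was actually built):

```python
def merge_wordlists(*lists):
    """Combine several subdomain wordlists into one deduplicated,
    lowercase, sorted list, skipping blank entries."""
    merged = {w.strip().lower() for lst in lists for w in lst if w.strip()}
    return sorted(merged)

# In practice each argument would be a file read with open(...).read().splitlines().
print(merge_wordlists(["www", "Dev", "mail"], ["dev", "staging", ""]))
# ['dev', 'mail', 'staging', 'www']
```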
10. Alteration scanning
Alteration scanning, or permutation scanning, is basically a way to scan for related subdomain names.
For example, if we know there is a subdomain dev.company.com, the company might also have something like: dev1.company.com, dev2.company.com, dev-3.company.com, dev.2.company.com,
etc.
A tool to search for these is altdns, written by Naffy and Shubs.
Amass now includes this type of analysis.
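A minimal permutation generator in the altdns spirit (this is a sketch, not altdns itself; the default word list is arbitrary):

```python
def permute(label, domain, words=("1", "2", "dev", "staging")):
    """Generate altered candidates around a known subdomain label, e.g.
    dev.example.com -> dev1.example.com, dev-1.example.com, 1.dev.example.com."""
    out = set()
    for w in words:
        out.add(f"{label}{w}.{domain}")    # appended:  dev1.company.com
        out.add(f"{label}-{w}.{domain}")   # hyphenated: dev-1.company.com
        out.add(f"{w}.{label}.{domain}")   # prefixed:   1.dev.company.com
    return sorted(out)

print(permute("dev", "company.com", words=("1",)))
# ['1.dev.company.com', 'dev-1.company.com', 'dev1.company.com']
```

The candidates would then be resolved (e.g. with massdns) to see which actually exist.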
WAF Bypass
![[Pasted image 20231017194018.png]]
If the target uses a WAF, you can apply alteration scanning to other subdomains that might work, bypassing the filtering protections.
Check the examples above as well.
11. OTHER
This is stuff that goes into recon but might bleed into vulnerability analysis.
1. Port Analysis
1.1 masscan
For mass port analysis, masscan is a much faster scanner than nmap.
Example
masscan -p1-65535 -iL $inputIPfile --max-rate 1800 -oG $outputfile.log
$inputIPfile - a file listing all the IPs we want to scan
A good tutorial on how to use masscan can be found at: https://danielmiessler.com/p/masscan
1.2 dnmasscan
A wrapper around masscan that lets us scan domains instead of IPs.
It adds the ability to resolve DNS names and then passes the resulting IPs to masscan.
This tool looks only at ports 80 and 443.
1.3 brutespray - Service Scanning
After we scan for ports and aggregate all the data, we can use an nmap service scan's grepable output (-oG) as input to brutespray, to attack remote administration protocols such as telnet, RDP, SSH, FTP, etc.
We run brutespray with a small, specific wordlist of the most common user:password pairs, like admin:admin, root:root, blank:blank, etc., to check for unconfigured or misconfigured credentials.
NOTE: masscan => Nmap service scan -oG => Brutespray credential bruteforce (TODO: understand how the passing of the data goes)
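The hand-off in that chain works because brutespray parses nmap's grepable (-oG) output. A stripped-down parser shows what those lines contain; the sample line is made up but follows the -oG format:

```python
import re

def parse_grepable(lines):
    """Extract (host, port, service) triples from nmap -oG output lines.
    Each 'Ports:' entry looks like port/state/protocol//service///."""
    results = []
    for line in lines:
        m = re.match(r"Host: (\S+).*Ports: (.*)", line)
        if not m:
            continue  # skip comments and hosts with no port list
        host, ports = m.groups()
        for entry in ports.split(","):
            fields = entry.strip().split("/")
            if len(fields) >= 5 and fields[1] == "open":
                results.append((host, int(fields[0]), fields[4]))
    return results

sample = ["Host: 10.0.0.5 ()\tPorts: 22/open/tcp//ssh///, 3389/open/tcp//ms-wbt-server///"]
print(parse_grepable(sample))
# [('10.0.0.5', 22, 'ssh'), ('10.0.0.5', 3389, 'ms-wbt-server')]
```

Brutespray uses exactly this kind of service field to pick which protocol (ssh, ftp, rdp, ...) to bruteforce on each host:port.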
2. Github Dorking
Link: https://gist.github.com/jhaddix/2a08178b94e2fb37ca2bb47b25bcaed1
GitHub dorking can be useful because developers, past or present, at various companies might leave traces on GitHub of what they developed or are developing; these can be found through scripts or repositories related to our specific target. A good list of dorks to use for gathering more data is jhaddix's one, linked above.
3. Screenshotting
Once we have all the data about the possible domains we can target, we can prioritize the subdomains of interest for further vulnerability testing. With time and experience we may be able to do this manually; as beginners, however, we can partly automate the process, since eyeballing a webpage's functionality without experience might make us overlook things.
There are a bunch of tools that can assist us in this process:
Aquatone - a wider recon framework that also does this
HTTPScreenshot
Eyewitness - jhaddix likes this one because it can use both the HTTP and HTTPS protocols.
httprobe (TomNomNom) - a good way to sort subdomains by probing for HTTP or HTTPS, curate a list of live ones, and feed them to the screenshot framework of your flavor to do the screenshots faster.
Note: Play around with the tools and find the one to your liking.
4. Subdomain Takeover
You can check whether you can snatch subdomains of a specific target that point to some other, abandoned location.
Link: https://github.com/EdOverflow/can-i-take-over-xyz
SubOver
This is an automated scanner for subdomain takeover
nuclei
Link: https://projectdiscovery.io/nuclei
This is a larger vulnerability scanning framework that includes subdomain takeovers (jhaddix mentions it seems to be the most advanced one for subdomain takeover scanning). It might become the paid version of Project Discovery's offering.
12. Automation++
1. Extending tools (interlace)
When you start developing your own recon network, Interlace is the go-to tool for gluing the recon framework together: what Interlace does is basically translate and connect different types of outputs into inputs for other tools, so we can streamline the bug bounty process.
Advantages:
Adding multithreading to non-multithreaded tools
Translating tools for CIDR input and glob support
Link to interlace: https://github.com/codingo/Interlace
2. TomNomNom's Github
Whatever is in this GitHub is pretty useful for many things.
Frameworks
C-Tier:
Description: Automation built around scripting up other tools in bash or python. Step based, no workflow, few techniques. Little extensibility.
https://github.com/AdmiralGaust/bountyRecon
https://github.com/offhourscoding/recon
https://github.com/Sambal0x/Recon-tools
https://github.com/yogeshojha/rengine/
https://github.com/yourbuddy25/Hunter
https://github.com/venom26/recon/blob/master/ultimate_recon.sh
https://github.com/dwisiswant0
B-Tier
Description: automation writing a few of their own modules. Some GUI or advanced workflow. Medium techniques. Runs point-in time (only once). Flat files.
A-Tier
Description: Automation writing all their own modules, has GUI, Runs iteratively, manages data via a db.
https://github.com/nahamsec/lazyrecon
https://github.com/Findomain/Findomain
https://github.com/SilverPoision/Rock-ON
https://github.com/epi052/recon-pipeline
https://github.com/0xspade/Automated-Scanner
https://github.com/shmilylty/OneForAll/blob/master/docs/en-us/README.md
https://github.com/SolomonSklash/chomp-scan
https://github.com/screetsec/Sudomy
https://github.com/skynet0x01/tugarecon
S-Tier
Description: automation writing their own modules, has GUI, runs iteratively, manages data via db. Scales across multiple boxes. Sends alert to user. Uses novel techniques like ML + AI to optimize speed.
Intrigue - https://intrigue.io - was bought by Google
AssetNote - https://assetnote.io