# Google dorks are useful for finding new subdomains
site:wikipedia.org
site:*.wikipedia.org -www -store -jobs -uk
# Bing also supports dorks and can return different results
site:
# When you use the Google Dork: site:*.example.com, NEVER forget to check
site:*.*.example.com
site:*.*.*.example.com
# Github dorks also allow finding many subdomains
"teslamotors.com" password
# ...
Online DNS tools and services
# VirusTotal runs its own passive DNS replication service
# DNS Dumpster can also find a large number of subdomains
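# Both can be queried from their public web interfaces:
https://www.virustotal.com/
https://dnsdumpster.com/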
Certificate Transparency
# It's a project started by Google to log and audit the SSL/TLS certificates issued by every CA
# Interesting because you can search these certificate logs using several tools and services
https://crt.sh/
https://censys.io/
https://developers.facebook.com/tools/ct/
https://transparencyreport.google.com/https/certificates
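# crt.sh can also be queried over HTTP with JSON output; a quick sketch (%25 is the URL-encoded wildcard):
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[].name_value' | sort -u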
# crt.sh also provides public access to its CT data through a PostgreSQL interface
psql -h crt.sh -p 5432 -U guest certwatch
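# A sample query sketch (assuming the legacy certificate_identity view is still exposed; column names may differ on current crt.sh):
psql -h crt.sh -p 5432 -U guest certwatch -c "SELECT ci.NAME_VALUE FROM certificate_identity ci WHERE ci.NAME_TYPE = 'dNSName' AND reverse(lower(ci.NAME_VALUE)) LIKE reverse(lower('%.example.com'));"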
# Appsecco wrote some scripts to automate these queries
https://github.com/appsecco/the-art-of-subdomain-enumeration
https://github.com/appsecco/bugcrowd-levelup-subdomain-enumeration
# You can combine ct.py and massdns to find which subdomains are resolvable
./ct.py icann.org | ./bin/massdns -r resolvers.txt -t A -q -a -o -w icann_resolvable_domains.txt -
# Another tool is ct-exposer
ct-exposer.py -d domain.com
DNS Enumeration, Bruteforce and Passive Tools
# DNSRecon is a powerful enumeration tool
dnsrecon.py -h
dnsrecon.py -d domain.fr
# You can do dictionary-based enumeration
# -n : nameserver to use
# -t : type (brt = brute force domains and hosts using a given dictionary)
dnsrecon.py -n ns1.insecuredns.com -d insecuredns.com -D subdomains-top1mil-5000.txt -t brt
# dnsenum is another tool
dnsenum domain.fr
# Subdomain enumeration with Sublist3r (add -b for bruteforce)
python sublist3r.py -d domain.fr
# DNS enumeration and information gathering using fierce (not really passive)
fierce -dns domain.fr
# Aquatone is also a great tool
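# A minimal usage sketch (assuming the Go version of Aquatone, which reads hosts from stdin and takes screenshots):
cat hosts.txt | aquatone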
# Subfinder
# Subdomain discovery tool that discovers valid subdomains for websites by using passive online sources
https://github.com/projectdiscovery/subfinder
$ subfinder -d freelancer.com -o output.txt
$ subfinder -dL domains.txt -oD ~/path/to/output
# Subfinder can also read domains from stdin
$ cat domains.txt | subfinder
# You can also use ffuf to bruteforce subdomains
# Option 1: fuzz the Host header
$ ffuf -u https://mydomain.com -w my_wordlist -H "Host: FUZZ.mydomain.com"
# Option 2: fuzz the hostname in the URL directly
# But you'll have to wait for DNS timeouts on non-existent names
$ ffuf -u https://FUZZ.mydomain.com -w my_wordlist
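# Note for option 1: you usually need to filter out the default catch-all response, for example by response size (-fs); the value 4242 below is purely illustrative
$ ffuf -u https://mydomain.com -w my_wordlist -H "Host: FUZZ.mydomain.com" -fs 4242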
Linked Discovery (using BurpSuite Pro)
https://drive.google.com/file/d/1aG_qqRvNW-s5_8vvPk5rJiMSMeNL2uY9/view
# Using the spider, you can find lots of subdomains/roots by browsing the website
# Turn off passive scanning
# Set scope to advanced control and use a "keyword" of the target name (not a full FQDN), for example "twitch"
# Browse the site, then spider all hosts recursively
# To export:
# Select the domains in the Burp tree
# "Engagement Tools" --> "Analyze Target"
# Save the report as HTML
# Copy the hosts from the HTML
OWASP Amass
# The OWASP Amass tool suite is used to build a network map of the target
# For subdomain enumeration it relies on scraping data sources, recursive bruteforcing, crawling web archives, permuting names and reverse DNS sweeping
# Basic use (DNS lookups and name alterations)
amass -d example.com
# Bruteforce subdomain enumeration, printing the data source and IP address for each discovered name
amass -src -ip -brute -d example.com
# Passive mode
amass -passive -d example.com
# Importing Amass results into Maltego
amass -src -ip --active -brute -do owasp.json -d owasp.org
amass.viz -i owasp.json --maltego owasp.csv
→ Then import the CSV file into Maltego
SubDomainizer
https://github.com/nsonaniya2010/SubDomainizer
# SubDomainizer is a tool designed to find hidden subdomains and secrets present in a webpage, on Github, or in the external javascript files referenced by a given URL
# It also finds S3 buckets, CloudFront URLs and more in those JS files, which can be interesting when an S3 bucket is open for read/write, for subdomain takeover, and similar cases for CloudFront
# To find subdomains, S3 buckets and CloudFront URLs for a given single URL:
python3 SubDomainizer.py -u http://www.example.com
# To find subdomains from a given list of URLs (file):
python3 SubDomainizer.py -l list.txt
# To save the results in (output.txt) file:
python3 SubDomainizer.py -u https://www.example.com -o output.txt
# To give cookies:
python3 SubDomainizer.py -u https://www.example.com -c "test=1; test=2"
# To scan via Github:
python3 SubDomainizer.py -u https://www.example.com -o output.txt -gt <github_token> -g
# No SSL Certificate Verification:
python3 SubDomainizer.py -u https://www.example.com -o output.txt -gt <github_token> -g -k
Knockpy
# Knockpy is a python tool designed to enumerate subdomains on a target domain through a wordlist
# It is designed to scan for DNS zone transfers and to try to bypass wildcard DNS records automatically when they are enabled
# Knockpy now supports queries to VirusTotal subdomains; you can set the API_KEY within the config.json file
# Classic scan with the internal wordlist
knockpy domain.com
# Use a specific wordlist
knockpy domain.com -w wordlist.txt
# Resolve IP (accepts a domain name or an IP address)
knockpy -r domain.com
# Save output to CSV
knockpy -c domain.com
# export in JSON
knockpy -j domain.com
Permutation Scanning
# Another technique is to use permutations, alterations and mutations of known subdomains to find new ones
# Tools like altdns.py can do that
# -i : already known subdomains used as input
# -o : file that will contain the massive list of permuted subdomains
# -w : words used for the permutations
# -r : resolve each newly generated subdomain
# -s : file that will contain the results for the resolved permuted subdomains
altdns.py -i icann.domains -o data_output -w icann.words -r -s results_output.txt
Autonomous System (AS) Numbers
# You can use ASNs to identify netblocks belonging to a company
# Resolve the IP of a given domain, using dig or host
# Then look up the ASN associated with that IP (a worked example follows below)
https://asn.cymru.com/cgi-bin/whois.cgi (given an IP address)
https://bgp.he.net/ (given a domain name)
# Then you can use nmap NSE scripts to find the netblocks for an ASN
nmap --script targets-asn --script-args targets-asn.asn=17012 > netblocks.txt
# You will get network ranges that you can investigate further
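# Putting it together, a quick sketch (assuming the Team Cymru whois service; the IP below is only an example):
dig +short example.com
whois -h whois.cymru.com " -v 93.184.216.34"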
Zone Transfer
# A zone transfer is a DNS transaction where a DNS server gives a full or partial copy of its zone file to another DNS server
# If it is badly configured, anyone can initiate a zone transfer and thus access the zone file
dig +multi AXFR @ns1.insecuredns.com insecuredns.com
# Subjack is a tool used to automate subdomain takeover discovery (https://github.com/haccer/subjack)
./subjack -w subdomains.txt -t 100 -timeout 30 -o results.txt -ssl
DNSSEC Walking
# To prove that something does not exist, DNSSEC lists all the things that do exist
# If the list is proven, then everything not listed must not exist
# This list is created by the NSEC (or NSEC3) records
# Interesting side-effect: it allows anyone to list the zone content by following the linked list of NSEC records
→ zone walking
# Tools like ldns-walk can exploit this
ldns-walk @ns1.insecuredns.com insecuredns.com
ldns-walk insecuredns.com
# Some DNSSEC zones use NSEC3 → domain names are hashed to prevent this
# It's still possible to collect the hashes and crack them
# Tools like nsec3walker can do that
./collect icann.org > icann.org.collect
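# The collected hashes can then be cracked with the unhash program shipped with nsec3walker (sketch, assuming the default build names):
./unhash < icann.org.collect > icann.org.unhash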
cat icann.org.unhash | grep "icann" | awk '{print $2;}'
Projects Collecting Data
# Some projects collect this data and provide it to the community
# The Forward DNS (FDNS) dataset is part of Project Sonar
# This dataset is created by extracting domain names from a number of sources and then sending an ANY query for each domain
# You can download the dataset and parse it, but IT'S HUGE (20+ GB compressed, 300+ GB uncompressed)
# It seems to be one of the best ways to find subdomains
curl --silent https://scans.io/data/rapid7/sonar.fdns_v2/20170417-fdns.json.gz | pigz -dc | grep ".icann.org" | jq
wget https://opendata.rapid7.com/sonar.fdns_v2/2019-11-29-1574985929-fdns_a.json.gz
cat 20170417-fdns.json.gz | pigz -dc | grep ".target.org"| jq
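# A more precise filter sketch (assuming the FDNS JSON schema where each record has "name" and "value" fields):
cat 2019-11-29-1574985929-fdns_a.json.gz | pigz -dc | jq -r 'select(.name | endswith(".target.org")) | .name'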
# SonarSearch is a project built around the Rapid7 Sonar dataset
https://github.com/Cgboal/SonarSearch
# An API instance is available at
https://omnisint.io/
# Crobat is a CLI tool designed to query this API
$ crobat -h
Usage of crobat:
-r string
Perform reverse lookup on IP address or CIDR range. Supports files and quoted lists
-s string
Get subdomains for this value. Supports files and quoted lists
-t string
Get tlds for this value. Supports files and quoted lists
-u Ensures results are unique, may cause instability on large queries due to RAM requirements
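# Example: list subdomains for a given domain
$ crobat -s example.com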
ASN/Burp example
# Get ASN for your target
bgp.he.net
# Then you can use amass to find related top-level domains
amass intel -active -asn 361XXXX
# Then look on Crunchbase for acquisitions
# Link discovery in Burp
# BurpSuite history too
# Set the target, go on the main website, then look for requests made to different domains
# Select all and send them to the spider --> discover new ones
# Doing it from a VPN can be better to avoid IP bans
# Reverse whois / search by owner
# Test HTTP on the subdomains
# To get the subdomains running webservers
cat subdomains_list | httprobe
# Using aquatone to get screenshots of discovered sites
cat subdomains_servers | aquatone
# The OpenList Firefox extension can help to visualize websites
# Content Discovery
# Ffuf is one of the fastest web fuzzers, but it is not recursive
# Dirsearch is good for recursive scanning (a sketch follows the ffuf examples below)
https://github.com/ffuf/ffuf
ffuf -u site/FUZZ -w wordlist
# For multiple insertion points, give each keyword its own wordlist
ffuf -u site/FUZZ/HERE -w wordlist1:FUZZ -w wordlist2:HERE
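# Dirsearch sketch (assuming the python3 version; the wordlist and extensions are illustrative, -r enables recursion):
python3 dirsearch.py -u https://site.com -e php,html -w wordlist -r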