// 2026-01-05
// ID: REF-Hakrawler Command List
Hakrawler Command List
Hakrawler is a simple, fast web crawler written in Go that discovers endpoints and assets for a target site. It reads URLs from stdin, which makes it easy to chain with other command-line tools.
Top 10 Useful Commands
1. Basic Pipe
echo http://example.com | hakrawler
Explanation: Basic usage. Pipe a URL in, get URLs out.
2. Depth
echo http://example.com | hakrawler -d 3
Explanation: Crawl up to 3 levels deep (the default depth is 2).
3. Plain Output
echo http://example.com | hakrawler -plain
Explanation: Only output the URLs (no prefixes like [href]).
4. Show Source
echo http://example.com | hakrawler -s
Explanation: Show where each URL was found (e.g. href, script, form) alongside the URL.
5. Header Usage
echo http://example.com | hakrawler -h "Cookie: admin=1"
Explanation: Send a custom header with every request, such as a session cookie for authenticated crawling. Multiple headers can be separated with two semicolons (;;).
6. Subdomains
echo http://example.com | hakrawler -subs
Explanation: Include subdomains of the target in the crawl scope and output, rather than only the supplied domain.
7. Unique Output
echo http://example.com | hakrawler -u
Explanation: Deduplicate the output so each URL is printed only once.
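When combining output from several runs or targets, deduplication can also be done after the fact with standard shell tools; a minimal sketch (the hakrawler stage is illustrative):

```shell
# Post-process crawler output: sort and drop duplicate URLs
echo http://example.com | hakrawler | sort -u
```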
8. Input List
cat urls.txt | hakrawler
Explanation: Crawl a large list of targets; each line of urls.txt should be a full URL including its scheme.
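If the input file holds bare domains rather than full URLs, a scheme can be prefixed on the fly; a minimal sketch, assuming a hypothetical domains.txt with one bare domain per line:

```shell
# Prefix each bare domain with https:// before piping to the crawler
sed 's#^#https://#' domains.txt | hakrawler
```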
9. Filter JS
echo http://example.com | hakrawler | grep "\.js"
Explanation: A common way to pull out JavaScript files. The dot is escaped because an unescaped dot in grep matches any character.
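An anchored extended regex goes one step further and also excludes lookalikes such as .json; a sketch of the same filter (the hakrawler stage is illustrative):

```shell
# Match URLs ending in .js, optionally followed by a query string,
# while excluding extensions like .json
echo http://example.com | hakrawler | grep -E '\.js($|\?)'
```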
10. Insecure SSL
echo https://example.com | hakrawler -insecure
Explanation: Skip TLS certificate verification, e.g. for targets with self-signed certificates.
The Most Powerful Command
cat domains.txt | hakrawler -d 2 -plain -subs | grep "admin"
Explanation: Mass crawl a list of domains, go 2 levels deep, extract subdomains, and immediately grep for interesting "admin" endpoints.
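The same pipeline can keep the full crawl on disk while still surfacing interesting hits live; a minimal sketch (flags as in the command above; results.txt and the keyword list are illustrative):

```shell
# Save the complete crawl to results.txt while grepping
# case-insensitively for interesting endpoints as they stream past
cat domains.txt | hakrawler -d 2 -plain -subs | tee results.txt | grep -iE 'admin|login'
```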