commands // 2026-01-05 // ID: REF-Gospider Command List

Gospider Command Guide

Gospider is a fast web spider written in Go. It crawls websites, JavaScript files, and sitemaps to discover URLs and subdomains.

Top 10 Useful Commands

1. Basic Crawl

gospider -s "http://example.com/"

Explanation: Spiders a single site.
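If you only want the discovered URLs themselves, gospider's -q (quiet) flag suppresses everything except URLs, which pipe cleanly into standard tools. A minimal sketch, assuming gospider is installed and example.com is in scope:

```shell
# Quiet mode prints bare URLs; dedupe and save them for later tooling.
gospider -s "http://example.com/" -q | sort -u > urls.txt
```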

2. Site List

gospider -S sites.txt

Explanation: Spiders a list of URLs from a file.
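The list file is plain text, one URL per line. A quick way to build one (the hostnames below are hypothetical placeholders; the file name matches the command above):

```shell
# One target URL per line; gospider -S reads this file.
printf 'http://example.com/\nhttp://shop.example.com/\n' > sites.txt
wc -l < sites.txt
```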

3. Threads

gospider -s "http://example.com/" -t 20

Explanation: Sets the number of threads, i.e. how many sites are crawled in parallel. Per-domain request concurrency is a separate flag (-c).

4. Output Directory

gospider -s "http://example.com/" -o output/

Explanation: Saves results into specific directory.
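Inside the output folder, gospider writes one result file per site. The sample below is synthetic and its line format is assumed from typical runs (tags such as [subdomains] may vary); the grep/awk pipeline shows how to pull discovered subdomains back out of saved results:

```shell
mkdir -p output
# Synthetic stand-in for a saved gospider result file (format assumed):
cat > output/example_com <<'EOF'
[url] - [code-200] - http://example.com/about
[javascript] - http://example.com/static/app.js
[subdomains] - http://blog.example.com
EOF

# Extract every discovered subdomain, deduplicated:
grep -h '\[subdomains\]' output/* | awk '{print $3}' | sort -u
```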

5. Depth

gospider -s "http://example.com/" -d 3

Explanation: Sets crawling depth (how many links deep to follow).

6. Other Sources

gospider -s "http://example.com/" -a

Explanation: Also queries third-party sources (Archive.org/Wayback, CommonCrawl, VirusTotal, AlienVault) for URLs.

7. Robots.txt

gospider -s "http://example.com/" --robots

Explanation: Crawls robots.txt for extra paths (on by default). Note: -r is gospider's short flag for --include-other-source, not robots.

8. Sitemap

gospider -s "http://example.com/" --sitemap

Explanation: Enables crawling of sitemap.xml.

9. Cookie

gospider -s "http://example.com/" --cookie "cookie=value"

Explanation: Sets cookies for authenticated crawling. Note: -c is the short flag for --concurrent (max concurrent requests), so the long flag --cookie is required here. Multiple cookies use the form "a=1; b=2".

10. No Redirects

gospider -s "http://example.com/" --no-redirect

Explanation: Do not follow redirects.

The Most Powerful Command

gospider -S urls.txt -o results --cookie "cookie=session" -d 2 --other-source --include-subs

Explanation: Crawls every site in the list two levels deep, queries third-party sources, includes subdomains, and sends a session cookie for authenticated access, saving everything under results/.
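Once a crawl like this finishes, the URL list usually gets triaged. A small, self-contained sketch (urls.txt stands in for real gospider output, and the flagged extensions are just examples):

```shell
# Synthetic URL list standing in for deduped gospider output:
cat > urls.txt <<'EOF'
http://example.com/index.html
http://example.com/backup.sql
http://example.com/static/app.js
EOF

# Flag file types that often deserve a closer look:
grep -E '\.(sql|bak|env|old)$' urls.txt
```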