Project: jaeles-project/gospider
Repository: https://github.com/jaeles-project/gospider
Language: Go (100.0%)

GoSpider
GoSpider - Fast web spider written in Go

Want to integrate GoSpider painlessly into your recon workflow? This project is part of the Osmedeus Engine. Check out how it was integrated at @OsmedeusEngine.

Installation
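Assuming a working Go toolchain (1.17+), a module-aware install of the CLI would look like the following; the `@latest` suffix follows the standard `go install` convention, and you can pin a release tag instead for reproducibility:

```shell
# Fetch, build, and place the gospider binary into $GOPATH/bin (or $GOBIN)
go install github.com/jaeles-project/gospider@latest
```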
Features
Showcases

Usage
Fast web spider written in Go - v1.1.5 by @thebl4ckturtle & @j3ssiejjj
Usage:
gospider [flags]
Flags:
-s, --site string Site to crawl
-S, --sites string Site list to crawl
-p, --proxy string Proxy (Ex: http://127.0.0.1:8080)
-o, --output string Output folder
-u, --user-agent string User Agent to use
web: random web user-agent
mobi: random mobile user-agent
or you can set your special user-agent (default "web")
--cookie string Cookie to use (testA=a; testB=b)
-H, --header stringArray Header to use (Use multiple flag to set multiple header)
--burp string Load headers and cookie from burp raw http request
--blacklist string Blacklist URL Regex
--whitelist string Whitelist URL Regex
--whitelist-domain string Whitelist Domain
-t, --threads int Number of threads (Run sites in parallel) (default 1)
-c, --concurrent int The number of the maximum allowed concurrent requests of the matching domains (default 5)
-d, --depth int MaxDepth limits the recursion depth of visited URLs. (Set it to 0 for infinite recursion) (default 1)
-k, --delay int Delay is the duration to wait before creating a new request to the matching domains (second)
-K, --random-delay int RandomDelay is the extra randomized duration to wait added to Delay before creating a new request (second)
-m, --timeout int Request timeout (second) (default 10)
-B, --base Disable all and only use HTML content
--js Enable linkfinder in javascript file (default true)
--subs Include subdomains
--sitemap Try to crawl sitemap.xml
--robots Try to crawl robots.txt (default true)
-a, --other-source Find URLs from 3rd party (Archive.org, CommonCrawl.org, VirusTotal.com, AlienVault.com)
-w, --include-subs Include subdomains crawled from 3rd party. Default is main domain
-r, --include-other-source Also include other-source's urls (still crawl and request)
--debug Turn on debug mode
--json Enable JSON output
-v, --verbose Turn on verbose
-l, --length Turn on length
-L, --filter-length Turn on length filter
-R, --raw Turn on raw
-q, --quiet Suppress all the output and only show URL
--no-redirect Disable redirect
--version Check version
-h, --help help for gospider
Example commands

Quiet output
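Sketched from the `-q` flag documented above, a quiet run suppresses all logging and prints only the discovered URLs (`example.com` is a placeholder target):

```shell
gospider -q -s "https://example.com/"
```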
Run with single site
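An illustrative single-site crawl assembled from the flags above; `output` is a hypothetical folder name, `-c 10` allows 10 concurrent requests, and `-d 1` limits recursion depth to 1:

```shell
gospider -s "https://example.com/" -o output -c 10 -d 1
```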
Run with site list
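Using the `-S` flag documented above, the same crawl can be driven by a file of targets (one URL per line; `sites.txt` is a placeholder filename):

```shell
gospider -S sites.txt -o output -c 10 -d 1
```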
Run with 20 sites at the same time with 10 bot each site
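Per the flag descriptions above, `-t` runs sites in parallel while `-c` caps concurrent requests per matching domain, so "20 sites with 10 bots each" would plausibly combine them like this:

```shell
gospider -S sites.txt -o output -c 10 -d 1 -t 20
```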
Also get URLs from 3rd party (Archive.org, CommonCrawl.org, VirusTotal.com, AlienVault.com)
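Adding the `-a`/`--other-source` flag documented above pulls extra URLs from the listed third-party archives alongside the normal crawl:

```shell
gospider -s "https://example.com/" -o output -c 10 -d 1 --other-source
```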
Also get URLs from 3rd party (Archive.org, CommonCrawl.org, VirusTotal.com, AlienVault.com) and include subdomains
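Per the `-w`/`--include-subs` flag above, third-party results can be widened from the main domain to its subdomains:

```shell
gospider -s "https://example.com/" -o output -c 10 -d 1 --other-source --include-subs
```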
Use custom header/cookies
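Combining the `-H` and `--cookie` flags documented above (repeat `-H` for each header; the header and cookie values here are placeholders):

```shell
gospider -s "https://example.com/" -o output -c 10 -d 1 \
  -H "Accept: */*" -H "X-Test: test" --cookie "testA=a; testB=b"
```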
Blacklist URL/file extension. P/s: gospider already blacklists some common static-file extensions by default.
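Since `--blacklist` takes a URL regex (per the flag list above), extensions can be excluded with an alternation pattern; the extensions shown are arbitrary examples:

```shell
gospider -s "https://example.com/" -o output -c 10 -d 1 --blacklist ".(woff|pdf)"
```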
Show and blacklist responses by file length.
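Combining the `-l`/`--length` and `-L`/`--filter-length` flags documented above; the comma-separated byte lengths are illustrative values to filter out:

```shell
gospider -s "https://example.com/" -o output -c 10 -d 1 --length --filter-length "6871,24432"
```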
License
Donation