Here are some of the most important features of Web-Sorrow:
- CMS (Content Management System) detection
- Port scanning
- Login page scanning
- Proxy support
- Bruteforce (Subdomains, Files, and Directories)
- Error bagging
- Standard set of scans: directory indexing, banner grabbing, language detection, robots.txt, HTTP 200 response testing, thumbs.db scanning, and more
perl Wsorrow.pl [HOST OPTIONS] [SCAN(s)] [SCAN SETTING(s)]

HOST OPTIONS:
-host [host] -- Defines the host(s) to scan: a single host, a
                list separated by semicolons, 1.1.1.1-100 type
                ranges, and 1.1.1.* type ranges. The same numeric
                range syntax can also be used for domains
-port [port num] -- Defines port number to use (Default is 80)
-proxy [ip:port] -- Use an HTTP, HTTPS, or gopher proxy server
-S -- Standard set of scans including: aggressive directory indexing,
      banner grabbing, language detection, robots.txt,
      HTTP 200 response testing, Apache user enumeration, SSL cert
      inspection, mobile page testing, sensitive item scanning,
      thumbs.db scanning, content negotiation, and non-port-80
      HTTP port sweeps
-auth -- Scan for login pages, admin consoles, and email webapps
-Cp [dp | jm | wp | all] -- Scan for CMS plugins.
      dp = Drupal, jm = Joomla, wp = WordPress
-Fd -- Scan for common interesting files and dirs (brute-force)
-Sfd -- A very small files-and-dirs enumeration (for the sake of time)
-Sd -- Brute-force subdomains (the host given must be a domain, not an IP)
-Ws -- Scan for web services on the host, such as: CMS version info,
       blogging services, favicon fingerprints, and hosting provider
-Db -- Brute-force directories with the big DirBuster database
-Df [option] -- Scan for default files. Platforms/options: Apache,
       Frontpage, IIS, Oracle9i, Weblogic, Websphere,
       MicrosoftCGI, all (enables all)
-ninja -- A lightweight, stealthy scan that uses bits and
      pieces from other scans (it is not recommended to combine it
      with other scans if you want to be stealthy; see readme.txt)
-fuzzsd -- Fuzz every found file for Source Disclosure
-e -- Everything: run all scans
-intense -- Like -e but with no brute-forcing
-I -- Passively scan for interesting strings in responses, such as:
      emails, WordPress dirs, CGI dirs, SSI, Facebook fbids,
      and much more (results may contain partial HTML)
-dp -- Do passive tests on requests: banner grabbing, directory indexing,
       non-200 HTTP statuses, strings in error pages,
       and passive web services detection
-flag [txt] -- Report when this text shows up in responses
-ua [ua] -- User agent to use; put it in quotes (default is Firefox on Linux)
-Rua -- Generate a new random UserAgent per request
-R -- Only request HTTP headers via ranges requests.
      This is much faster, but some features and capabilities
      may not work with this option. It's perfect when
      you only want to know whether something exists or not,
      as in -auth or -Fd
-gzip -- Compress HTTP responses from the host for speed. Some banner
      grabbing will not work
-d [dir] -- Only scan within this directory
-https -- Use https (ssl) instead of http
-nr -- Don't do response analysis, i.e. false-positive testing and
      interesting-header checks (other than banner grabbing). Use -nr
      if you want your scan to be less verbose
-Shadow -- Request pages from the Google cache instead of from the host
      (mostly for -I; otherwise it's unreliable)
-die -- Stop scanning host if it appears to be offline
-reject -- Treat this HTTP status code as a 404 error
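The 1.1.1.1-100 and 1.1.1.* host ranges accepted by -host can be pictured roughly like this. This is a hypothetical Python sketch for illustration only; Web-Sorrow itself is written in Perl, and the expand_hosts function is not part of the tool.

```python
# Illustrative sketch (NOT Web-Sorrow's code) of expanding the host
# range syntax that -host accepts into a flat list of targets.

def expand_hosts(spec):
    """Expand a semicolon-separated host spec into individual hosts."""
    hosts = []
    for part in spec.split(";"):
        part = part.strip()
        if part.endswith(".*"):
            # 1.1.1.* -> 1.1.1.1 .. 1.1.1.254
            prefix = part[:-1]                      # keep the trailing dot
            hosts += [f"{prefix}{i}" for i in range(1, 255)]
        elif "-" in part.rsplit(".", 1)[-1]:
            # 1.1.1.1-100 -> 1.1.1.1 .. 1.1.1.100
            prefix, last = part.rsplit(".", 1)
            start, end = (int(x) for x in last.split("-"))
            hosts += [f"{prefix}.{i}" for i in range(start, end + 1)]
        else:
            hosts.append(part)                      # plain host or domain
    return hosts

print(expand_hosts("1.1.1.1-3; example.com"))
# -> ['1.1.1.1', '1.1.1.2', '1.1.1.3', 'example.com']
```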
EXAMPLES:

perl Wsorrow.pl -host scanme.nmap.org -S
perl Wsorrow.pl -host nyan.cat -Fd -fuzzsd
perl Wsorrow.pl -host nationalcookieagency.mil -Cp dp,jm -ua "script w/ the munchies"
perl Wsorrow.pl -host chatrealm.us -d /wordpress -Cp wp
perl Wsorrow.pl -host 18.104.22.168 -port 8080 -proxy 22.214.171.124:3128 -S -Ws -I
NOTES:
- The -ninja scan doesn't make other scans stealthy; it is itself an
  experiment that makes use of very few requests.
- When using -Cp you may specify one or more CMS plugin options, for
  instance: -Cp wp,dp or -Cp wp;dp. It doesn't matter what character
  you separate the options with.
- When using -ua, quote the user agent if it contains whitespace.
- If you use -e together with other scans, those scans will run twice.
- To log results to a file: perl Wsorrow.pl -host host.com -S -I > logfile.txt
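The idea behind -R is that a request carrying a Range header asking for almost no body still gets back the status line and headers, which is all an existence check needs. A minimal Python sketch of such a request follows; this is an illustration of the technique, not Web-Sorrow's implementation, and build_ranges_request is a hypothetical name.

```python
# Illustrative sketch of a "ranges request": ask the server for only
# byte 0 of the body so the reply is essentially just headers.

def build_ranges_request(host, path="/", user_agent="Mozilla/5.0"):
    """Build a raw HTTP/1.1 GET request that asks for only byte 0."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"User-Agent: {user_agent}\r\n"
        f"Range: bytes=0-0\r\n"   # a server honoring ranges replies 206
        f"Connection: close\r\n"
        f"\r\n"
    )

print(build_ranges_request("scanme.nmap.org", "/robots.txt"))
```

A server that honors ranges answers 206 Partial Content with a one-byte body; one that ignores them sends the full body, which is why some features (such as banner grabbing on page content) may not work under -R.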