14.04 - Technologies extraction using URL of the website - Ask Ubuntu


I would like to know whether it is possible, using Ubuntu, to retrieve the technologies used to build a website, given its URL.
For example, given the URL:
https://www.wikipedia.org/

I want to know the technologies that were used to build the website.
Expected output:

PHP, HHVM, Varnish, AddThis, and many others.

Is there a fast way to get this done?
Please note that I have a file containing a list of websites, and I want to extract the web technologies of each website and write them to a file after its URL (line by line). Kindly let me know whether this is possible using an Ubuntu command or some software available on Ubuntu.

You may use the information-gathering tools that ship with the Kali or Parrot distributions.

  • nikto is one of them. I had tried it before, and it gives partial information. It is available in the Ubuntu repository too.

    ~$ whatis nikto
    nikto (1)            - scan web server for known vulnerabilities
    ~$ sudo apt-get install nikto
    ~$ sudo nikto -update
    ~$ nikto -tuning b -h www.wikipedia.org
    - Nikto v2.1.5
    ---------------------------------------------------------------------------
    + Target IP:          91.198.174.192
    + Target Hostname:    www.wikipedia.org
    + Target Port:        80
    + Start Time:         2016-11-14 09:22:30 (GMT1)
    ---------------------------------------------------------------------------
    + Server: varnish
    + IP address found in the 'x-client-ip' header. The IP is "105.107.105.185".
    + The anti-clickjacking X-Frame-Options header is not present.
    + Uncommon header 'x-client-ip' found, with contents: 105.107.105.185
    + Uncommon header 'x-cache' found, with contents: cp3041 int
    + Uncommon header 'x-varnish' found, with contents: 827655138
    + Uncommon header 'x-cache-status' found, with contents: int
    + Root page / redirects to: https://www.wikipedia.org/
    + No CGI Directories found (use '-C all' to force check all possible dirs)
    + Server banner has changed from 'varnish' to 'mw1187.eqiad.wmnet' which may suggest a WAF, load balancer or proxy is in place
    + Cookie GeoIP created without the httponly flag
    + Retrieved via header: 1.1 varnish-v4, 1.1 varnish-v4, 1.1 varnish-v4
    + Retrieved x-powered-by header: hhvm/3.3.0-static
    + Server leaks inodes via ETags, header found with file /, fields: 0xW/3b2 0x5369720eefb07
    + Uncommon header 'x-analytics' found, with contents: nocookies=1
    + Uncommon header 'backend-timing' found, with contents: D=236 t=1478774110870502
    + 269 items checked: 0 error(s) and 12 item(s) reported on remote host
    + End Time:           2016-11-14 09:23:21 (GMT1) (51 seconds)
    ---------------------------------------------------------------------------
    + 1 host(s) tested
  • whatweb is another such tool. It has an unfixed bug (an invalid multibyte escape error) on Ubuntu. To work around it:

    1. Open the encoding auto-detection library's file for editing:

      sudo nano /usr/lib/ruby/vendor_ruby/rchardet/universaldetector.rb
    2. Add # encoding: us-ascii as the first line of the file.


    Even with the workaround above, the output is not as clean as it is on Kali.

    ~$ whatis whatweb
    whatweb (1)          - web scanner to identify what websites are running
    ~$ whatweb www.wikipedia.org
    /usr/share/whatweb/lib/tld.rb:85: warning: key "2nd_level_registration" is duplicated and overwritten on line 85
    /usr/share/whatweb/lib/tld.rb:93: warning: key "2nd_level_registration" is duplicated and overwritten on line 93
    /usr/share/whatweb/lib/tld.rb:95: warning: key "2nd_level_registration" is duplicated and overwritten on line 95
    /usr/share/whatweb/plugins/wordpress.rb:436: warning: key "2.7-beta1" is duplicated and overwritten on line 453
    /usr/share/whatweb/lib/extend-http.rb:102:in `connect': Object#timeout is deprecated, use Timeout.timeout instead.
    http://www.wikipedia.org [301] Cookies[WMF-Last-Access], Country[NETHERLANDS][NL], HTTPServer[Varnish], HttpOnly[WMF-Last-Access], IP[91.198.174.192], RedirectLocation[https://www.wikipedia.org/], UncommonHeaders[x-varnish,x-cache-status,x-client-ip], Varnish
    /usr/share/whatweb/lib/extend-http.rb:102:in `connect': Object#timeout is deprecated, use Timeout.timeout instead.
    /usr/share/whatweb/lib/extend-http.rb:140:in `connect': Object#timeout is deprecated, use Timeout.timeout instead.
    https://www.wikipedia.org/ [200] Cookies[GeoIP,WMF-Last-Access], Country[NETHERLANDS][NL], Email[wikipedia-logo-v2@1.5x.png,wikipedia-logo-v2@2x.png,wikipedia_wordmark@1.5x.png,wikipedia_wordmark@2x.png,sprite-bookshelf_icons@1.5x.png,sprite-bookshelf_icons@2x.png,sprite-project-logos@1.5x.png,sprite-project-logos@2x.png], HTML5, HTTPServer[mw1253.eqiad.wmnet], HttpOnly[WMF-Last-Access], IP[91.198.174.192], MediaWiki, Script, Title[Wikipedia], UncommonHeaders[backend-timing,x-varnish,x-cache-status,strict-transport-security,x-analytics,x-client-ip], Varnish, Via-Proxy[1.1 varnish-v4, 1.1 varnish-v4, 1.1 varnish-v4], X-Powered-By[HHVM/3.3.0-static]

    Output on Kali:

    ~# whatweb https://www.wikipedia.org
    https://www.wikipedia.org [200 OK] Cookies[GeoIP,WMF-Last-Access], Country[NETHERLANDS][NL], Email[wikipedia-logo-v2@1.5x.png,wikipedia-logo-v2@2x.png,wikipedia_wordmark@1.5x.png,wikipedia_wordmark@2x.png,sprite-bookshelf_icons@1.5x.png,sprite-bookshelf_icons@2x.png,sprite-project-logos@1.5x.png,sprite-project-logos@2x.png], HTML5, HTTPServer[mw1253.eqiad.wmnet], HttpOnly[WMF-Last-Access], IP[91.198.174.192], MediaWiki, Script, Strict-Transport-Security[max-age=31536000; includeSubDomains; preload], Title[Wikipedia], UncommonHeaders[backend-timing,x-varnish,x-cache-status,x-analytics,x-client-ip], Varnish, Via-Proxy[1.1 varnish-v4, 1.1 varnish-v4, 1.1 varnish-v4], X-Powered-By[HHVM/3.3.0-static]
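Since the goal is a short list of technologies rather than a full vulnerability report, a nikto report can be filtered down to the lines that name server software. This is a minimal sketch: the sample file below is a hand-copied excerpt of the scan output shown above, and the filename `nikto_sample.txt` is an assumption.

```shell
#!/bin/sh
# Filter a nikto report down to the technology-revealing lines.
# nikto_sample.txt is a hypothetical excerpt of the scan shown above.
cat > nikto_sample.txt <<'EOF'
+ Server: varnish
+ Retrieved via header: 1.1 varnish-v4, 1.1 varnish-v4, 1.1 varnish-v4
+ Retrieved x-powered-by header: hhvm/3.3.0-static
+ Cookie GeoIP created without the httponly flag
EOF
# Keep only the lines mentioning the Server or X-Powered-By headers
grep -Ei 'server:|x-powered-by' nikto_sample.txt
```

Against a live host the same filter can be piped directly, e.g. `nikto -tuning b -h www.wikipedia.org | grep -Ei 'server:|x-powered-by'`.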
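Step 2 of the whatweb workaround (adding the magic comment as the first line) can also be scripted with GNU sed instead of editing the file in nano. The sketch below works on a scratch copy so nothing system-wide is touched; `universaldetector_copy.rb` and its contents are stand-ins, and applying the fix for real means pointing sed at `/usr/lib/ruby/vendor_ruby/rchardet/universaldetector.rb` with sudo.

```shell
#!/bin/sh
# Prepend "# encoding: us-ascii" to a file using sed's insert command.
cp_target=universaldetector_copy.rb            # hypothetical scratch file
printf 'module CharDet\nend\n' > "$cp_target"  # stand-in file contents
sed -i '1i # encoding: us-ascii' "$cp_target"  # '1i' inserts before line 1
head -n 1 "$cp_target"                         # shows the new first line
```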
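For the batch requirement in the question (a file of URLs, with the detected technologies written after each URL, line by line), a plain shell loop over whatweb is enough. This is a sketch under assumptions: the filenames `sites.txt` and `results.txt` are made up, and the loop degrades to a placeholder note when whatweb is not installed.

```shell
#!/bin/sh
# Run whatweb over every URL in sites.txt and append "URL: result"
# lines to results.txt (both filenames are assumptions).
printf '%s\n' 'https://www.wikipedia.org' 'https://askubuntu.com' > sites.txt
: > results.txt
while IFS= read -r url; do
    if command -v whatweb >/dev/null 2>&1; then
        # whatweb's default (brief) output summarises each target
        result=$(whatweb "$url" 2>/dev/null)
    else
        result="(whatweb not installed)"   # graceful fallback for the sketch
    fi
    printf '%s: %s\n' "$url" "$result" >> results.txt
done < sites.txt
cat results.txt
```

The same loop would work with nikto by swapping the command inside the `if`; whatweb is the better fit here because its one-line-per-site summary maps directly onto the requested output format.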
