Saturday, May 31, 2014

Vulnerability Data into Elasticsearch

My day job has me focusing on Elasticsearch more these days. A while back I did a post on getting vulnerability data into ELSA. As a follow-up, I have been meaning to write a brief post on how to do the same with Elasticsearch. If you are not familiar with Elasticsearch, go check it out here. Its website describes it as "distributed restful search and analytics". It is often combined with Logstash and Kibana to form the "ELK" stack. The same reasons that made having vulnerability data alongside your event logs a good idea in ELSA also apply if you are using the ELK stack. I modified my existing script to take an input file from one of several vulnerability scanners and index the results in Elasticsearch.

Before we begin, note that my Python script makes use of the Elasticsearch Python client. I installed it via pip:

# pip install elasticsearch

I assume an index called vulns exists. You can create it by hitting the Elasticsearch API like this:
$ curl -XPUT http://localhost:9200/vulns
Different vulnerability scanners present timestamps in slightly different formats, so it is a good idea to map the time field appropriately. For more information, check the Elasticsearch docs here. This is a sample API call you could make:
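A mapping request along these lines sets a date format on the time field; the vuln type name, the time field name, and the date pattern are assumptions you would adjust to match your scanner's output:

```shell
# Illustrative mapping only: the "vuln" type, "time" field, and date
# pattern are assumptions; change them to match your scanner's output.
curl -XPUT http://localhost:9200/vulns/_mapping/vuln -d '{
  "vuln": {
    "properties": {
      "time": {
        "type": "date",
        "format": "yyyy/MM/dd HH:mm:ss"
      }
    }
  }
}'
```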

After the index is created, you can run the script with XML output from a vulnerability scanner as input.
python VulntoES.py -i nessus_report_test_home.nessus -e 192.168.1.183 -r nessus
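Under the hood, the script's work can be sketched roughly like this: parse each ReportItem out of the .nessus XML and index the resulting document into the vulns index. The field names, the vuln document type, and the helper names below are illustrative assumptions, not the script's exact schema:

```python
# Rough sketch of what a VulntoES.py-style importer does for a .nessus
# (XML v2) report. Field names and the "vuln" doc type are assumptions.
import xml.etree.ElementTree as ET


def parse_nessus(xml_text):
    """Yield one dict per ReportItem in a .nessus v2 report."""
    root = ET.fromstring(xml_text)
    for host in root.iter("ReportHost"):
        ip = host.get("name")
        for item in host.iter("ReportItem"):
            yield {
                "ip": ip,
                "port": item.get("port"),
                "protocol": item.get("protocol"),
                "severity": item.get("severity"),
                "plugin_name": item.get("pluginName"),
            }


def index_report(xml_text, es_host="192.168.1.183"):
    """Index every parsed finding; requires a reachable Elasticsearch."""
    from elasticsearch import Elasticsearch
    es = Elasticsearch([{"host": es_host}])
    for doc in parse_nessus(xml_text):
        es.index(index="vulns", doc_type="vuln", body=doc)
```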

I have created a very simple dashboard in Kibana to visualize some of the vulnerabilities.


The script and dashboard can be found on my GitHub page:

Comments:

  1. Hello

    How would you create a mapping to use for nmap import?

    Great job!!

    Cheers

  2. Only came across this this morning, exactly what I need!! Thanks man

