Log processing with ELK


Module C10

The training “Processing logs with Hadoop and ELK” is intended to train developers to process logs with Hadoop and the Elasticsearch, Logstash, Kibana (ELK) suite.

This training is aimed at developers with a solid knowledge of Java who are comfortable with Java development tools such as Eclipse or IntelliJ and Maven, and who, if possible, also have some familiarity with Scala and Spark (for example through the C17 curriculum).


Program

Day 1: Tour of the relevant open source tools

  • A refresher on Hadoop and the open source tools around Hadoop
  • Flume, NiFi, Logstash: open source tools for ingesting logs, with a hands-on lab
  • Introduction to Kafka, a message-oriented middleware, with a hands-on lab (a minimal producer sketch follows this list)
  • Storm and Spark Streaming: managing the real-time aspects
  • Log-island: a thin, open source layer for handling real-time messages
  • Elasticsearch, Solr, and other log-aware databases
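
As a taste of the Kafka hands-on lab, here is a minimal Java producer that pushes one raw log line onto a topic. The broker address (localhost:9092), the topic name ("logs") and the sample log line are illustrative assumptions, not values prescribed by the course.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class LogLineProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Assumed local broker; adjust to the cluster used in the lab.
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // "logs" is a hypothetical topic name; the key carries the host name.
                producer.send(new ProducerRecord<>("logs", "web-01",
                        "2020-06-15T10:21:03 INFO nginx request served in 12 ms"));
            }
        }
    }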

Day 2: Log ingestion with ELK

  • Introduction to Elasticsearch
  • Introduction to Logstash
  • Introduction to Kibana
  • Using the ELK (Elasticsearch, Logstash, Kibana) suite to ingest logs into Elasticsearch, with a hands-on lab, as sketched below
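
To give a feel for what ends up in Elasticsearch, the sketch below indexes a single JSON log event from Java using the high-level REST client (Elasticsearch 7.x assumed). The lab itself relies on Logstash for ingestion; the host, port and index name "logs-demo" here are illustrative assumptions.

    import org.apache.http.HttpHost;

    import org.elasticsearch.action.index.IndexRequest;
    import org.elasticsearch.client.RequestOptions;
    import org.elasticsearch.client.RestClient;
    import org.elasticsearch.client.RestHighLevelClient;
    import org.elasticsearch.common.xcontent.XContentType;

    public class LogEventIndexer {
        public static void main(String[] args) throws Exception {
            // Assumed single-node Elasticsearch on the default HTTP port.
            try (RestHighLevelClient client = new RestHighLevelClient(
                    RestClient.builder(new HttpHost("localhost", 9200, "http")))) {

                String logEvent = "{\"@timestamp\":\"2020-06-15T10:21:03\","
                        + "\"level\":\"INFO\",\"message\":\"nginx request served in 12 ms\"}";

                // "logs-demo" is a hypothetical index name.
                IndexRequest request = new IndexRequest("logs-demo")
                        .source(logEvent, XContentType.JSON);
                client.index(request, RequestOptions.DEFAULT);
            }
        }
    }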

Day 3: Event and real-time log processing with Kafka and Spark Streaming

  • Using the log-island open source software to perform complex event processing with Kafka and Spark Streaming, with a hands-on lab; a minimal Kafka plus Spark Streaming sketch follows
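
As an illustration of the Kafka plus Spark Streaming pattern used in this lab (leaving log-island itself aside), the sketch below consumes a hypothetical "logs" topic and counts the ERROR lines in each micro-batch. The broker address, group id, topic name and batch interval are assumptions made for the example.

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    public class ErrorLogCounter {
        public static void main(String[] args) throws Exception {
            SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("ErrorLogCounter");
            // 10-second micro-batches, an arbitrary choice for the sketch.
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

            Map<String, Object> kafkaParams = new HashMap<>();
            kafkaParams.put("bootstrap.servers", "localhost:9092"); // assumed broker
            kafkaParams.put("key.deserializer", StringDeserializer.class);
            kafkaParams.put("value.deserializer", StringDeserializer.class);
            kafkaParams.put("group.id", "log-demo");                // assumed group id
            kafkaParams.put("auto.offset.reset", "latest");

            // "logs" is a hypothetical topic carrying raw log lines.
            JavaInputDStream<ConsumerRecord<String, String>> stream =
                    KafkaUtils.createDirectStream(
                            jssc,
                            LocationStrategies.PreferConsistent(),
                            ConsumerStrategies.<String, String>Subscribe(
                                    Collections.singletonList("logs"), kafkaParams));

            // Count the ERROR lines seen in each micro-batch and print the result.
            stream.map(ConsumerRecord::value)
                  .filter(line -> line.contains("ERROR"))
                  .count()
                  .print();

            jssc.start();
            jssc.awaitTermination();
        }
    }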

Prerequisites: Solid knowledge of Java and the related development environments, and minimal knowledge of Scala and Spark