Kibana 7 Quick Start Guide

Introducing Kibana

Kibana is a dashboard tool that's easy to use and works closely with Elasticsearch. We can use Kibana for different use cases, such as system monitoring and application monitoring. Kibana isn't just a visualization tool; it also creates a complete monitoring ecosystem when we leverage the power of the Elastic Stack. Here's a small example: you're working on a project where you can't tolerate any outage, be it due to the database, the application, system-related issues, or anything related to the application's performance. In a traditional monitoring system, you can monitor system performance, application logs, and so on. But with Kibana and the Elastic Stack, we can do the following:

  • Configure Beats to monitor system metrics, database metrics, and log metrics
  • Configure APM to monitor your application metrics and issues if your application platform is supported
  • Configure the JDBC plugin of Logstash to pull RDBMS data into Elasticsearch to make it available to Kibana for creating visualizations on KPIs
  • There are different third-party plugins that help us to get data from other sources; for example, you can use the Twitter plugin to get Twitter feeds
  • You can create alerts for certain thresholds, so that whenever a threshold is crossed you're notified and don't have to monitor the application continuously
  • You can apply machine learning on your data to detect anomalies or forecast future trends by analyzing the current dataset

Elastic Stack

Kibana with Elastic Stack can be used to fetch data from different sources and filter, process, and analyze it to create meaningful dashboards. Elastic Stack has the following components:

  • Elasticsearch: We can store data in Elasticsearch.
  • Logstash: A data pipeline that we can use to read data from various sources and write it to various destinations. It also provides a feature to filter the input data before sending it to the output.
  • Kibana: A graphical user interface that we can use to do a lot of things, which I will cover in this chapter.
  • Beats: Lightweight data shippers that sit on different servers and send data to Elasticsearch directly or via Logstash:
    • Filebeat
    • Metricbeat
    • Packetbeat
    • Auditbeat
    • Winlogbeat
    • Heartbeat

The following diagram shows how Elastic Stack works:

In the preceding diagram, we have three different servers on which we have installed and configured Beats. These Beats are shipping data to Elasticsearch directly or via Logstash. Once this data is pushed into Elasticsearch, we can analyze, visualize, and monitor the data in Kibana. Let's discuss these components in detail; we're going to start with Elasticsearch.

Elasticsearch

Elasticsearch is a full-text search engine that's primarily used for searching. It can also be used as a NoSQL database and analytics engine. Elasticsearch is basically schema-less and works in near-real-time. It has a RESTful interface, which helps us to interact with it easily from multiple interfaces. Elasticsearch supports different ways of importing various types of structured or unstructured data; it handles all types of data because of its schema-less behavior, and it's quite easy to scale. Clients are available for the following languages:

  • Java
  • PHP
  • Perl
  • Python
  • .NET
  • Ruby
  • JavaScript
  • Groovy

Its query API is quite robust and we can execute different types of queries, such as boosting some fields over other fields, writing fuzzy queries, searching on single or multiple fields, and applying Boolean or wildcard searches. Aggregation is another important feature of Elasticsearch, which helps us to aggregate different types of data; it has multiple types of aggregations, such as metric aggregation, bucket aggregation, and term aggregation.


In fuzzy queries, we can match words even if there's no exact match for the spelling. For example, if we search for a word with the wrong spelling, we can still get the correct result using fuzzy search.
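
As a minimal sketch of such a query (the books index, the title field, and the local default host are assumptions for illustration, not part of any required setup), a fuzzy search for a misspelled term might look like this:

curl -X GET "localhost:9200/books/_search" -H 'Content-Type: application/json' -d'
{
  "query": {
    "fuzzy": {
      "title": {
        "value": "kibnaa",
        "fuzziness": "AUTO"
      }
    }
  }
}'

Even though kibnaa is misspelled, documents whose title contains kibana can still be returned, because the fuzzy query allows a limited number of character edits.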

The architecture of Elasticsearch has the following components:

  • Cluster: A collection of one or more nodes that work together is known as a cluster. By default, the cluster name is elasticsearch, which we can change to any unique name.
  • Node: A node represents a single Elasticsearch server, which is identified by a universally unique identifier (UUID).
  • Index: A collection of documents where each document in the collection has a common attribute.
  • Type: A logical partition of the index to store more than one type of document. Type was supported in previous versions and is deprecated from 6.0.0 onward.
  • Document: A single record in Elasticsearch is known as a document.
  • Shard: We can subdivide an Elasticsearch index into multiple pieces, which are called shards. When creating an index, we can specify the number of shards required.

Elasticsearch is primarily used to store and search data in the Elastic Stack; Kibana picks up this data from Elasticsearch and analyzes or visualizes it in the form of charts and more, which can be combined to create dashboards.
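
To make these concepts concrete, here is a minimal sketch that creates an index with an explicit number of shards and then indexes a single document; the books index name, the field, and the local default host are illustrative assumptions, and the _doc endpoint shown applies to recent 6.x and 7.x releases:

curl -X PUT "localhost:9200/books" -H 'Content-Type: application/json' -d'
{
  "settings": {
    "number_of_shards": 3,
    "number_of_replicas": 1
  }
}'

curl -X POST "localhost:9200/books/_doc" -H 'Content-Type: application/json' -d'
{
  "title": "Kibana 7 Quick Start Guide"
}'

The first command creates the index and fixes its shard count; the second stores a document, which Elasticsearch distributes across those shards.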

Logstash

Logstash is a data pipeline that can take data input from various sources, filter it, and output it to various sources; these sources can be files, Kafka, or databases. Logstash is a very important tool in the Elastic Stack as it's primarily used to pull data from various sources and push it to Elasticsearch; from there, Kibana can use that data for analysis or visualization. We can ingest any type of data using Logstash, structured or unstructured, from various sources, such as the internet. The data can be transformed using Logstash's filter option, which has different plugins to play with different sets of data. For example, if we get an IP address in our data, the GeoIP plugin can add the geolocation for that IP address, and in the output we get additional geolocation information, which can then be used in Kibana to plot a map.

The following expression shows us an example of a Logstash configuration file:

input
{
  file
  {
    path => "/var/log/apache2/access.log"
  }
}
filter
{
  grok
  {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output
{
  elasticsearch
  {
    hosts => "localhost"
  }
}

In the preceding expression, we have three sections: input, filter, and output. In the input section, we're reading the Apache access log file data. The filter section extracts the Apache access log data into different fields, using the grok filter option. The output section is quite straightforward, as it pushes the data to the local Elasticsearch cluster. We can configure the input and output sections to read from or write to different sources, and we can apply different plugins in the filter section to transform the input data; for example, we can mutate a field, transform a field value, or add geolocation from an IP address.


Grok is a tool that we can use to generate structured and queryable data by parsing unstructured data.
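
As a sketch of the filter plugins mentioned above, the following filter section combines the geoip and mutate plugins with grok; the clientip and verb fields are those produced by the COMBINEDAPACHELOG pattern, and the rest is an illustrative assumption rather than a required configuration:

filter
{
  grok
  {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip
  {
    source => "clientip"
  }
  mutate
  {
    uppercase => [ "verb" ]
  }
}

Here the geoip plugin enriches each event with location fields derived from the client IP, and mutate uppercases the HTTP verb field.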

Kibana

In the Elastic Stack, Kibana is mainly used to provide the graphical user interface, which we use to do multiple things. When Kibana was first released, we just used it to create charts and histograms, but with each update Kibana has evolved, and now it has lots of killer features that make it stand out from the crowd. There are many features in Kibana, but the key ones are as follows:

  • Discover your data by exploring it
  • Analyze your data by applying different metrics
  • Visualize your data by creating different types of charts
  • Apply machine learning on your data to detect anomalies and future trends
  • Monitor your application using APM
  • Manage users and roles
  • A console to run Elasticsearch expressions
  • Play with time-series data using Timelion
  • Monitor your Elastic Stack using Monitoring

Application Performance Monitoring (APM) is built on top of the Elastic Stack and is used to monitor applications and software services in real time. We'll look at APM in more detail in Chapter 6, Monitoring Applications with APM.

In this way, there are different use cases that can be handled well using Kibana. I'm going to explain each of them in later chapters.

Beats

Beats are single-purpose, lightweight data shippers that we use to get data from different servers. Beats can be installed on the servers as a lightweight agent to send system metrics, or process or file data to Logstash or Elasticsearch. They gather data from the machine on which they are installed and then send that data to Logstash, which we use to parse or transform the data before sending it to Elasticsearch, or we can send the Beats data directly into Elasticsearch.

They are quite handy, as it takes almost no time to install and configure Beats and start sending data from the server on which they're installed. They're written to target specific requirements and work really well to solve specific use cases. Filebeat, for example, works with files such as Apache log files: it keeps a watch on the files, and as soon as an update happens, the updated data is shipped to Logstash or Elasticsearch. This file operation can also be configured using Logstash, but that may require some tuning; Filebeat is very easy to configure in comparison to Logstash.

Another advantage is that Beats have a small footprint and sit on the servers from which we want monitoring data to be sent. This keeps the system quite simple, because data collection happens on the remote machine and the data is then sent to a centralized Elasticsearch cluster directly, or via Logstash. One more feature that makes Beats an important component of the Elastic Stack is the built-in dashboards, which can be set up in no time: a simple configuration option in Beats creates a monitoring dashboard in Kibana, which we can use directly or adapt with minor changes. There are different types of Beats, which we'll discuss here.

Filebeat

Filebeat is a lightweight data shipper that forwards log data from different servers to a central place, where we can analyze that log data. Filebeat monitors the log files that we specify, collects the data from them incrementally, and then forwards it to Logstash, or directly into Elasticsearch for indexing.

Once configured, Filebeat starts its inputs as per the given instructions. For each file, Filebeat starts a harvester that reads the log and picks up the incremental data. The harvester sends the log data to libbeat, which aggregates the events and sends the data to the configured output, such as Elasticsearch, Kafka, or Logstash.
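
A minimal filebeat.yml sketch is shown below; the log path is only an example, the output assumes a local Elasticsearch with default settings, and on Filebeat versions before 6.3 the input section is named filebeat.prospectors instead of filebeat.inputs:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/apache2/*.log

output.elasticsearch:
  hosts: ["localhost:9200"]

With this configuration, Filebeat tails every file matching the path pattern and ships new lines to Elasticsearch as they are written.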

Metricbeat

Metricbeat is another lightweight data shipper; it can be installed on any server to fetch system metrics. Metricbeat helps us to collect metrics from the systems and services running on the servers on which it's installed, and to monitor those servers. Metricbeat ships the collected metrics data to Elasticsearch or Logstash for analysis. Metricbeat can monitor many different services, as follows:

  • MySQL
  • PostgreSQL
  • Apache
  • Nginx
  • Redis
  • HAProxy

I've listed only some of the services; Metricbeat supports many more than that.
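
The following metricbeat.yml sketch collects basic system metrics and ships them to a local Elasticsearch; the metricsets, period, and host are illustrative assumptions, not required values:

metricbeat.modules:
- module: system
  metricsets: ["cpu", "memory", "filesystem"]
  period: 10s

output.elasticsearch:
  hosts: ["localhost:9200"]

Service-specific modules such as mysql, nginx, or redis can be enabled in the same way by adding further entries under metricbeat.modules.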

Packetbeat

Packetbeat is used to analyze network packets in real time. Packetbeat data can be pushed to Elasticsearch, which we can use to configure Kibana for real-time application monitoring. Packetbeat is very effective in diagnosing network-related issues, since it captures the network traffic between our application servers and it decodes the application-layer protocols, such as HTTP, Redis, and MySQL. Also, it correlates the request and response, and captures important fields.

Packetbeat supports the following protocols:

  • HTTP
  • MySQL
  • PostgreSQL
  • Redis
  • MongoDB
  • Memcache
  • TLS
  • DNS

Using Packetbeat, we can send our network packet data into Elasticsearch directly or through Logstash. Packetbeat is a handy tool, since monitoring network packets is otherwise difficult: we just install and configure it on the server where we want to monitor the network packets, and the packet data starts flowing into Elasticsearch, which we can use to create a packet data monitoring dashboard. Packetbeat also provides a custom dashboard that we can easily enable using the Packetbeat configuration file.
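
As a sketch of how protocol capture is configured (the device, protocols, and ports shown are assumptions for illustration), a minimal packetbeat.yml might look like this:

packetbeat.interfaces.device: any

packetbeat.protocols:
- type: http
  ports: [80, 8080]
- type: mysql
  ports: [3306]

output.elasticsearch:
  hosts: ["localhost:9200"]

Packetbeat sniffs the listed ports on the given network interface, decodes the protocol conversations, and indexes the correlated request/response events into Elasticsearch.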

Auditbeat

Auditbeat can be installed and configured on any server to audit the activities of users and processes. It's a lightweight data shipper that sends the data directly to Elasticsearch or via Logstash. Sometimes it's difficult to track changes in binaries or configuration files; Auditbeat is helpful here because it detects changes to critical files, such as configuration files and binaries.

We can configure Auditbeat to fetch audit events from the Linux audit framework. The Linux audit framework is an auditing system that collects the information of different events on the system. Auditbeat can help us to take that data and push it to Elasticsearch from where Kibana can be utilized to create dashboards.
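
The following auditbeat.yml sketch enables the file_integrity module for a couple of commonly watched directories; the paths and the local output are assumptions for illustration:

auditbeat.modules:
- module: file_integrity
  paths:
    - /etc
    - /usr/bin

output.elasticsearch:
  hosts: ["localhost:9200"]

Any creation, change, or deletion of a file under the listed paths is recorded as an event in Elasticsearch, where Kibana can be used to build an audit dashboard.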

Winlogbeat

Winlogbeat is a data shipper that ships the Windows event logs to Logstash or the Elasticsearch cluster. It keeps a watch and reads from different Windows event logs and sends them to Logstash or Elasticsearch in a timely manner. Winlogbeat can send different types of events:

  • Hardware Events
  • Security Events
  • System Events
  • Application Events

Winlogbeat sends structured data to Logstash or Elasticsearch after reading raw event data to make it easy for filtering and aggregating the data.
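
A minimal winlogbeat.yml sketch reads a few standard Windows event logs and ships them to a local Elasticsearch; the channel names and the output host are illustrative assumptions:

winlogbeat.event_logs:
  - name: Application
  - name: System
  - name: Security

output.elasticsearch:
  hosts: ["localhost:9200"]

Each Windows event is converted into a structured document, so fields such as the event ID or source name can be filtered and aggregated in Kibana.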

Heartbeat

Heartbeat is a lightweight shipper that monitors server uptime. It can be installed on a remote server; after that, it periodically checks the status of different services and tells us whether they're available. The major difference between Metricbeat and Heartbeat is that Metricbeat tells us whether a server is up or down, while Heartbeat tells us whether services are reachable; it's quite similar to the ping command, which tells us whether a server is responding.
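
A heartbeat.yml sketch for an HTTP uptime check is shown below; the monitored URL, the schedule, and the output host are assumptions for illustration (in newer Heartbeat versions the urls setting is called hosts):

heartbeat.monitors:
- type: http
  urls: ["http://localhost:9200"]
  schedule: '@every 10s'

output.elasticsearch:
  hosts: ["localhost:9200"]

Every ten seconds Heartbeat requests the URL and records whether the service responded, which Kibana can then plot as an uptime history.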

Use cases of Elastic Stack

There are many areas where we can use the Elastic Stack. Log management, search with Elasticsearch, and dashboards for monitoring are the ones we use most often, but these are just a few of its use cases; there are many other areas where we can use the power of the Elastic Stack. We can use the Elastic Stack for the following use cases:

  • System Performance Monitoring
  • Log Management
  • Application Performance Monitoring
  • Application Data Analysis
  • Security Monitoring and Alerting
  • Data Visualization

Let's discuss each of these in detail.

System Performance Monitoring

When we run any application in production, we need to keep it stable by avoiding anything that can impact the application's performance; this can be anything, such as the system, the database, or any third-party dependencies, since if any of them fails it impacts the application. In this section, we'll see how system monitoring can help us to avoid situations where the system causes an application outage.

Let's discuss the factors that can hamper an application's performance. There can be a number of reasons, such as the system memory or CPU creating a bottleneck because of an increase in user hits. In this situation, we can do multiple things, such as optimizing the application where possible and increasing the memory or CPU. We can do this to mitigate an application outage, but it's only possible if we're monitoring the system metrics of the servers on which the application is deployed. With monitoring in place, we can configure an alert that fires whenever any component crosses its threshold value. In this way, you can protect yourself from an application outage caused by system performance.

Log Management

Log management is one of the key use cases of the Elastic Stack, and it has been primarily used for this purpose for many years. There are many benefits of log management using the Elastic Stack, and I'll explain how it simplifies things when it comes to monitoring logs. Let's say you have a log file and you need to explore it to find the root cause of an application outage: how are you going to proceed? Where will you open the file, and what are you going to search for and filter? With the Elastic Stack, we just need to push the log data into Elasticsearch and configure Kibana to read it. We can use Filebeat to read log files, such as Apache access and error logs. Apart from system logs, we can also configure Filebeat to capture application logs. Instead of Filebeat, we can use Logstash to take file data as input and output it to Elasticsearch.

Application Performance Monitoring

Elastic Stack APM monitors applications and helps developers and system administrators monitor software applications for performance and availability. It also helps them to identify any current issues, or ones that may occur in the near future. Using APM, we can find and fix any bug in the code, as it makes the problems in the code searchable. By integrating APM with our code, we can monitor our code and make it better and more efficient. Elastic APM provides us with custom preconfigured dashboards in Kibana. We can integrate application data using APM and server stats, network details, and log details using Beats. This makes it easy to monitor everything in a single place.

We can apply machine learning to APM data by using the APM UI to find any abnormal behavior in the data. Alerts can also be applied to get an email notification if anything goes wrong in the code. Currently, Elastic APM supports Node.js, Python, Java, Ruby, Go, and JavaScript. It's easy to configure APM with your application and it requires only a few lines of code.

Security, Monitoring, and Alerting with Elastic Stack

With X-Pack, we can enable security, alerting, and monitoring with our Elastic setup. These features are very important and we need them to protect our Elastic Stack from external access and any possible issues. Now let's discuss each of them in detail.

Security

Security is a very important feature of X-Pack; without it, anyone can open the URL and access everything in Kibana, including index patterns, data, visualizations, and dashboards. During X-Pack installation and setup, we create the default user credentials. For security, we have role management and user management, using which we can restrict user access and secure the Elastic Stack.

Monitoring

Monitoring provides us with insight into Elasticsearch, Logstash, and Kibana. Monitoring comes with X-Pack, which we can install after the basic Elastic Stack setup. Monitoring-related data is stored in Elasticsearch, and we can view it from Kibana. We have built-in status warnings in Kibana, and custom alerts can be configured on the data in the indices used for monitoring.

Alerting

Elastic Stack uses alerting to keep an eye on any activity, such as whether CPU usage increases, memory consumption goes beyond some threshold, the response time of an application goes up, or 503 errors are increasing. By creating alerts, we can proactively monitor the system or application behavior and can apply a check before anything actually goes wrong.

Using alerts, we can notify every stakeholder without missing anything. We can apply alerts to detect specific issues, such as a user logging in from a different location, credit card numbers appearing in application logs, or a spike in the Elasticsearch indexing rate. These are just some examples; we can apply alerts in many more cases.

There are different ways to notify users, as there are lots of built-in integrations available for email, Slack, and so on. Apart from these built-in options, we can integrate alerts with any existing system using the webhook output provided by the Elastic Stack. Alerts also have simple template support, which we can use to customize the notification. I'll cover how to configure alerts in the coming chapters.

Data Visualization

Data visualization is the main feature of Kibana, and it's the best way to get information from raw data. As we know, a picture is worth a thousand words, so we can easily learn about a data trend just by seeing a simple chart. Kibana is popular because it can create dashboards for KPIs using data from different sources; we can even use Beats to get ready-made dashboards. We have different types of visualizations in Kibana, such as basic charts, data, time series, and maps, which we'll cover in the coming chapters. If we have data in Elasticsearch, we can create visualizations by creating index patterns in Kibana for those Elasticsearch indices.

Installing Elastic Stack

Elastic Stack consists of different components, such as Elasticsearch, Logstash, Kibana, and different Beats. We need to install each component individually, so let's start with Elasticsearch.

The installation steps might change, depending on the release of version 7. The updated steps will be available at the following link once the version is released.
https://www.packtpub.com/sites/default/files/downloads/InstallationofElasticStack7.pdf

Elasticsearch

To install Elasticsearch 6, we need at least Java 8. Please ensure that Java 8 or later is installed on your system before proceeding. Once Java is installed, we can go ahead and install Elasticsearch. You can find the binaries at www.elastic.co/downloads.

Installation using the tar file

Follow the steps to install using the tar file:

  1. First, we need to download the latest Elasticsearch tar, as shown in the following code block:
curl -L -O https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.x.tar.gz
  2. Then, extract it using the following command:
tar -xvf elasticsearch-6.x.tar.gz
  3. After extracting it, we have a bunch of files and folders. Move to the bin directory by executing the following command:
cd elasticsearch-6.x/bin
  4. After moving to the bin directory, run Elasticsearch using the following command:
./elasticsearch
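
Assuming the default settings, Elasticsearch listens on port 9200; once it has started, you can verify that it's running with a quick request:

curl http://localhost:9200

A JSON response containing the node name, cluster name, and version confirms that Elasticsearch is up.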

Installation using Homebrew

Using Homebrew, we can install Elasticsearch on macOS, as follows:

brew install elasticsearch

Installation using MSI Windows installer

For Windows, we have the MSI Installer package, which includes a graphical user interface (GUI) that we can use to complete the installation process. We can download the Elasticsearch 6.x MSI from the Elasticsearch download section at https://www.elastic.co/downloads/elasticsearch.

We can launch the GUI-based installer by double-clicking on the downloaded executable file. On the first screen, select the deployment directories and install the software by following the installation screens.

Installation using the Debian package

Follow the steps to install using the Debian package:

  1. First, install the apt-transport-https package using the following command:
sudo apt-get install apt-transport-https
  2. Save the repository definition in /etc/apt/sources.list.d/elastic-6.x.list:
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list
  3. To install the Elasticsearch Debian package, run the following command:
sudo apt-get update && sudo apt-get install elasticsearch

Installation with the RPM package

  1. Download and then install the public signing key:
rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
  2. Create a file called elasticsearch.repo for RedHat-based distributions under the /etc/yum.repos.d/ directory. For OpenSuSE-based distributions, we have to create the file under the /etc/zypp/repos.d/ directory. We need to add the following entry in the file:
[elasticsearch-6.x]
name=Elasticsearch repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

After adding the preceding code, we can install Elasticsearch on the following environments.

  • We can run the yum command on CentOS and older versions of RedHat-based distributions:
sudo yum install elasticsearch
  • On Fedora and other newer versions of RedHat distributions, use the dnf command:
sudo dnf install elasticsearch
  • The zypper command can be used on OpenSUSE-based distributions:
sudo zypper install elasticsearch
  • The Elasticsearch service can be started or stopped using the following command:
sudo -i service elasticsearch start
sudo -i service elasticsearch stop

Logstash

We have different ways to install Logstash based on the operating system. Let's see how we can install Logstash on different environments.

Using APT Package Repositories

Follow the steps to install using APT Package Repositories:

  1. Download and install the public signing key using the following command:
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
  2. On Debian, we have to install the apt-transport-https package:
sudo apt-get install apt-transport-https
  3. Save the following repository definition in /etc/apt/sources.list.d/elastic-6.x.list:
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list
  4. Run the sudo apt-get update command to refresh the repository; once it's updated, we can install Logstash by executing the following command:
sudo apt-get update && sudo apt-get install logstash

Using YUM Package Repositories

Follow the steps to install using YUM Package Repositories:

  1. Download the public signing key and then install it using the following expression:
rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
  2. Under the /etc/yum.repos.d/ directory, add the following expression in a file with a .repo suffix, for example, logstash.repo:
[logstash-6.x]
name=Elastic repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
  3. The repository is ready after we add the preceding code. Using the following command, we can install Logstash:
sudo yum install logstash

Kibana

From version 6.0.0 onward, Kibana only supports 64-bit operating systems, so we need to ensure we have a 64-bit operating system before installing Kibana.

Installing Kibana with .tar.gz

Follow the steps to install Kibana with .tar.gz:

  1. Using the following expression, we can download and extract the Linux archive for Kibana v6.x:
wget https://artifacts.elastic.co/downloads/kibana/kibana-6.x-linux-x86_64.tar.gz
tar -xzf kibana-6.x-linux-x86_64.tar.gz
  2. Change the directory and move to $KIBANA_HOME by running the following command:
cd kibana-6.x-linux-x86_64/
  3. We can start Kibana using the following command:
./bin/kibana
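
By default, Kibana listens on port 5601 and connects to an Elasticsearch instance at http://localhost:9200, so with a default local setup you can open http://localhost:5601 in a browser, or confirm from the command line that the port is answering:

curl http://localhost:5601

If Elasticsearch is running elsewhere, the Elasticsearch URL can be changed in Kibana's kibana.yml configuration file.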

Installing Kibana using the Debian package

Follow the steps to install Kibana using the Debian package:

  1. For the Debian package, download and install the public signing key using the following command:
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
  2. Install the apt-transport-https package using the following expression:
sudo apt-get install apt-transport-https
  3. We need to add the following repository definition in /etc/apt/sources.list.d/elastic-6.x.list:
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list
  4. Install the Kibana Debian package by running the following command:
sudo apt-get update && sudo apt-get install kibana

Installing Kibana using RPM

Follow the steps to install Kibana using RPM:

  1. Download and install the public signing key for the RPM package:
rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
  2. Create a file called kibana.repo under the /etc/yum.repos.d/ directory for RedHat-based distributions. For OpenSuSE-based distributions, we need to create the file under the /etc/zypp/repos.d/ directory and then add the following expression:
[kibana-6.x]
name=Kibana repository for 6.x packages
baseurl=https://artifacts.elastic.co/packages/6.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md

After adding the preceding expression in our file, we can install Kibana using the following commands:

  • Using yum on CentOS and older RedHat-based distributions, we need to run the following command:
sudo yum install kibana
  • We can use the dnf command on Fedora and newer RedHat distributions:
sudo dnf install kibana

Using zypper on OpenSUSE-based distributions

We can use zypper to install Kibana on OpenSUSE-based distributions using the following command:

sudo zypper install kibana

Installing Kibana on Windows

Follow the steps to install Kibana on Windows:

  1. From the Elastic download section (https://www.elastic.co/downloads/kibana), download the .zip Windows archive for Kibana v6.x.
  2. Create a folder called kibana-6.x-windows-x86_64 by unzipping the archive; we refer to this folder path as $KIBANA_HOME. Now move to the $KIBANA_HOME directory by using the following expression:
cd c:\kibana-6.x-windows-x86_64
  3. To start Kibana, we need to run the following command:
.\bin\kibana

Beats

Beats are separately installable products; they are lightweight data shippers. There are many Beats available, as follows:

  • Packetbeat
  • Metricbeat
  • Filebeat
  • Winlogbeat
  • Heartbeat

Packetbeat

There are many ways to download and install Packetbeat, depending on your operating system. Let's see different commands for different types of OSes:

  • To install Packetbeat on deb, use the following commands:
sudo apt-get install libpcap0.8
curl -L -O https://artifacts.elastic.co/downloads/beats/packetbeat/packetbeat-6.2.1-amd64.deb
sudo dpkg -i packetbeat-6.2.1-amd64.deb
  • To install Packetbeat on rpm, use the following commands:
sudo yum install libpcap
curl -L -O https://artifacts.elastic.co/downloads/beats/packetbeat/packetbeat-6.2.1-x86_64.rpm
sudo rpm -vi packetbeat-6.2.1-x86_64.rpm
  • To install Packetbeat on macOS, use the following commands:
curl -L -O https://artifacts.elastic.co/downloads/beats/packetbeat/packetbeat-6.2.1-darwin-x86_64.tar.gz
tar xzvf packetbeat-6.2.1-darwin-x86_64.tar.gz
  • To install Packetbeat on the Windows environment, perform the following steps:
  1. Get the Packetbeat Windows zip file from the Elastic downloads section.
  2. Extract the zip file to C:\Program Files.
  3. Rename the extracted folder to Packetbeat.
  4. Run the PowerShell prompt as an Administrator.
  5. To install Packetbeat as a Windows service, run the following command:
PS > cd 'C:\Program Files\Packetbeat'
PS C:\Program Files\Packetbeat> .\install-service-packetbeat.ps1

Metricbeat

There are different ways to install Metricbeat on your operating system. Using the following expressions, we can install Metricbeat on different OSes:

  • To install Metricbeat on deb, use the following commands:
curl -L -O https://artifacts.elastic.co/downloads/beats/metricbeat/metricbeat-6.x-amd64.deb
sudo dpkg -i metricbeat-6.x-amd64.deb
  • To install Metricbeat on rpm, use the following commands:
curl -L -O https://artifacts.elastic.co/downloads/beats/metricbeat/metricbeat-6.x-x86_64.rpm
sudo rpm -vi metricbeat-6.x-x86_64.rpm
  • To install Metricbeat on macOS, use the following commands:
curl -L -O https://artifacts.elastic.co/downloads/beats/metricbeat/metricbeat-6.x-darwin-x86_64.tar.gz
tar xzvf metricbeat-6.x-darwin-x86_64.tar.gz
  • To install Metricbeat on Windows, perform the following steps:
  1. Download the Metricbeat Windows zip from the Elastic download section.
  2. Extract the file to the C:\Program Files directory.
  3. Rename the long metricbeat directory name to Metricbeat.
  4. Run the PowerShell prompt as an Administrator.

If you're running Windows XP, you may need to download and install PowerShell.

  5. To install Metricbeat as a Windows service, run the following commands from the PowerShell prompt:
PS > cd 'C:\Program Files\Metricbeat'
PS C:\Program Files\Metricbeat> .\install-service-metricbeat.ps1

Filebeat

We can download and install Filebeat on different operating systems in the following ways:

  • To install Filebeat on deb, use the following commands:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.x-amd64.deb
sudo dpkg -i filebeat-6.x-amd64.deb
  • To install Filebeat on rpm, use the following commands:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.x-x86_64.rpm
sudo rpm -vi filebeat-6.x-x86_64.rpm
  • To install Filebeat on macOS, use the following commands:
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.x-darwin-x86_64.tar.gz
tar xzvf filebeat-6.x-darwin-x86_64.tar.gz
  • To install Filebeat on Windows perform the following steps:
  1. From the Elastic downloads section, download the Filebeat Windows zip file.
  2. Extract the zip file into C:\Program Files.
  3. Rename the long filebeat directory to Filebeat.
  4. Open a PowerShell prompt as an administrator.
  5. From the PowerShell prompt, execute the following commands:
PS > cd 'C:\Program Files\Filebeat'
PS C:\Program Files\Filebeat> .\install-service-filebeat.ps1

Summary

In this chapter, we introduced you to Elastic Stack, where we discussed the different components of Elastic Stack, such as Elasticsearch, Logstash, Kibana, and different Beats. Then we looked at different use cases of Elastic Stack, such as System Performance Monitoring, where we monitor the system's performance, Log Management, where we collect different logs and monitor them from a central place, and Application Performance Monitoring, where we monitor our application by connecting it to a central APM server. We also covered Application Data Analysis, where we analyze the application's data, Security Monitoring and Alerting, where we secure our stack using X-Pack, monitor it regularly, and configure alerts to keep an eye on changes that can impact the system's performance, and Data Visualization, where we use Kibana to create different types of visualizations using the available data.

In the next chapter, we'll cover different methods of pushing data into Kibana, such as from RDBMS, files, system metrics, CSV, and applications. We'll start with different Beats to demonstrate the complete process of configuring these Beats and sending data directly to Elasticsearch or via Logstash to Elasticsearch. Then, we'll look at how to import data from CSV by configuring Logstash to take input and insert data into Elasticsearch. After CSV, we'll fetch data from RDBMS using SQL queries through the JDBC plugin and insert it into Elasticsearch. We'll use the preceding methods to insert data into Elasticsearch, and then we'll configure Kibana to fetch the data by creating an index pattern. In this way, we can fetch any type of data into Kibana and can then perform different operations on that data.

Key benefits

  • Your hands-on guide to visualizing Elasticsearch data as well as navigating the Elastic Stack
  • Work with different Kibana plugins and create effective machine learning jobs using Kibana
  • Build effective dashboards and reports without any hassle

Description

The Elastic Stack is growing rapidly and, day by day, additional tools are being added to make it more effective. This book endeavors to explain all the important aspects of Kibana, which is essential for utilizing its full potential. This book covers the core concepts of Kibana, with chapters set out in a coherent manner so that readers can advance their learning in a step-by-step manner. The focus is on a practical approach, thereby enabling the reader to apply those examples in real time for a better understanding of the concepts and to provide them with the correct skills in relation to the tool. With its succinct explanations, it is quite easy for a reader to use this book as a reference guide for learning basic to advanced implementations of Kibana. The practical examples, such as the creation of Kibana dashboards from CSV data, application RDBMS data, system metrics data, log file data, APM agents, and search results, can provide readers with a number of different drop-off points from where they can fetch any type of data into Kibana for the purpose of analysis or dashboarding.

Who is this book for?

Kibana 7 Quick Start Guide is for developers new to Kibana who want to learn the fundamentals of using the tool for visualization, as well as existing Elastic developers.

What you will learn

  • Explore how Logstash is configured to fetch CSV data
  • Understand how to create index patterns in Kibana
  • Become familiar with how to apply filters on data
  • Discover how to create ML jobs
  • Explore how to analyze APM data from APM agents
  • Get to grips with how to save, share, inspect, and edit visualizations
  • Understand how to find an anomaly in data

Product Details

Publication date: Jan 31, 2019
Length: 172 pages
Edition: 1st
Language: English
ISBN-13: 9781789804034


Table of Contents

  1. Introducing Kibana
  2. Getting Data into Kibana
  3. Exploring Data
  4. Visualizing Data
  5. X-Pack with Machine Learning
  6. Monitoring Applications with APM
  7. Kibana Advanced Tools
  8. Other Books You May Enjoy