2022-03-14

Combining Logstash and Axiom for Observability

Tola Ore-Aruwaji
@thecraftman_

To maintain the health, visibility, and performance of your applications, you need to monitor communication between all of your application's dependencies and events.

Logstash is a centralized data-processing pipeline that lets you read data from various sources, filter and transform it, and ship it directly to your configured outputs.

Logstash dynamically ingests, transforms, and ships your data regardless of format or complexity. Derive structure from unstructured data with grok, decipher geo coordinates from IP addresses, anonymize or exclude sensitive fields, and ease overall processing.

By combining Logstash with Axiom, you can ship and monitor your Logstash data. Our advanced query language lets you visualize all your data from any input. Your engineering team will be able to collect all logs, monitor and visualize Logstash applications with Axiom dashboards, correlate logs, and set alerts with Axiom monitors.

For deep insights into your application health, you can analyze the flow of traffic between your Logstash events using Axiom Data Explorer. Axiom maps all of your application dependencies so you know how much data your system is producing and where that data is coming from.

In this tutorial, I will show you how to ship logs from Logstash to Axiom for streaming and visualization through queries.

Prerequisites

By the end of this tutorial, you will be able to:

  • Analyze your Logstash logs directly on Axiom.
  • Gather live events on your Logstash data.
  • Customize dashboards and run aggregations.

Let’s get going 💡

  1. Create a dataset for your Logstash data by selecting Settings → Datasets in the Axiom UI.

  2. Generate your API token:
  • On the Axiom UI, click Settings, then select API Tokens.
  • Select Add ingest token.
  • Enter a name and description, then select the permissions you want to give your token: Ingest and Query, Ingest, or Query.
  • Copy the generated token to your clipboard. If you navigate away from the page, you can view the token again by selecting API Tokens.

  3. Next, head over to our Logstash configuration page, then copy, edit, and configure your input, filter, and output stages:
input {
  exec {
    # Run the `date` command every second and emit its output as an event
    command => "date"
    interval => "1"
  }
}

output {
  opensearch {
    # Axiom's Elasticsearch-compatible ingest endpoint for your dataset
    hosts => "https://cloud.axiom.co:443/api/v1/datasets/$DATASET_NAME/elastic"
    user => "axiom"
    # The API token generated in step 2
    password => "xaat-xxxx-xxxxxx-xxxxx"
  }
}
  • In your configuration, https://cloud.axiom.co:443 is your Axiom Cloud account URL.
  • $DATASET_NAME is the name of the dataset we created in step 1.
  • password is the API token we created in step 2.
  4. When you are done configuring your output stage, run your configuration to ship your Logstash logs directly to Axiom. Visit the Datasets tab and you will see your Logstash logs ingested into your dataset.
  • In this example, I'm running my Logstash configuration using Docker,

  • and running it with docker-compose, using the Logstash image that bundles the OpenSearch output plugin (bring the stack up with docker-compose up):
services:
  logstash:
    image: opensearchproject/logstash-oss-with-opensearch-output-plugin:7.16.2
    volumes:
      # Mount the pipeline configuration from the previous step
      - ./logstash/pipeline/:/usr/share/logstash/pipeline/
    ports:
      - "9600:9600"
  5. Next, open your Logstash dataset in your Axiom Cloud console, and you can see your Logstash data directly in the dataset view.

  6. On your Logstash dataset, you can run queries using our operators and functions directly in Data Explorer.
  • With Axiom Data Explorer, you can explore and monitor your logs from Logstash, get insights into how your applications are performing over time, and run super-fast queries in seconds.

  • On the Axiom dashboard, select the fourth (4th) icon in the pane, which is Data Explorer, and click on your dataset. A tabular expression flows from one query operator to the next: it starts with your Logstash dataset, which then flows through a set of data operators and functions bound together with the pipe (|) delimiter.

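As a rough sketch (assuming the dataset from step 1 is named logstash and that _time is the built-in event timestamp field; substitute your own dataset name), a tabular expression in Data Explorer might look like this:

['logstash']
// keep only the exec events whose command contains "date"
| where command contains "date"
// return just the timestamp, host, and message columns
| project _time, host, message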

Explore different operators and functions with Axiom Data Explorer:

  1. Project operator: Use the project operator to select the fields to return and to embed new computed columns. The following query returns a specific set of fields as columns.
  • In the query below, we project the @version, command, host, and message fields from our Logstash data:

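A sketch of what that project query might look like in APL, assuming the dataset is named logstash and that field names with special characters such as @version are quoted as ['@version']:

['logstash']
// return only these four columns from the Logstash events
| project ['@version'], command, host, message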

  2. Where operator: Filters a dataset down to the subset of rows that meets a condition when executed.
  • The query below filters our Logstash data by host and command:

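A sketch of that filter in APL; the host and command values here are placeholders, so substitute values from your own events:

['logstash']
// keep only events from a given host whose command is "date"
| where host contains "my-logstash-host" and command == "date"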

  3. summarize operator: Produces a table that aggregates the content of the dataset.

The summarize operator groups together rows that have the same values in the by clause. In the query below, the aggregation functions (count and topk) combine each group into one row.

  • In the query below, there is a row for topk_message, topk_command, and topk_@version, and a column for each aggregation field in the operator.

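A sketch of such a summarize query in APL, assuming the dataset is named logstash; topk(field, 5) and the grouping by host are illustrative choices rather than a prescribed query:

['logstash']
// one output row per host, with count and top-5 aggregations as columns
| summarize count(), topk(message, 5), topk(command, 5), topk(['@version'], 5) by host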

Visualize queries with Aggregations:

  1. You can group and extract statistics and insights from your Logstash events by running aggregations across your Logstash dataset.
  • count(): Here, we run the count aggregation with a filter clause where command contains "date".

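An APL sketch of that count aggregation, assuming bin_auto(_time) is used to bucket the results over time for charting:

['logstash']
// filter first, then count events per automatically sized time bucket
| where command contains "date"
| summarize count() by bin_auto(_time)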

  • topk(): You can use the topk aggregation to get the “top 10” or “top 15” (where ‘10’ and ‘15’ are the ‘k’ in topk) values for fields in your Logstash dataset.

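A sketch of a topk aggregation in APL; here k is 10 and the command field is just an example of a field to rank:

['logstash']
// top 10 most frequent values of the command field
| summarize topk(command, 10)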

  • distinct(): You can specify the distinct($fieldName) function to chart the values in each field you select from your Logstash dataset. The table below the chart shows the total number of distinct values for the entire time period selected.

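A rough APL sketch of a distinct-style aggregation; this assumes the Kusto-style dcount() aggregation as the query-language counterpart of the distinct visualization, with host as an example field:

['logstash']
// number of distinct host values per automatically sized time bucket
| summarize dcount(host) by bin_auto(_time)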

  2. Using Axiom dashboards, you can group collections of queries that help you identify and diagnose common issues quickly when problems occur. To get started with dashboards, head over to Dashboards on Axiom.

  3. Select NEW DASHBOARD - you’ll see a dialog that asks for the name and description of your dashboard.
  • Enter the name and description you want to give your dashboard.

  4. After creating your dashboard, you will see a (+) button; click it to build your query. This lets you add your first charts to your dashboard.
  • You can select which kind of query you want to add. Currently we support the Simple Query Builder and the Advanced Query Language. We will start with the Advanced Query Language.

  5. Use the advanced query builder to query, manipulate, and process your data using APL (see the example query after this list).
  • Select your Logstash dataset.
  • The chart type can be Time Series or Statistic.
  • Your dashboard view can be Time Series, Results Table, or Time Series & Results Table.
  • When you are done configuring your dashboard, click on VALIDATE & SAVE.

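As an example of a query you might paste into an Advanced Query Language chart (a sketch only, assuming the dataset is named logstash), here is a time-series event count grouped by host:

['logstash']
// event count over time, one series per host
| summarize count() by bin_auto(_time), host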

  6. You can add more queries. For my second chart, I will create a chart using the Simple Query Builder.
  • In this chart, I’m visualizing the top 10 values of the @version field from our Logstash dataset. When you are done, click on SAVE.

  • You can combine and add many more charts to your dashboard using the Simple Query Builder or the Advanced Query Language.

  • When you are done, you can see your charts in a single view.

  • You can also adjust the size of your charts, duplicate them, open them in Data Explorer, or create a monitor from a chart by selecting the icon at the top right of each chart.

  7. To access the dashboard settings, select the icons at the top-right corner of the dashboard, which let you:
  • Share your dashboard views with everyone on the team.
  • Choose a specific time range, or create a custom time range, for a particular query or for all queries in an organized widget display.
  • Compare against a custom time.
  • View your dashboard in full screen.

Happy querying!


We have our Sandbox environment for you to play with different datasets, run queries, stream events and analyze your datasets. Check out Axiom Playground.

Help 'n' tips 'n' fun

Whew! If you can do all that, you're off to the races building your own queries and running aggregations.

You can see how fast and easy it is to run queries using Axiom Data Explorer, create visualizations using aggregations, and build dashboards on your Logstash data.

If you have specific questions or issues configuring the file, I'd love to hear about them. Contact us here or ask a question in the Axiom Community!

For further reading, check out the official documentation on the power of Axiom Processing Language and if this article helped you out, let us know on Twitter!

Join us in changing how developers think about data