Ingesting Data

Ingest, transport, and fetch data from different sources (relational databases, web logs, batch data, real-time and streaming data, application logs, and more) for later use via the Axiom API.

You can also collect, load, group and move data from one or more sources to Axiom where it can be stored and further analyzed.

Before ingesting data, you need an API token, which you can generate from the Settings -> Tokens page on the Axiom dashboard. See the API Tokens documentation for details.

Once you have an ingest token, there are three ways to get your data into Axiom:
  1. Using the Ingest API
  2. Using a Data Shipper (Logstash, Filebeat, Metricbeat, Fluentd, etc.)
  3. Using the Elasticsearch Bulk API that Axiom supports natively

Ingest API

Axiom exposes a simple REST API that accepts any of the following formats:

Ingest using JSON

  • application/json - a single event or a JSON array of events


curl -X 'POST' '$DATASET_NAME/ingest' \
  -H 'Authorization: Bearer $API_TOKEN' \
  -H 'Content-Type: application/json' \
  -d '[
        {
          "tags": {
            "server": "aws",
            "source": "wordpress"
          }
        }
      ]'

Ingest using NDJSON

  • application/x-ndjson - newline-delimited JSON; each line is a single structured event


curl -X 'POST' '$DATASET_NAME/ingest' \
  -H 'Authorization: Bearer $API_TOKEN' \
  -H 'Content-Type: application/x-ndjson' \
  -d '{"id":1,"name":"machala"}
  {"index": {"_index": "products"}}
  {"timestamp": "2016-06-06T12:00:00+02:00", "attributes": {"key1": "value1","key2": "value2"}}
  {"queryString": "count()"}'
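If you build the NDJSON payload programmatically rather than by hand, the format is simply one JSON object per line. A minimal sketch (the event contents mirror the curl example above and are illustrative only):

```python
import json

# Serialize a list of events as NDJSON: one JSON object per line,
# joined with newlines. The resulting string can be sent as the
# request body with Content-Type: application/x-ndjson.
events = [
    {"id": 1, "name": "machala"},
    {"timestamp": "2016-06-06T12:00:00+02:00",
     "attributes": {"key1": "value1", "key2": "value2"}},
]
ndjson_payload = "\n".join(json.dumps(event) for event in events)
```

Each line of the payload must be a complete, self-contained JSON object; pretty-printing with embedded newlines would break the format.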

Ingest using CSV

  • text/csv - this should include the header line with field names separated by commas


curl -X 'POST' '$DATASET_NAME/ingest' \
      -H 'Authorization: Bearer $API_TOKEN' \
      -H 'Content-Type: text/csv' \
      -d 'user,name
foo,bar'

If you would prefer to use a language binding, client libraries are available for several languages.


Ingest limits:

  • Maximum Event Batch Size: 1000
  • Maximum Event Fields: 250
  • Maximum Array Field Items: 100
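To stay within the batch-size limit, large event sets can be chunked client-side and sent as separate ingest requests. A minimal sketch, where `chunk_events` is a hypothetical helper and not part of any Axiom client library:

```python
# Hypothetical helper: split a list of events into batches of at most
# 1000 items (the Maximum Event Batch Size above), so each batch can be
# posted in its own ingest request.
def chunk_events(events, batch_size=1000):
    """Yield successive batches of at most batch_size events."""
    for start in range(0, len(events), batch_size):
        yield events[start:start + batch_size]

# 2500 events split into batches of 1000, 1000, and 500.
batches = list(chunk_events([{"id": i} for i in range(2500)]))
```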

Data Shippers

Configure, read, collect, and send logs to your Axiom deployment using a variety of data shippers. Data shippers are lightweight agents that collect logs and metrics and ship them directly into Axiom.

  • Ingest using Elastic Beats
  • Ingest using FluentD
  • Ingest using Kubernetes
  • Ingest using Logstash
  • Ingest using Loki Proxy
  • Ingest using Syslog Proxy
  • Ingest using Vercel
