Fluent Bit

Fluent Bit is a lightweight, scalable log and metrics processor and forwarder. Fluent Bit can be configured to send logs to Parseable using its HTTP output plugin with JSON output format.

This document explains how to set up Fluent Bit to ship logs to Parseable on Docker Compose and Kubernetes. This should give you an idea of how to configure the output plugin for other scenarios.

For demo purposes, we use Fluent Bit's Memory Metrics input plugin as the source of logs.

Docker Compose

Please ensure Docker Compose is installed on your machine. Then run the following commands to set up Parseable and Fluent Bit:

mkdir parseable
cd parseable
wget https://www.parseable.com/fluentbit/fluent-bit.conf
wget https://www.parseable.com/fluentbit/docker-compose.yaml
docker-compose up -d

You can now access the Parseable dashboard on http://localhost:8000. You should see a log stream called fluentbitdemo populated with log data generated by the Memory Metrics Input plugin.

Kubernetes

Install Fluent Bit

We use the official Fluent Bit Helm chart, but with a modified values.yaml file that contains the configuration for sending logs to Parseable.

wget https://www.parseable.com/fluentbit/values.yaml
helm repo add fluent https://fluent.github.io/helm-charts
helm install fluent-bit fluent/fluent-bit --values values.yaml -n fluentbit --create-namespace

Let's take a deeper look at the Fluent Bit configuration in values.yaml. Here we use the kubernetes filter to enrich the logs with Kubernetes metadata, then the http output plugin to send them to Parseable. Notice the Match entry in the http output plugin: kube.* matches all logs tagged by the Kubernetes filter. The header X-P-Stream fluentbitdemo tells Parseable to ingest the logs into the fluentbitdemo stream.

  filters: |
    [FILTER]
        Name                 kubernetes
        Match                kube.*
        Merge_Log            On
        Keep_Log             Off
        K8S-Logging.Parser   On
        K8S-Logging.Exclude  On

  outputs: |
    [OUTPUT]
        Name                 http
        Match                kube.*
        host                 parseable.parseable.svc.cluster.local
        uri                  /api/v1/ingest
        port                 80
        http_User            admin
        http_Passwd          admin
        format               json
        compress             gzip
        header               Content-Type application/json
        header               X-P-Stream fluentbitdemo
        json_date_key        timestamp
        json_date_format     iso8601

Check logs in Parseable

Port forward Parseable service to access the dashboard with:

kubectl port-forward svc/parseable 8000:80 -n parseable

You can now check the fluentbitdemo stream on the Parseable server to see the logs from this setup.

Batching and Compression

Parseable supports batched, compressed log data sent via HTTP POST. Fluent Bit supports this via the compress and buffer_max_size options. We recommend enabling both to reduce the number of HTTP requests and the size of each HTTP payload.
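
To give a rough sense of what compression buys, this Python sketch gzips a JSON batch the way the compress gzip option compresses the payload before the POST (the events below are made up; repetitive log data compresses well):

```python
import gzip
import json

# Hedged illustration: gzip a JSON array of events, as the
# `compress gzip` output option does for each batched payload.
batch = [
    {"timestamp": "2024-01-01T00:00:%02dZ" % i, "Mem.total": 32768}
    for i in range(60)
]
raw = json.dumps(batch).encode()
compressed = gzip.compress(raw)
print(len(raw), len(compressed))  # compressed payload is much smaller
```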

Adding custom columns

In some cases you may want to add additional metadata to a log event. For example, you may want to append the hostname to each log event, so filtering becomes easier at debugging time. This is done using Lua scripts. Here is an example:

-- Use a Lua function to create some additional entries in the log record
function append_columns(tag, timestamp, record)
    local new_record = record
    -- Add a static new field to the record
    new_record["environment"] = "production"
    -- Add a dynamic field to the record
    -- We get the env variable HOSTNAME from the Docker container
    -- Then we add it to the record
    local hostname = os.getenv("HOSTNAME")
    new_record["hostname"] = hostname
    -- Return the new record
    -- "1" means the record was modified
    -- "timestamp" is the (unchanged) timestamp
    -- "new_record" is the record after modification
    return 1, timestamp, new_record
end

Lua scripts are added to Fluent Bit as filters. To add this script as a filter, save the above script as a filters.lua file. Place the filters.lua file in the same directory as the rest of the Fluent Bit configuration files. Then add a [FILTER] section to the Fluent Bit config. For example:

[FILTER]
    Name                 lua
    Match                *
    Script               filters.lua
    Call                 append_columns

[OUTPUT]
    Name                 http
    Match                *
    host                 parseable
    uri                  /api/v1/ingest
    port                 8000
    http_User            admin
    http_Passwd          admin
    format               json
    compress             gzip
    header               Content-Type application/json
    header               X-P-Stream fluentbitdemo
    json_date_key        timestamp
    json_date_format     iso8601

Note that an [INPUT] section still needs to be added for this configuration to produce any logs.
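
As a sketch, an [INPUT] section using the Memory Metrics input from the demo above might look like the following (the Tag value is an assumption and must match the Match patterns in your filters and outputs):

```
[INPUT]
    Name    mem
    Tag     mem.metrics
```

With Match * in the filter and output sections above, events tagged mem.metrics are picked up automatically; adjust the tag if your Match patterns are narrower.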
