8.6.3. Export data to a Logstash ETL via the syslog protocol

8.6.3.1. Introduction

This procedure describes how to configure the connection between the GCenter and a Logstash ETL.
A pipeline developed by Gatewatcher retrieves the JSON content of the exported logs so that it can then be manipulated as needed with the Logstash filters.
Configuring the connection between the GCenter and the Logstash ETL requires the following steps:
  • On the GCenter, configure the data export
  • On the Logstash server, configure the pipeline that receives the flow (see the procedure to be performed on the server below)

Note

See the presentation of Syslog servers.
See the presentation of the exported data described in Data use.
The graphical interface of the data export function is described in the `Admin - GCenter - Data exports` screen of the legacy web UI.

8.6.3.2. Prerequisites

  • User: member of the Administrator group


8.6.3.3. Preliminary operations


8.6.3.4. Procedure to access the `Data exports` window for an administrator account

  • In the navigation bar, click successively on:

  • The `Admin` button

  • The `Gcenter` submenu

  • The `Data exports` command
    The `Data exports` window is displayed.

8.6.3.5. Procedure to setup the general parameters

../../_images/DATA_EXPORT-01.PNG
  • Click the `Configure` button (5) on one of the two connections (6 or 7) to be configured.
    The `Syslog data export` window opens.
../../_images/DATA_EXPORT-02.PNG
  • Click on `GENERAL` tab (1).

Note

Values in the $VALUE format are context-specific; they are written this way so that they can be referenced later in this documentation.

  • Enter parameters using the following table:

| Item | Parameter | Description | Value |
|------|-----------|-------------|-------|
| 15 | Enable | Activates this export pipeline | Activated |
| 14 | Name | Name of the syslog export | $SYSLOG_NAME |
| 13 | Hostname | Logstash server DNS name or IP address | $LOGSTASH_IP |
| 7 | Port | Destination port | $LOGSTASH_PORT |
| 12 | Codecs | Codec used for the export | JSON |
| 6 | RFC | Standard used by the codec | 3164 |
| 11 | Facility | Value of `facility` in the syslog header | kernel (default); the header is removed by the reception pipeline |
| 8 | Severity | Value of `severity` in the syslog header | emergency (default); the header is removed by the reception pipeline |
| 5 | Protocol | Transport protocol used: TCP or UDP | $PROTOCOL |
| 10 | Output interface | GCenter interface used for the syslog export | $GCENTER_IFACE |

  • Validate using the `Save` button (9).
    The following message indicates that the update has been completed: `Updated with success`.
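
With these settings (RFC 3164 framing, JSON codec, `kernel` facility and `emergency` severity, i.e. a priority value of `<0>` since PRI = facility × 8 + severity = 0 × 8 + 0), each exported event should reach Logstash as a single syslog line whose message part is the JSON document. The line below is purely illustrative (the hostname and the payload are hypothetical); the syslog header itself is discarded by the reception pipeline described further down:

<0>Jun 14 12:34:56 gcenter-example gatewatcher[-]: {"type":"alert", ...}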

8.6.3.6. Procedure to set up the filtering parameters

../../_images/DATA_EXPORT-02.PNG
  • Click on the `FILTERS` tab (2).

../../_images/DATA_EXPORT-03.PNG
  • Enter parameters using the following table:

| Item | Parameter | Description |
|------|-----------|-------------|
| 16 | `Message type` | Defines the type of events sent to the remote server: either alerts only, or alerts and metadata (example: alerts, all) |
| 17 | `Ip addresses` | Filters by IP addresses or networks. If the field is empty, all data is sent to the remote server |
| 18 | `Gcaps` | Filters by GCap. If nothing is selected, the data of all GCaps paired with the GCenter is sent to the remote server (example: GCap1, GCap2) |
| 19 | `Additional fields` | Adds additional fields to the exported events. A name (`Name`) and a value (`Values`) can be entered in this window. This field is not supported when the idmef codec is used |
| 20 | `Protocols` | Selects the protocols to export (example: dcerpc, dhcp, dnp3, dns, enip, ftp, http, http2, ikev2, krb5, mqtt, modbus, netflow, nfs, ntp, rdp, rfb, sip, smb, smtp, ssh, tftp and tls) |
| 21 | `Save` | Changes are only taken into account after pressing the `Save` button |

Note

`Select All` selects all the protocols listed: a protocol that is not selected is not exported.
If the GCap is more recent than the GCenter, some protocols may be missing from the list.
To export everything, disable this filter with `Deselect all`.
  • Validate using the `Save` button (21).
    The following message indicates that the update has been completed: `Updated with success`.
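
As a purely illustrative sketch of the `Additional fields` parameter, if `Name` were set to `datacenter` and `Values` to `paris` (both values are hypothetical), each exported JSON event would simply carry this extra key alongside its normal content:

{"type": "alert", ..., "datacenter": "paris"}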

8.6.3.7. Procedure to configure encryption settings

The "ENCRYPTION" tab enables the encryption of the flow generated by the GCenter.
Logstash’s "syslog" input is not compatible with data encryption.
This feature cannot be used.

8.6.3.8. Procedure to be performed on the server

  • Configure the pipeline that receives the flow from the GCenter.

8.6.3.8.1. Logstash pipeline

The input used is Syslog.
In order to be compatible with any Syslog header, a grok pattern is specified.
The JSON content of the log is in the syslog_message field.
input {
  syslog {
    # Listening port: must match the destination port configured on the GCenter ($LOGSTASH_PORT)
    port => $LOGSTASH_PORT
    type => syslog
    # Custom pattern accepting the syslog headers produced by the GCenter;
    # the JSON payload is captured in the syslog_message field
    grok_pattern => '^<%{NUMBER:syslog_priority}>(?:1 |)(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:syslog_timestamp}) %{SYSLOGHOST:syslog_hostname} (?:gatewatcher\[-\]:|gatewatcher - - \[-\]) %{GREEDYDATA:syslog_message}\n$'
  }
}
Only the syslog_message field is kept and then parsed as JSON.
The original field (syslog_message) and the Elasticsearch-specific field (@version) are then removed.

filter {
  # Keep only the field containing the JSON payload
  prune {
    whitelist_names => [ "syslog_message" ]
  }

  # Parse the JSON payload: its keys become fields of the event
  json {
    source => "syslog_message"
  }

  # Drop the raw syslog_message field and the @version field
  mutate {
    remove_field => [ "@version","syslog_message" ]
  }
}
Any output can then be used.
In this example, the logs are written directly to disk as files:

output {
  file {
    # One file per event type and per day
    path => '/usr/share/logstash/data/output/%{[type]}-%{+YYYY.MM.dd}.log'
    # One JSON document per line
    codec => json_lines
  }
}
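
The file output shown above is only one possibility. As a sketch of another common choice (the host URL and the index naming below are assumptions, not values required by this procedure), the parsed events could instead be sent to an Elasticsearch cluster with the elasticsearch output plugin:

output {
  elasticsearch {
    # Hypothetical Elasticsearch endpoint: adapt to the target environment
    hosts => ["https://elasticsearch.example.org:9200"]
    # Hypothetical index naming scheme reusing the event type and the date
    index => "gatewatcher-%{[type]}-%{+YYYY.MM.dd}"
  }
}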