13. Kafka Integration FAQ#

In some deployments you may want the data generated by the metric collector (ontp-wire) instances to be sent to another central location in addition to the metric collection database (ontp-tspdb).

In these cases we recommend using Kafka as a metric distribution plane. With Kafka you can push these messages to subscribers as well as to destinations such as Google Cloud Storage or AWS S3 buckets.

13.1. Exporting data to Kafka#

Note

You may send data either from the capture agent or from the message bus. This choice will depend on your specific deployment of components and capture architecture.

You will require a base Kafka configuration and Confluent's GCS sink connector plugin. At a minimum, the configuration would be as shown below.

#
name=gcs-sink
connector.class=io.confluent.connect.gcs.GcsSinkConnector
tasks.max=1
time.interval=HOURLY
#
gcs.bucket.name=kafka-connect-example
gcs.part.size=5242880
flush.size=3
#
gcs.credentials.path=/<your-path-to>/gcp-key.json
storage.class=io.confluent.connect.gcs.storage.GcsStorage
# Key converter
key.converter=org.apache.kafka.connect.storage.StringConverter
# JSON format for the network-metrics topic
topics=network-metrics
format.class=io.confluent.connect.gcs.format.json.JsonFormat
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
#
schema.compatibility=NONE
partitioner.class=io.confluent.connect.storage.partitioner.DefaultPartitioner

For more detailed information on setting up your Kafka connector to export to GCS, see https://docs.confluent.io/5.4.1/connect/kafka-connect-gcs/source/index.html.
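Before deploying, the connector properties above can be sanity checked with a short script. The following is an illustrative sketch, not an official validator: the required-key list and the inline sample are assumptions chosen to mirror the example configuration above.

```python
# Minimal sketch: parse a Java-style .properties file and check that the
# keys the GCS sink example above relies on are present. The REQUIRED_KEYS
# list is an illustrative assumption, not the connector's full schema.

def parse_properties(text: str) -> dict:
    """Parse key=value lines, skipping blank lines and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

REQUIRED_KEYS = {
    "name", "connector.class", "gcs.bucket.name",
    "gcs.credentials.path", "topics", "format.class",
}

def missing_keys(props: dict) -> set:
    """Return the required keys that are absent from the parsed config."""
    return REQUIRED_KEYS - props.keys()

if __name__ == "__main__":
    example = """
    name=gcs-sink
    connector.class=io.confluent.connect.gcs.GcsSinkConnector
    gcs.bucket.name=kafka-connect-example
    gcs.credentials.path=/path/to/gcp-key.json
    topics=network-metrics
    format.class=io.confluent.connect.gcs.format.json.JsonFormat
    """
    print(missing_keys(parse_properties(example)))  # -> set()
```

The same parser can be pointed at your real properties file before the connector is started, which catches missing keys earlier than a failed connector deployment would.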

13.2. ontp-wire Export to Kafka#

13.2.1. Add Kafka configuration to the Agent:#

  • Modify the configuration file of the agent to allow sending data to Kafka.

  • Add the following configuration section to the existing config.

{
  "kafka_config": {
    "topic_name": "NY1-Zone1",
    "ontp-koutput": "json",
    "bootstrap.servers": "192.168.1.20:9093",
    "message.timeout.ms": "800",
    "session.timeout.ms": "6000",
    "security.protocol": "SSL",
    "ssl.ca.location": "./secure-keys/tls/ca.pem",
    "ssl.key.location": "./secure-keys/tls/client-key.pem",
    "ssl.certificate.location": "./secure-keys/tls/client.pem",
    "enable.ssl.certificate.verification": "true"
  }
}
  • The above configuration allows the ONTP metric capture agent to send data to Kafka directly.

Note

ontp-koutput - you can specify json to send the data as JSON, or have it sent base64 encoded instead.
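The difference between the two ontp-koutput modes can be sketched as follows. The sample metric record is invented for illustration; the actual payload layout is defined by ontp-wire.

```python
import base64
import json

# Hypothetical metric record; real ontp-wire payloads will differ.
record = {"iface": "eth0", "rx_bytes": 10482, "tx_bytes": 2291}

# ontp-koutput = "json": the Kafka message value is the JSON text itself.
json_payload = json.dumps(record).encode("utf-8")

# base64 mode: the same bytes, base64 encoded before being sent.
b64_payload = base64.b64encode(json_payload)

# A downstream consumer of the base64 form decodes back to the original.
decoded = json.loads(base64.b64decode(b64_payload))
print(decoded == record)  # True
```

Base64 encoding is the safer choice when intermediate consumers cannot be trusted to handle raw JSON bytes, at the cost of roughly a third more payload size.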

13.3. ontp-mbus Export to Kafka#

13.3.1. Add Kafka configuration to the ONTP Message bus agent:#

  • Modify the configuration file of the ONTP-Mbus to allow sending data to Kafka.

  • Add the following configuration section to the existing config. destination_sinks specifies that data is written both to the database and to Kafka; proc_uuid is a unique ID used to differentiate the messages this instance sends to Kafka.

{
  "destination_sinks": ["db", "kafka"],
  "proc_uuid": "63f4e162-4fe3-11ed-a605-6451065c4b7c",

  "kafka_config": {
    "topic_name": "NY1-Zone1",
    "ontp-koutput": "json",
    "bootstrap.servers": "192.168.1.20:9093",
    "message.timeout.ms": "800",
    "session.timeout.ms": "6000",
    "security.protocol": "SSL",
    "ssl.ca.location": "./secure-keys/tls/ca.pem",
    "ssl.key.location": "./secure-keys/tls/client-key.pem",
    "ssl.certificate.location": "./secure-keys/tls/client.pem",
    "enable.ssl.certificate.verification": "true"
  }
}
  • The above configuration allows the ONTP Message bus agent to send data to Kafka directly.
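A config snippet like the one above can be checked for the most common mistakes before restarting the message bus. This is an illustrative sketch only; the checks below are assumptions about what a sane config looks like, not an official ONTP schema.

```python
import json
import uuid

def validate_mbus_config(text: str) -> list:
    """Return a list of problems found in an ontp-mbus config snippet.
    The checks are illustrative, not an official ONTP schema."""
    problems = []
    cfg = json.loads(text)
    if "kafka" not in cfg.get("destination_sinks", []):
        problems.append("destination_sinks does not include 'kafka'")
    try:
        uuid.UUID(cfg.get("proc_uuid", ""))
    except ValueError:
        problems.append("proc_uuid is not a valid UUID")
    if "bootstrap.servers" not in cfg.get("kafka_config", {}):
        problems.append("kafka_config is missing bootstrap.servers")
    return problems

config_text = """
{
  "destination_sinks": ["db", "kafka"],
  "proc_uuid": "63f4e162-4fe3-11ed-a605-6451065c4b7c",
  "kafka_config": {"topic_name": "NY1-Zone1",
                   "bootstrap.servers": "192.168.1.20:9093"}
}
"""
print(validate_mbus_config(config_text))  # -> [] when the snippet is well formed
```

Running a check like this as part of deployment avoids the harder-to-debug failure mode where the message bus starts but silently never produces to Kafka.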