Exporting via a Kafka Consumer

Data can be consumed from your streams [1] in JSON format via an internet-accessible, authorized connection to our Kafka export cluster.

Create an exporter

First, create an exporter. This creates a kafka-exporter and one associated kafka-user; the user's credentials give access to the export topic.

$ strm kafka-exporters create demo
{
  "topic": "export-7c163d02-5838-49ab-9d87-056221d04218",
  "clientId": "export-02e5c5e9-...",
  "clientSecret": "748e6b36-f389-..."
}
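If you are scripting against the CLI, the JSON returned by `strm kafka-exporters create` can be captured and parsed with the standard library. A minimal sketch (the JSON string below is the shortened listing above; in practice you would capture the CLI's stdout, e.g. via `subprocess`):

```python
import json

# Example output as returned by `strm kafka-exporters create demo`
# (clientId and clientSecret shortened, as in the listing above).
raw = """
{
  "topic": "export-7c163d02-5838-49ab-9d87-056221d04218",
  "clientId": "export-02e5c5e9-...",
  "clientSecret": "748e6b36-f389-..."
}
"""

exporter = json.loads(raw)
print(exporter["topic"])     # the Kafka topic to subscribe to
print(exporter["clientId"])  # credentials of the associated kafka-user
```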

You can list the created user credentials: [2]

$ strm kafka-exporters-users list demo
[
  {
    "name": "service-account-export-02e5c5e9-7e8a-4ecc-bbc1-cc29fb8d74b8",
    "billingId": "hello0123456789",
    "clusterName": "shared-export",
    "clusterBillingId": "internal",
    "topic": "export-7c163d02-5838-49ab-9d87-056221d04218",
    "clientId": "export-02e5c5e9-...",
    "clientSecret": "748e6b36-f389-..."
  }
]

Consuming

Clone the Python Kafka Consumer, change into its directory, and create a file named config.ini, filling in the values from the JSON above:

[kafka]
bootstrap_servers = export-bootstrap.kafka.strm.services:9092
topic = export-7c163d02-5838-49ab-9d87-056221d04218
client_id = export-02e5c5e9-7e8a-4ecc-bbc1-cc29fb8d74b8
secret = 748e6b36-f389-460b-b809-ff957343f5f2
token_uri = https://auth.strm.services/token
group = demo
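The config.ini above can be loaded with Python's standard configparser, which is the typical way a consumer like this reads its settings. A sketch under that assumption (the inline string mirrors the file above; the example project's actual loading code may differ):

```python
import configparser

# The config.ini contents shown above, inlined here so the sketch is
# self-contained; for the real file use parser.read("config.ini").
CONFIG = """
[kafka]
bootstrap_servers = export-bootstrap.kafka.strm.services:9092
topic = export-7c163d02-5838-49ab-9d87-056221d04218
client_id = export-02e5c5e9-7e8a-4ecc-bbc1-cc29fb8d74b8
secret = 748e6b36-f389-460b-b809-ff957343f5f2
token_uri = https://auth.strm.services/token
group = demo
"""

parser = configparser.ConfigParser()
parser.read_string(CONFIG)
kafka = parser["kafka"]  # the [kafka] section as a string-to-string mapping

print(kafka["bootstrap_servers"])
print(kafka["group"])
```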

Next, install the Python dependencies:

$ python -m venv venv
$ . venv/bin/activate
$ pip install -r requirements.txt

And run the consumer:

$ python consumer.py

{'strmMeta':
        {'schemaId': 'nps_unified_v1',
        'nonce': -956233225,
        'timestamp': 1616749246981,
        'keyLink': -1225595339,
        'billingId': 'hello0123456789',
        'consentLevels': [7, 3]},
        'brand_source': '', 'platform': '', 'os': '',
        'version': '',
        'device_id': 'AXo+XdyK3A+QbVUe2+KoxqBumttxNZCbrNvhHHFK',
        'customer_id': 'AXo+XdxTFu26cu47dnYUvEKjxUqdsNk1kKrFOA/CjRI=',
        ...
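Each consumed message is a JSON event whose strmMeta block carries the schema id, billing id, timestamp, and the consent levels granted for that event. A sketch of decoding a message value and filtering on consent (field names follow the output above; the message is abridged):

```python
import json

# One consumed message value, as printed by consumer.py above (abridged).
message = json.loads("""
{
  "strmMeta": {
    "schemaId": "nps_unified_v1",
    "nonce": -956233225,
    "timestamp": 1616749246981,
    "keyLink": -1225595339,
    "billingId": "hello0123456789",
    "consentLevels": [7, 3]
  },
  "device_id": "AXo+XdyK3A+QbVUe2+KoxqBumttxNZCbrNvhHHFK"
}
""")

meta = message["strmMeta"]
# Only handle events for which consent level 3 was granted.
if 3 in meta["consentLevels"]:
    print(meta["schemaId"], meta["timestamp"])
```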

Deleting the Kafka exporter

If you try to delete the Kafka exporter while it still has users, you'll get an error.

$ strm kafka-exporters delete demo
{
  "code": 400, "message":
  "Cannot delete Kafka exporter for stream demo, as it still has users linked to it.
  Delete those first before deleting this exporter."
}

So let’s do that:

$ strm kafka-exporters-users list demo
[
  {
    "name": "service-account-export-02e5c5e9-7e8a-4ecc-bbc1-cc29fb8d74b8",
    "billingId": "hello0123456789",
    "clusterName": "shared-export",
    "clusterBillingId": "internal",
    "topic": "export-7c163d02-5838-49ab-9d87-056221d04218",
    "clientId": "export-02e5c5e9-7e8a-4ecc-bbc1-cc29fb8d74b8",
    "clientSecret": "748e6b36-f389-460b-b809-ff957343f5f2"
  }
]

$ strm kafka-exporters-users delete demo export-02e5c5e9-7e8a-4ecc-bbc1-cc29fb8d74b8
{}


$ strm kafka-exporters-users list demo
[]

$ strm kafka-exporters delete demo
{}
Note that the CLI is still fairly limited. The next major version will add much more functionality, such as recursive deletes and auto-completion of stream names.

1. Both encrypted and decrypted streams.
2. This listing still exposes some internal attributes; these will be removed in CLI v2.0.