IBM Garage Vaccine Delivery at Scale

Data Collection from Event Stream in Cloud Pak For Integration

To develop the anomaly detection service we first need access to the data. We have two data sources in this example: the reefer information and the telemetry data coming from the different reefer containers. From the telemetry we should be able to assess anomalies. The telemetry data can be obtained from IBM Event Streams (Kafka) in Cloud Pak for Integration.

Following is the example topic that Event Streams uses to publish all the vaccine monitoring data; Cloud Pak for Data reads the streaming data from this same topic.

[Image: Event Streams topic]

Here is the sample code link showing how to set up the connection to the Event Streams topic:
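As a minimal sketch of what such a connection setup looks like, the snippet below builds a Kafka consumer configuration for Event Streams. Event Streams authenticates with SASL/PLAIN over TLS, using the literal username `token` and the service API key as the password; the broker address, group id, and topic name here are placeholders, not values from this tutorial.

```python
def event_streams_consumer_config(bootstrap_servers, api_key):
    """Build a Kafka consumer configuration for IBM Event Streams.

    Event Streams authenticates with SASL/PLAIN over TLS: the username
    is the literal string "token" and the password is the service API key.
    """
    return {
        "bootstrap.servers": bootstrap_servers,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "token",
        "sasl.password": api_key,
        "group.id": "vaccine-monitoring",   # hypothetical consumer group
        "auto.offset.reset": "earliest",    # read the topic from the start
    }

# The dict can be handed to a Kafka client, e.g. confluent_kafka:
#   from confluent_kafka import Consumer
#   consumer = Consumer(event_streams_consumer_config(servers, key))
#   consumer.subscribe(["vaccine.reefer.telemetry"])  # placeholder topic
```

The key point is the SASL_SSL / PLAIN combination with `token` as the username, which is how Event Streams service credentials are presented to any Kafka client.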

Following is the overall architecture of the use case: Cloud Pak for Data reads streaming data from Cloud Pak for Integration, builds a model on that streaming data, and deploys it with Watson Machine Learning in Cloud Pak for Data. Once deployed, the model exposes an endpoint that scores each new message arriving from Event Streams in Cloud Pak for Integration.
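To illustrate the scoring step, the sketch below shapes one telemetry record into the `input_data` payload that Watson Machine Learning deployment endpoints accept (fields plus row values). The field names, endpoint URL, and token in the comment are placeholders, not values from this tutorial.

```python
import json

def build_scoring_payload(telemetry):
    """Shape one reefer telemetry record (a dict of sensor readings)
    into a Watson Machine Learning scoring payload: a list of field
    names plus one row of values in the same order."""
    fields = sorted(telemetry)  # fixed order so values line up with fields
    return {"input_data": [{"fields": fields,
                            "values": [[telemetry[f] for f in fields]]}]}

# Posting the payload to the deployment's scoring endpoint (sketch only;
# host, deployment id, and bearer token are placeholders):
#   import urllib.request
#   msg = {"temperature": 2.4, "humidity": 41.0, "co2": 390.0}
#   req = urllib.request.Request(
#       "https://<cpd-host>/ml/v4/deployments/<id>/predictions",
#       data=json.dumps(build_scoring_payload(msg)).encode(),
#       headers={"Content-Type": "application/json",
#                "Authorization": "Bearer <token>"})
#   result = json.load(urllib.request.urlopen(req))
```

Each message consumed from the Event Streams topic would be transformed this way and posted to the endpoint, and the response carries the model's anomaly prediction for that reading.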

[Image: Event Streams use case architecture]

The data obtained from Event Streams in the way described above can be published to Watson Knowledge Catalog by selecting 'Publish to Catalog' from the drop-down menu behind the three-dot icon next to the data asset.