Define the product data in PostgreSQL
The Simulator references product data stored in a PostgreSQL database. There are multiple ways to populate this database, depending on your level of experience with PostgreSQL, database services, and your local development environment.
We have provided the following documented methods for populating the Product database:
- Kubernetes Job running on remote cluster (RECOMMENDED)
- Docker image running on local machine
- PostgreSQL CLI (psql) running on local machine
Kubernetes Job running on remote cluster
In an effort to keep development systems as clean as possible and to speed up deployment of various scenarios, our deployment tasks have been encapsulated in Kubernetes Jobs. These are runnable on any Kubernetes platform, including OpenShift. A minimal sketch of how such a Job can consume its database configuration follows the steps below.
- Following the configuration prerequisites defined in the Backing Services documentation for using Databases for PostgreSQL on IBM Cloud, you should already have the following Kubernetes Secrets defined in your target namespace:
    - `postgresql-url` (in the format of `jdbc:postgresql://<hostname>:<port>/<database-name>?sslmode=...`)

        ```shell
        kubectl create secret generic postgresql-url --from-literal=binding='jdbc:postgresql://<hostname>:<port>/<database-name>?sslmode=...'
        ```

    - `postgresql-user`

        ```shell
        kubectl create secret generic postgresql-user --from-literal=binding='ibm_cloud_...'
        ```

    - `postgresql-pwd`

        ```shell
        kubectl create secret generic postgresql-pwd --from-literal=binding='1a2...9z0'
        ```

    - `postgresql-ca-pem` (this requires use of the Cloud Databases CLI Plug-in for the IBM Cloud CLI)

        ```shell
        ibmcloud cdb deployment-cacert [PostgreSQL on IBM Cloud service instance name] > postgres.crt
        kubectl create secret generic postgresql-ca-pem --from-literal=binding="$(cat postgres.crt)"
        ```
- Create the `create-postgres-tables` Job from the root of the `refarch-reefer-ml` repository:

    ```shell
    kubectl apply -f scripts/createPGtables.yaml
    ```
- You can tail the created pod's output to see the progress of the database initialization:

    ```shell
    kubectl logs -f --selector=job-name=create-postgres-tables
    ```
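For reference, the snippet below is a minimal sketch of how code running inside such a Job might consume those secrets once they are exposed to the pod as environment variables and a mounted certificate file. The psycopg2 driver, the environment variable names, and the certificate path are illustrative assumptions, not the actual contents of the repository's Job.

```python
# Hypothetical sketch: connect to Databases for PostgreSQL from inside a Job,
# assuming the postgresql-* secrets are exposed as environment variables and
# the CA certificate is mounted as a file. Names and paths are illustrative only.
import os
import psycopg2

def connect_from_secrets():
    return psycopg2.connect(
        host=os.environ["POSTGRES_HOST"],                   # parsed from postgresql-url
        port=os.environ.get("POSTGRES_PORT", "5432"),
        dbname=os.environ.get("POSTGRES_DB", "ibmclouddb"),
        user=os.environ["POSTGRES_USER"],                   # postgresql-user secret
        password=os.environ["POSTGRES_PWD"],                # postgresql-pwd secret
        sslmode="verify-full",
        sslrootcert=os.environ.get("POSTGRES_CA_PEM_FILE",  # postgresql-ca-pem secret
                                   "/etc/postgres/postgres.crt"),
    )

if __name__ == "__main__":
    conn = connect_from_secrets()
    print("Connected to", conn.get_dsn_parameters().get("dbname"))
    conn.close()
```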
Docker image running on local machine
The simulator code includes `infrastructure/ProductRepository.py`, which creates the tables and adds some product definitions to them; a conceptual sketch of what this amounts to follows the steps below.
- Uncomment line 101 in `/simulator/infrastructure/ProductRepository.py`:

    ```python
    # repo.populateProductsReferenceData()
    ```
- The following command uses our Python environment Docker image to run that code:

    ```shell
    ./scripts/createPGTables.sh IBMCLOUD
    ```
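Conceptually, the table creation and product loading performed by this step amounts to something like the sketch below. It is only an illustration, assuming psycopg2 and a hypothetical `POSTGRES_DSN` environment variable; the authoritative implementation is `ProductRepository.py` itself, and the product values simply mirror those used in the psql section that follows.

```python
# Illustrative sketch only: create the products table and load the reference
# products. The real logic lives in simulator/infrastructure/ProductRepository.py;
# psycopg2 and the POSTGRES_DSN environment variable are assumptions.
import os
import psycopg2

PRODUCTS = [
    ("P01", "Carrots", 4, 0.4),
    ("P02", "Banana",  6, 0.6),
    ("P03", "Salad",   4, 0.4),
    ("P04", "Avocado", 6, 0.4),
    ("P05", "Tomato",  4, 0.4),
]

def create_products_table(conn):
    with conn.cursor() as cur:
        cur.execute("""
            CREATE TABLE IF NOT EXISTS products (
                product_id            varchar(64) NOT NULL PRIMARY KEY,
                description           varchar(100),
                target_temperature    REAL,
                target_humidity_level REAL
            )
        """)
    conn.commit()

def populate_products(conn):
    # Insert the reference products; skip rows that are already present.
    with conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO products(product_id, description, target_temperature, target_humidity_level) "
            "VALUES (%s, %s, %s, %s) ON CONFLICT (product_id) DO NOTHING",
            PRODUCTS,
        )
    conn.commit()

if __name__ == "__main__":
    # e.g. POSTGRES_DSN="host=... port=... dbname=ibmclouddb user=... password=... sslmode=require"
    conn = psycopg2.connect(os.environ["POSTGRES_DSN"])
    create_products_table(conn)
    populate_products(conn)
    conn.close()
```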
PostgreSQL CLI (psql) running on local machine
An alternate technique is to use psql directly, as described in this section. Previous experience with psql is recommended.
- We use a Docker image to run psql:

    ```shell
    $ cd scripts
    $ ./startPsql.sh IBMCLOUD
    $ PGPASSWORD=$POSTGRES_PWD psql --host=$HOST --port=$PORT --username=$POSTGRES_USER --dbname=$POSTGRES_DB
    ibmclouddb =>
    ```
- List the relations:

    ```
    ibmclouddb => \d
    ```
- Then create the table if it does not exist yet:

    ```
    ibmclouddb => CREATE TABLE products (
        product_id            varchar(64) NOT NULL PRIMARY KEY,
        description           varchar(100),
        target_temperature    REAL,
        target_humidity_level REAL
    );
    ```
- Populate the data:

    ```
    ibmclouddb => INSERT INTO products(product_id,description,target_temperature,target_humidity_level)
                  VALUES ('P01','Carrots',4,0.4),
                         ('P02','Banana',6,0.6),
                         ('P03','Salad',4,0.4),
                         ('P04','Avocado',6,0.4),
                         ('P05','Tomato',4,0.4);
    ```
- List the products:

    ```
    ibmclouddb => SELECT * FROM products;
    ```

    You should see:

    ```
     product_id | description | target_temperature | target_humidity_level
    ------------+-------------+--------------------+-----------------------
     P01        | Carrots     |                  4 |                   0.4
     P02        | Banana      |                  6 |                   0.6
     P03        | Salad       |                  4 |                   0.4
     P04        | Avocado     |                  6 |                   0.4
     P05        | Tomato      |                  4 |                   0.4
    ```
- Exit the psql environment:

    ```
    ibmclouddb => \q
    ```