Asynchronous Events
The Digital Enterprise Suite exposes several ways of registering to receive asynchronous message notifications when an event occurs in the suite.
Event publication is divided into hierarchical topics, where each topic groups a small number of message types. Each topic also defines security constraints regarding which users can receive a given message sent on that topic.
The definition of the topics, messages and security constraints can be found at https://<instance>.trisotech.com/publicapi/doc-events.
There are multiple mechanisms to subscribe to events that are detailed in the following sections.
Events Format
Event messages are serialized to text as JSON using the CloudEvents standard (https://cloudevents.io). The CloudEvents attributes are mapped as follows:
Attribute | Required | Mapping |
---|---|---|
id | Yes | A unique event identifier |
time | Yes | The RFC 3339 encoded event time |
source | Yes | A unique source identifier. By default, the instance name (instance.trisotech.com). This field can be overloaded in emitters. |
type | Yes | The type of event |
data | No | A JSON payload of the data. Each message type defines a different payload. |
specversion | Yes | Constant set to the CloudEvents version. Currently 1.0.1 |
subject | No | By default, not present. This field can be defined for emitters. |
ip | No | If the event was user generated, the IP address of the user that generated the event. |
user | No | If the event was user generated, the email address of the user that generated the event. |
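As an illustration, a structured-mode event serialized this way can be handled with nothing more than a JSON library. The field values below (source, type, data payload) are invented for the example:

```python
import json

# A hypothetical structured-mode event, shaped per the attribute table above.
raw = """
{
  "specversion": "1.0.1",
  "id": "5f3e9b2c-0001",
  "time": "2024-01-15T10:30:00Z",
  "source": "example.trisotech.com",
  "type": "ServiceExecuted",
  "user": "jane@example.com",
  "data": { "status": "completed" }
}
"""

event = json.loads(raw)

# Required CloudEvents attributes must always be present.
for attr in ("id", "time", "source", "type", "specversion"):
    assert attr in event, f"missing required attribute: {attr}"

# Optional attributes (data, subject, ip, user) may be absent.
payload = event.get("data", {})
print(event["type"], payload.get("status"))  # ServiceExecuted completed
```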
Emitters
Emitters are different asynchronous publication channels that can be configured by users (user emitters) or at the system level (system emitters).
Emitters (user and system) are defined using a definition with the following attributes:
Attribute | Required | Description |
---|---|---|
id | Yes | Each event emitter is uniquely identified by an identifier. For user emitters this identifier is automatically generated; for system emitters it is defined in the configuration. |
type | Yes | The type of the event emitter, which determines the integration strategy or end system events will be pushed to. See the next section for the different types of emitters and whether each can be used as a user or system emitter. |
filter | No | A FEEL expression evaluated against the event message data variables. If the expression evaluates to true, the message is emitted. If omitted, all events are accepted. |
topics | Yes | Topics categorize events so that subscribers receive only the subset of events relevant to them. Topics are hierarchical to offer targeted registration. For example, users can subscribe to all service invocations for the production environment from a group called finance using the topic [ service, production, finance ]. This attribute is an array of arrays, each array representing a topic/subtopic subscription. |
messages | No | Most topics receive multiple types of messages. This attribute restricts the emitter to specific message types (the type attribute of the message). It is an array of allowed message types. If omitted, all messages are emitted. |
owner | No | The user defining this emitter. It is automatically assigned for user emitters and should be omitted for system emitters (to prevent user-based security filters on top of messages). |
identityRef | No | Optional reference to an identity used to publish on the emitter. Identity identifiers are available from the Identity Management page. |
parameters | No | A map of emitter-specific parameters. |
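As an illustrative sketch, the attributes above can be combined into a single definition document. The topic names, message type, and FEEL filter below are invented for the example:

```python
import json

# A hypothetical user emitter definition combining the attributes above.
# The topic path, message type, and FEEL filter are invented for illustration.
definition = {
    "type": "Webhook",
    "topics": [["service", "production", "finance"]],
    "messages": ["ServiceExecuted"],
    # FEEL expression: emit only events whose data reports a completed status.
    "filter": 'status = "completed"',
    "parameters": {
        "url": "https://example.com/hook",
    },
}

body = json.dumps(definition, indent=2)
```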
Emitters Types
Webhook emitter
Type: Webhook
Available as both a user and a system emitter.
The Webhook emitter pushes events to an arbitrary HTTP endpoint. Each event is published as an HTTP POST to the configured URL.
The payload of the event is in the CloudEvents JSON format, in one of two modes:
- Binary - the payload consists of the event’s data, while additional metadata attributes are delivered via HTTP headers (each attribute prefixed with "ce-")
- Structured - both event data and metadata attributes are delivered as a single JSON document
See the CloudEvents specification to read about the HTTP binding and the different types of payloads.
Parameters
Parameter | Required | Type | Description |
---|---|---|---|
url | Yes | String | URL location where events should be posted. |
mode | No | String | Payload handling (binary, structured). Defaults to binary. |
prettyPrint | No | Boolean | Indicates if the event should be pretty printed (defaults to false). |
source | No | String | Alternative source name for the events (defaults to the server name). |
subject | No | String | Subject constant for the events (defaults to none). |
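On the receiving side, a binary-mode POST can be turned back into a single event object by collecting the "ce-" headers. This is a minimal sketch with invented header values, not the suite's implementation:

```python
import json

def event_from_binary(headers: dict, body: bytes) -> dict:
    """Rebuild a CloudEvents dict from a binary-mode webhook POST.

    In binary mode the body is the event's data and each metadata
    attribute arrives as a "ce-" prefixed HTTP header.
    """
    event = {
        key[3:].lower(): value
        for key, value in headers.items()
        if key.lower().startswith("ce-")
    }
    event["data"] = json.loads(body) if body else None
    return event

# Hypothetical headers as a binary-mode webhook POST would carry them.
headers = {
    "Content-Type": "application/json",
    "ce-specversion": "1.0.1",
    "ce-id": "evt-001",
    "ce-source": "example.trisotech.com",
    "ce-type": "ServiceExecuted",
}
event = event_from_binary(headers, b'{"status": "completed"}')
```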
Audit emitter
Type: Audit
Available as a user emitter.
The Audit emitter stores events in a storage location that can be polled at regular intervals through a REST API.
Parameters
Parameter | Required | Type | Description |
---|---|---|---|
prettyPrint | No | Boolean | Indicates if the event should be pretty printed (defaults to false). |
Accessing audit logs
Logs can be downloaded from the REST API at https://<instance>.trisotech.com/publicapi/emitters/audit/{id}. This endpoint is documented in the public API documentation available at https://<instance>.trisotech.com/publicapi/doc/.
Logs can also be downloaded from Event emitter definitions.
Only the user that defined the audit log is allowed to download it.
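A download could be scripted as follows; the instance name, emitter identifier, and bearer-token authentication scheme are assumptions for the example (consult the public API documentation for the actual scheme):

```python
from urllib.parse import quote
from urllib.request import Request, urlopen

def audit_log_request(instance: str, emitter_id: str, token: str) -> Request:
    """Build the GET request for downloading an audit emitter's log.

    The bearer-token header shown here is an assumption; use whatever
    authentication scheme your instance's public API prescribes.
    """
    url = f"https://{instance}.trisotech.com/publicapi/emitters/audit/{quote(emitter_id)}"
    return Request(url, headers={"Authorization": f"Bearer {token}"})

req = audit_log_request("example", "my-emitter-id", "secret-token")
# To actually download (requires network access and a valid token):
# with urlopen(req) as resp:
#     log = resp.read()
```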
File emitter
Type: File
Available only as a system emitter.
The File emitter stores individual events in a file at a dedicated location.
Parameters
Parameter | Required | Type | Description |
---|---|---|---|
path | Yes | String | Determines the filesystem location of the log file. |
rolling | No | Boolean | Create a daily file based on the path name (true) or a single file (false) (defaults to false). |
prettyPrint | No | Boolean | Indicates if the event should be pretty printed (defaults to false). |
source | No | String | Alternative source name for the events (defaults to the server name). |
subject | No | String | Subject constant for the events (defaults to none). |
Console emitter
Type: Console
Available only as a system emitter.
The Console emitter logs individual events to a system output.
Parameters
Parameter | Required | Type | Description |
---|---|---|---|
output | Yes | String | Determines the console output channel: STDOUT, STDERR, or the logging system. Values are out, err, or log (defaults to out). |
logPrefix | No | String | Logging system prefix, used only if the output is log (defaults to event.). |
level | No | String | Logging system level, used only if the output is log. Values are SEVERE, WARNING, INFO, or FINE (defaults to INFO). |
prettyPrint | No | Boolean | Indicates if the event should be pretty printed (defaults to false). |
source | No | String | Alternative source name for the events (defaults to the server name). |
subject | No | String | Subject constant for the events (defaults to none). |
Power BI emitter
Type: PowerBI
Available as both a user and a system emitter.
The Power BI emitter stores an event’s data in Power BI datasets and tables. Since Power BI is table based, it requires upfront setup before events can be published to it:
- Create a client application that will be given access to Power BI
- Define the dataset to be used
- Define the table(s) where the event’s data will be stored
These steps are described in the Power BI developer documentation available at https://docs.microsoft.com/en-us/rest/api/power-bi/.
Parameters
Parameter | Required | Type | Description |
---|---|---|---|
id | Yes | String | Identifier of the dataset |
schema | Yes | String | Name of the dataset |
name | Yes | String | Name of the table |
columns | Yes | Array of Mapping Objects | List of columns used to map event data into the table. |
Columns are the Power BI emitter’s specific way of defining how to map an incoming event to the Power BI table structure. Because of this mapping, only one message type can be configured in the event emitter definition.
Each column consists of three attributes:
Attribute | Required | Type | Description |
---|---|---|---|
name | Yes | String | Name of the column, which must match the table column created in Power BI |
type | Yes | String | Data type of the column, which must match the data type in the Power BI table |
lookup | Yes | String | A FEEL expression to extract data from the event |
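Putting the pieces together, a Power BI emitter definition might look like the following sketch. The dataset identifier, table name, and FEEL lookup expressions are invented for illustration:

```python
import json

# Hypothetical Power BI emitter definition; dataset id, names, and
# lookup expressions are invented for the example.
definition = {
    "type": "PowerBI",
    "topics": [["service"]],
    # Only one message type may be configured for a Power BI emitter.
    "messages": ["ServiceExecuted"],
    "parameters": {
        "id": "dataset-guid",
        "schema": "EventsDataset",
        "name": "ServiceEvents",
        "columns": [
            {"name": "EventId", "type": "string", "lookup": "id"},
            {"name": "EventTime", "type": "datetime", "lookup": "time"},
            {"name": "Status", "type": "string", "lookup": "data.status"},
        ],
    },
}

body = json.dumps(definition, indent=2)
```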
Kafka emitter
Type: Kafka
Available as both a user and a system emitter.
The Kafka emitter publishes messages to a topic of an Apache Kafka broker. Each event is published as a separate message. It follows the CloudEvents specification binding for Apache Kafka messages and therefore supports the following modes:
- Binary - the payload consists of the event’s data, while additional metadata attributes are delivered via Kafka message headers (each attribute prefixed with "ce_")
- Structured - both event data and metadata attributes are delivered as a single JSON document
Refer to the CloudEvents specification to read about the Apache Kafka binding and the different types of payloads.
Parameters
Parameter | Required | Type | Description |
---|---|---|---|
brokers | Yes | String | Host and port of the Apache Kafka brokers, separated by commas, e.g. myserver:9092,myanotherserver:9092 |
topic | Yes | String | Name of the topic where events should be published |
mode | No | String | Payload handling (binary, structured). Defaults to binary. |
keyExpression | No | String | FEEL expression used to compute the key set on the Kafka message |
Authentication uses existing identities. Whenever identity information is present, it is used to authenticate the connection to the Apache Kafka broker.
The Apache Kafka emitter uses the following settings:
- Security protocol set to SASL_SSL
- SASL mechanism set to PLAIN
Some Apache Kafka brokers are configured with automatic topic creation disabled, which can cause failures when registering the emitter. It is therefore recommended to ensure that the topic configured for the emitter already exists on the Apache Kafka broker.
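To illustrate the binary mode, the following sketch maps an event to a Kafka-style (key, headers, value) record. It is not the suite's implementation, and actual publishing would be done by a Kafka client library:

```python
import json

def to_kafka_record(event: dict, key_expression_result=None):
    """Map a CloudEvents dict to a (key, headers, value) Kafka record
    in binary mode: data becomes the value, metadata becomes "ce_" headers.
    """
    headers = [
        (f"ce_{attr}", str(value).encode())
        for attr, value in event.items()
        if attr != "data"
    ]
    value = json.dumps(event.get("data", {})).encode()
    key = key_expression_result.encode() if key_expression_result else None
    return key, headers, value

# Invented event; the key mimics the result of a keyExpression on "id".
event = {"specversion": "1.0.1", "id": "evt-001", "type": "ServiceExecuted",
         "source": "example.trisotech.com", "data": {"status": "completed"}}
key, headers, value = to_kafka_record(event, "evt-001")
# With a real client this record could be sent, e.g. with kafka-python:
# producer.send(topic, value=value, key=key, headers=headers)
```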
Disabled Emitters
An emitter can become disabled if it fails to deliver a message multiple times in a row. This applies mostly to emitters that rely on a network connection to deliver their messages. In that case, the owner of the emitter receives an email notification and must restart the emitter manually.
Registering User Emitters
User emitters are registered through the Digital Enterprise Suite REST API, using the Event emitters resource via a POST on /emitters/definitions.
Posting the following configuration will send all events from modeling places to a request bin (https://requestbin.com) webhook:
{
"type": "Webhook",
"topics": [ [ "repository" ] ],
"parameters": {
"url": "https://xxxxxxxxxxxxx.x.pipedream.net/",
"mode": "structured"
}
}
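A registration call could be scripted as follows; the publicapi base path and the bearer-token authentication are assumptions for the example (consult the public API documentation for the actual scheme):

```python
import json
from urllib.request import Request, urlopen

definition = {
    "type": "Webhook",
    "topics": [["repository"]],
    "parameters": {
        "url": "https://xxxxxxxxxxxxx.x.pipedream.net/",
        "mode": "structured",
    },
}

# The base path and bearer-token header below are assumptions.
req = Request(
    "https://example.trisotech.com/publicapi/emitters/definitions",
    data=json.dumps(definition).encode(),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer secret-token"},
    method="POST",
)
# To actually register (requires network access and a valid token):
# with urlopen(req) as resp:
#     created = json.load(resp)
```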
Registering System Emitters
Only available for client-hosted deployments.
System emitters are registered through JSON configuration files stored in the system-emitters folder of the data volume (typically under /data/des/system-emitters).
Adding a file called everything.json to the system-emitters folder with the following content would write all events to a daily file (/logs/everything-YYYY-MM-DD.log):
{
"type": "File",
"parameters":{
"path": "/logs/everything.log",
"rolling": true
}
}
Web Socket
It is possible to integrate over HTTP WebSockets using the URL wss://<instance>.trisotech.com/events?topics=[ [ topic ] ]. This endpoint requires a topics query parameter that explicitly defines which topics to connect to. It also accepts additional parameters as meta information for the connection.
Attribute | Required | Description |
---|---|---|
topics | Yes | Topics categorize events so that subscribers receive only the subset of events relevant to them. Topics are hierarchical to offer targeted registration. For example, users can subscribe to all service invocations for the production environment from a group called finance using the topic [ service, production, finance ]. This attribute is an array of arrays, each array representing a topic/subtopic subscription. |
clientId | No | A client identifier that can be traced through the websession topic. It identifies this session from a user point of view and should remain constant across reconnections. |
name | No | The product name of the client connecting to this session. |
version | No | The product version of the client connecting to this session. |
For example, registering to the services and execution environments messages requires the value [ [service], [environment] ] URL encoded, which means adding ?topics=%5B%5Bservice%5D%2C%20%5Benvironment%5D%5D to the web socket URL.
Another example is connecting to receive only service messages from a given environment (prod), which requires encoding [ [ service, prod ] ] in the topics query.
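The encoding itself is plain percent-encoding, as a quick check with the standard library shows (the instance name is invented):

```python
from urllib.parse import quote

# Reproduce the encoding of the example above: [ [service], [environment] ]
topics = "[[service], [environment]]"
encoded = quote(topics, safe="")
url = "wss://example.trisotech.com/events?topics=" + encoded
```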
Once connected, the web socket connection will receive CloudEvents JSON encoded messages. An initial WebSessionStarted event (documented under the websession topic at https://<instance>.trisotech.com/publicapi/doc-events/#/websession) is sent to inform the client of its server session identifier.
On an established connection, clients will most likely need to implement a ping mechanism to prevent the websocket from closing automatically due to timeout (5 minutes by default for Trisotech hosted deployments). We strongly suggest sending ping messages with the following payload:
{ "type": "ping" }
The server replies with a message of type pong. We recommend sending the ping at intervals of about 80% of the server websocket timeout.
During a connection, it is also possible to dynamically subscribe and unsubscribe to topics. To subscribe, send a subscribe message with the payload:
{ "type": "subscribe", "data": { "topics" : [ ["topic1"], ["topic2"] ] } }
To unsubscribe, a similar message can be sent with an unsubscribe type instead.
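The control messages above are plain JSON, so a client can build them with any JSON library. A minimal sketch (the choice of websocket client library, and wiring these into it, is left to the reader):

```python
import json

def ping_message() -> str:
    """Keep-alive payload; the server answers with a pong-typed message."""
    return json.dumps({"type": "ping"})

def subscribe_message(topics) -> str:
    """Dynamically subscribe to extra topics on an open connection."""
    return json.dumps({"type": "subscribe", "data": {"topics": topics}})

def unsubscribe_message(topics) -> str:
    """Dynamically unsubscribe from topics on an open connection."""
    return json.dumps({"type": "unsubscribe", "data": {"topics": topics}})

# A client would send ping_message() on a timer (e.g. at 80% of the
# server timeout) and subscribe_message(...) on demand.
msg = subscribe_message([["topic1"], ["topic2"]])
```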