Data Streamer User Manual Setup, Configuration and API Integrations Version 1.9, 06.04.2020



Table of Contents

Revision History
1. Creating a Data Stream
2. Description of Data Stream Integrations
   2.1. AWS Kinesis
   2.2. AWS S3
   2.3. RestAPI
      2.3.1. RestAPI (Bulk-Mode)
   2.4. Salesforce
3. Configuring Integrations
   3.1. AWS
      3.1.1. Trust Relationship
      3.1.2. S3 Integration
      3.1.3. Kinesis Integration
      3.1.4. S3 Security Guidelines
   3.2. Salesforce Setup
   3.3. keen.io Setup
   3.4. DataDog Setup
4. Managing a Data Stream
5. Filtering Data Streams
6. Data Streamer API
   6.1. Listing Data Streams
   6.2. Creating Data Streams
   6.3. Deleting Data Streams
7. Compatibility Notes
8. Event Data Reference
   8.1. Generic Event Data
   8.2. Additional Properties
      8.2.1. Additional Details by Event Type
   8.3. Event Types
   8.4. Severity Levels
   8.5. Event Source
9. Usage Data
   9.1. Usage Data Properties
   9.2. Usage Data JSON Sample
   9.3. Usage Data S3 Example
10. Example Events
   10.1. User Authentication Failed
   10.2. Update Location
   10.3. S3 CSV Example
11. Support


Revision History

Version  Date        Description
1.0      05.12.2018  Initial Version
1.1      07.02.2019  Updated screenshots demonstrating API integrations; updated Configuring Integrations with steps for integrating each service
1.2      13.02.2019  Added reference to S3 naming conventions in S3 Integration details
1.3      25.03.2019  Updated URL to EMnify login page
1.4      01.07.2019  Updated Kinesis stream management to include region/stream-name parameter
1.5      08.07.2019  Added management of Data Streams via API in Data Streamer API
1.6      02.10.2019  Added S3 Security Guidelines for securing S3 buckets; added Compatibility Notes for new versions of the data streamer
1.7      07.11.2019  Updated event example in Usage Data JSON Sample; updated latest included events in Usage Data Properties
1.8      30.03.2020  Updated AWS integration authentication method in Trust Relationship; updated Trust Relationship details in Kinesis Integration and S3 Integration
1.9      06.04.2020  Updated DataDog integration to specify integration support for the US Region only in DataDog Setup


Chapter 1. Creating a Data Stream

To create a Data Stream, log in to an EUI account and navigate to the Technical Settings page by clicking on the link icon in the top-right navigation menu. The Technical Settings page contains a panel dedicated to configuring and displaying Data Streams.

Figure 1. Tech Settings Page

1. Click on the action button Add Data Stream. The stream type can be any one of the following:
   ◦ Usage Data
   ◦ Event Data
   ◦ Usage and Event Data

Figure 2. EDS Add Data Stream and Stream Type

2. Choose an API type. This may be RestAPI (optionally Bulk-mode), Salesforce, keen.io, DataDog, AWS S3 or AWS Kinesis (see Description of Data Stream Integrations).

Figure 3. EDS Add Data Stream Type

API keys and configuration parameters differ by integration type. Details on how to configure each integration type may be found in the Configuring Integrations section.

3. If historic data is required, the Stream historic data box should be checked. The system will send historic data up to 20 days old. It may take time to catch up with live data if this option is enabled.

4. Click Add Data Stream

When created, new data streams are instantly activated and their status can be monitored on the data streams panel on the Technical Settings page.


Figure 4. Data Streams Panel with an active stream to Amazon S3

Status Codes

The column Remote Status displays the HTTP status code from the remote side of the stream. A response code of 200 (OK) indicates that the data stream is correctly integrated at the server.

If the configuration is invalid or the receiving component or server is down, the column will show 5XX errors.


Chapter 2. Description of Data Stream Integrations

2.1. AWS Kinesis

Amazon Kinesis allows for collecting and processing large streams of data records in real time. Applications created on Kinesis can run on Amazon EC2 instances; typical uses are to send processed records to dashboards, generate alerts, dynamically change pricing or advertising strategies, or send data to other AWS services.

2.2. AWS S3

Amazon S3 allows for storage of the raw event and usage data as it arrives from a data stream. Shortly after the stream is created, CSV files are uploaded to the S3 bucket; one for event data and one for usage data. The CSV files can then be sent to other Amazon services or consumed by a third-party analytics or BI tool for generating insights.

2.3. RestAPI

Users who wish to use the RestAPI data stream must provide a REST API or application server which itself consumes the data stream. If you choose to implement an application which consumes events from the EMnify API, the Data Streamer will make POST requests with JSON data payloads when EMnify platform events occur. This is the most flexible method of processing a data stream as it allows any custom implementation of analytics, reporting or a pipeline of tools to process usage and event data.

2.3.1. RestAPI (Bulk-Mode)

Using the EMnify API in bulk-mode, each HTTP POST will include an array of objects. The HTTP POSTs are sent at intervals and should be used when the receiving system is to process multiple events in bulk rather than individual events as they occur.
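A sketch of how a receiving system might unpack one bulk-mode body; the event fields shown come from the Generic Event Data reference later in this manual, and the helper function is illustrative, not part of any EMnify library:

```python
import json

def handle_bulk(body):
    # Bulk-mode bodies are JSON arrays of event objects; each element
    # has the shape of a single (non-bulk) event.
    events = json.loads(body)
    if not isinstance(events, list):
        raise ValueError("bulk-mode payloads are JSON arrays")
    return events

# Two example events carrying the generic "id" and "description"
# properties (see the Event Data Reference chapter).
body = '[{"id": 1, "description": "Update location"}, {"id": 2, "description": "PDP context activated"}]'
events = handle_bulk(body)
```

A real receiver would iterate the returned list and process each event exactly as it would a single non-bulk POST.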

2.4. Salesforce

With the data streamer Salesforce configuration, it’s possible to stream your event data directly into a Salesforce account. This is done by setting up a connected app in Salesforce which provides a client id and client secret. These credentials are used in the EMnify User Interface to grant access to the Salesforce platform event system.

Consider configuring this data streamer type for selected event types only, as there are limits on the number of events supported. Location Update or PDP Context events can be extremely frequent and are probably not the most typical to be processed in Salesforce. More information on this feature is detailed in the section Filtering Data Streams.


Chapter 3. Configuring Integrations

The following section describes how to configure each offered integration by service. With each integration, credentials are required to ensure that the data streamer has permissions for writing data to that service.

3.1. AWS

3.1.1. Trust Relationship

The AWS integrations are securely configured by means of a Trust Relationship with the EMnify Datastreamer role (arn:aws:iam::884047677700:role/datastreamer). The Trust Relationship can be added to new and/or already-existing roles.

Trust Relationship Policy Document Example

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::884047677700:role/datastreamer"
        ]
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "ForAnyValue:StringEquals": {
          "sts:ExternalId": [
            "org-<my-emnify-org-id>" ①
          ]
        }
      }
    }
  ]
}

① replace <my-emnify-org-id> with the numeric ID of your EMnify org. This can be retrieved with a GET request to /organisation/my.
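For reference, that org ID lookup might be performed with the same cURL conventions used in the Data Streamer API chapter; this assumes /organisation/my sits under the same base URL as the /data_stream examples, and AuthToken is a JWT as described there:

```shell
# Hypothetical lookup of the numeric EMnify org ID used in the
# sts:ExternalId condition above; the base URL is assumed to match
# the /data_stream examples later in this manual.
curl -X GET "https://cdn.emnify.net/api/v1/organisation/my" \
  -H "accept: application/json" \
  -H "Authorization: Bearer AuthToken"
```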

3.1.2. S3 Integration

This section covers the steps necessary to create a new role with S3 write access.

1. In the AWS console, navigate to S3 and create a new bucket

Figure 5. Data Streamer Add S3 bucket

Bucket names should be DNS-compliant, see AWS docs: Rules for Bucket Naming.

2. In IAM → Policies click Create Policy to create a policy which allows PutObject permissions to the S3 bucket:
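A minimal policy document of this kind might look as follows; the bucket name is a placeholder, and the JSON produced by the AWS visual policy editor may differ in layout:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::<bucket-name>/*"
    }
  ]
}
```

Note the trailing /* on the resource ARN: PutObject applies to objects within the bucket, not to the bucket itself.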


Figure 6. Create policy for Datastreamer

3. In IAM → Roles, click Create Role for the S3 use case and click Next: Permissions

4. Attach the policy created in step 2 and click Next: Tags, then Create Role.

Figure 7. Attach policy to Role

5. Edit the newly-created role and click Trust Relationships → Edit Trust Relationships

6. Edit the policy document to allow EMnify’s datastreamer role. For details, see Trust Relationship.


Figure 8. Trust Relationship policy document example

7. The AWS configuration is complete and the role should look like the following:

Figure 9. Correctly-configured Datastreamer Role

A Data Stream can now be created via the EMnify API towards the configured bucket. Shortly after the stream is created, .csv files will be uploaded to the S3 bucket:


Figure 10. Data Streamer S3 bucket with data

Event Data
The event data filename follows the format events_YYYYMMDD_HHmmss.csv.

Usage Data
The usage data filename follows the format cdr_YYYYMMDD_HHmmss.csv.
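Since both filenames embed their creation timestamp, downstream tooling can parse them programmatically; a small sketch (the helper name is our own, not part of any EMnify tooling):

```python
from datetime import datetime

def parse_stream_filename(name):
    # Filenames follow events_YYYYMMDD_HHmmss.csv or
    # cdr_YYYYMMDD_HHmmss.csv as documented above.
    stem = name.rsplit(".", 1)[0]
    kind, date_part, time_part = stem.split("_")
    ts = datetime.strptime(date_part + time_part, "%Y%m%d%H%M%S")
    return kind, ts

kind, ts = parse_stream_filename("events_20200406_153000.csv")
```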

3.1.3. Kinesis Integration

This section covers the steps necessary to create a new role with Kinesis write access.

1. In the AWS console, navigate to Kinesis and create a new stream

Figure 11. Data Streamer Add Kinesis stream

2. In IAM → Policies click Create Policy which allows PutRecord and PutRecords write permissions to the Kinesis stream:
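A minimal policy document granting those two actions might look as follows; the region, account ID and stream name are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:PutRecord",
        "kinesis:PutRecords"
      ],
      "Resource": "arn:aws:kinesis:<region>:<account-id>:stream/<stream-name>"
    }
  ]
}
```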


Figure 12. Create policy for Kinesis Stream

3. In IAM → Roles, click Create Role for Kinesis Analytics and click Next: Permissions

4. Attach the policy created in step 2 and click Next: Tags → Create Role.

Figure 13. Attach policy to Kinesis Role

5. Edit the newly-created role and click Trust Relationships → Edit Trust Relationships

6. Edit the policy document to allow EMnify’s datastreamer role.


Figure 14. Trust Relationship policy document example

7. The AWS configuration is complete and the role should look like the following:

Figure 15. Correctly-configured Datastreamer Role

8. In the EMnify portal, navigate to the tech settings page and add a Kinesis Data Stream.

A Data Stream can now be created via the EMnify API towards the configured stream. Shortly after the stream is created, PUT records will arrive in the Kinesis stream.
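On the consuming side, a record fetched via the raw Kinesis GetRecords API carries a base64-encoded Data field. A minimal decoding sketch, assuming the payload is the same JSON event object the other integrations receive (note that boto3 already returns raw bytes, in which case the base64 step is unnecessary):

```python
import base64
import json

def decode_record(record):
    # In the raw Kinesis JSON wire format the Data field is
    # base64-encoded; the payload here is assumed to be a JSON
    # event/usage object as delivered by the Data Streamer.
    payload = base64.b64decode(record["Data"])
    return json.loads(payload)

# Example record as it might appear in a GetRecords response;
# the event fields are illustrative.
sample = {
    "Data": base64.b64encode(json.dumps({"id": 42, "alert": False}).encode()).decode()
}
event = decode_record(sample)
```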

3.1.4. S3 Security Guidelines

Event data that is sent via Data Streams may include usernames, email addresses and other data which can identify users or platform resources. The delivered .csv files should therefore be treated as containing sensitive information. Precautions should be taken to ensure that the event and usage data in the destination S3 buckets are adequately secured.


The following steps should be considered as the minimum security requirements for storing such data in S3:

1. Ensure that the S3 bucket is not publicly accessible. This can be applied in the Permissions tab of the S3 bucket:

Figure 16. Blocking Public Access on the S3 Bucket

2. Server-Side Encryption can be enabled per bucket and S3 will encrypt objects before they are saved to disk. The decryption is then performed when downloading the objects. This can be enabled in the Properties tab of the S3 bucket:

Figure 17. Enabling Server-Side Encryption
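Both safeguards can also be applied as one-off configuration commands with the AWS CLI; a sketch, with the bucket name as a placeholder:

```shell
# Block all forms of public access on the destination bucket.
aws s3api put-public-access-block \
  --bucket <bucket-name> \
  --public-access-block-configuration \
  'BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true'

# Enable default server-side encryption (SSE-S3, AES-256) so that
# uploaded .csv files are encrypted at rest.
aws s3api put-bucket-encryption \
  --bucket <bucket-name> \
  --server-side-encryption-configuration \
  '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'
```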

3.2. Salesforce Setup

With the data streamer Salesforce configuration, it’s possible to stream your event data directly into a Salesforce account.

In Salesforce, we use events described as Platform Events, which are intended for streaming data between external apps and Salesforce. More information about Platform Events and how to implement them can be found in the Salesforce Documentation - Platform Events.

Consider configuring this data streamer type for selected event types only, as there are limits on the number of events supported. Location Update or PDP Context events can be extremely frequent and are probably not the most typical to be processed in Salesforce. Details of how to achieve this can be found in Filtering Data Streams.


To prepare the Salesforce system to receive our event stream, you need to perform the following steps:

1. Set up a connected app in Salesforce which will provide your client id and client secret

2. Obtain a refresh token via OAuth2 to grant the data streamer access to your Salesforce platform event stream

3. Create a new platform event in your Salesforce account

Afterwards, add the following custom fields and assign them to your previously created platform event:

Figure 18. Salesforce fields required for Data Streamer Events

Now you can configure your Salesforce event stream in EUI. Your platform event name will be reflected as the event stream name.


Figure 19. EDS S3 add stream

3.3. keen.io Setup

keen.io offers APIs for streaming, analyzing, and embedding rich data, and integrates with the EMnify Data Streamer. To get the EDS integrated with keen.io, run your first queries and build dashboards, the following steps are required:

1. Create a keen.io user account

2. Create a new project, e.g. EMnify Data Stream

3. In the project settings go to Access to find your project id and write key

Figure 20. Configuration for keen.io access

4. Log on to the EUI and add a keen.io stream using the Project ID and Write Key


Figure 21. EDS keen.io add stream

Shortly after the stream is created you will see the first data arriving, which you can explore on keen.io on the Streams tab. Depending on the stream type you selected, one or two streams will appear, named EMnifyEventData and EMnifyUsageData.

Choose a data collection, e.g. EMnifyUsageData, and click Last 10 Events Streamed to see the live event data coming in.

Figure 22. Streaming data arriving in keen.io

3.4. DataDog Setup

DataDog provides real-time performance monitoring (https://www.datadoghq.com). In conjunction with EDS, it allows you to collect and analyze metrics about the usage of your endpoints and SIM cards; you can create dashboards and trigger alerts for certain situations.

1. Create a DataDog user account in the US Region

Currently, only DataDog accounts with data in the US Region are supported by the EMnify Data Streamer. It is not possible to migrate data between DataDog regions.

2. Navigate to Integrations → APIs from the left navigation menu

3. Generate a new Application Key

4. Copy the API Key and Application Key

5. Log on to the EMnify Portal and add a DataDog stream using the API Key and Application Key from step 4.


Figure 23. EDS DataDog add stream

Shortly after the stream is created, data will begin arriving. In the DataDog explorer, you can check the incoming data and then start to create dashboards using the metrics endpoint.volume, endpoint.volume_tx, endpoint.volume_rx, and endpoint.cost.


Chapter 4. Managing a Data Stream

Figure 24. Data Streams Panel showing an active data stream

Pausing a data stream
A data stream can be paused and resumed at any time by using the action buttons on the right of the stream info. This is useful for actions such as maintenance of the receiving server.

Removing a data stream
To permanently remove a data stream, click the remove icon on the right. You will be asked to confirm this action.

Parallel data streams
It is possible to add up to 10 data streams which run in parallel. This means that one stream may be connected to a network monitoring system, another to a data analytics platform, and another may be syncing with S3 for archiving at the same time, and so on.

Filtering
It’s possible to add filters to a data stream which will filter the entire stream to contain only a certain event type or usage data type. This is covered in more detail in Filtering Data Streams.


Chapter 5. Filtering Data Streams

The data streamer can apply filtering to streams, which can be used to restrict the data sent to only that which the user is interested in. By default, no filters are added to a data stream and all events are streamed. Multiple filters can be applied to each stream, creating more granular and targeted data for analysis.

The following screenshot shows filters applied to a data stream via the EMnify User Interface. The data stream that the filters are applied to will only contain Update Location and User Authentication Failed events.

Figure 25. Filtering a Data Stream by Event Type


Chapter 6. Data Streamer API

The EMnify API allows for configuring Data Streams in a programmatic way through REST entrypoints. This functionality is exposed at the following places:

Method Entrypoint Description

GET /data_stream List Data Streams

POST /data_stream Create a Data Stream

DELETE /data_stream/{data_stream_id} Delete a Data Stream by ID

6.1. Listing Data Streams

Authenticated users may list Data Streams by making a GET request to /data_stream. An example cURL request to list data streams would look like the following:

curl -X GET "https://cdn.emnify.net/api/v1/data_stream" \
  -H "accept: application/json" \
  -H "Authorization: Bearer AuthToken"

AuthToken is a JWT described in the EMnify API Authentication documentation.

Further documentation, including interactive examples, can be found in the EMnify API specification for this operation at GET /data_streams.

6.2. Creating Data Streams

Authenticated users may create Data Streams by making POST requests to /data_stream. An example cURL request to create a data stream would look like the following:

curl -X POST "https://cdn.emnify.net/api/v1/data_stream" \
  -H "accept: */*" \
  -H "Authorization: Bearer AuthToken" \
  -H "Content-Type: application/json" \
  -d '{"stream_historic_data": 0,"data_stream_type":...'

The request body (set using the -d flag in cURL) is used to configure the parameters of the Data Stream itself. The following JSON request body example shows how to create a Data Stream with S3 integration:

{
  "stream_historic_data": 0,
  "data_stream_type": {
    "description": "Usage Data & Events",
    "id": 3
  },
  "api_type": {
    "description": "AWS S3",
    "id": 8
  },
  "api_username": "arn:aws:iam::<your-account_id>:role/<bucket-role-name>",
  "api_parameter": "eu-west-1/bucket-name"
}

Comprehensive documentation of the configurable parameters and interactive examples can be found in the EMnify API specification for this operation at POST /data_stream.

6.3. Deleting Data Streams

Data streams may be deleted by making a DELETE request to /data_stream/{data_stream_id}.

The data_stream_id path parameter is the top-level id property in each object returned by GET requests to /data_stream and is a numerical ID that is unique for each data stream.
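For illustration, resolving the data_stream_id values from a list response might look like this; the response objects shown are a hypothetical, abbreviated sketch of what GET /data_stream returns:

```python
# Hypothetical, abbreviated GET /data_stream response body: a JSON
# array of data stream objects, each carrying a top-level "id".
streams = [
    {"id": 1336, "api_type": {"description": "AWS S3", "id": 8}},
    {"id": 1337, "api_type": {"description": "AWS Kinesis", "id": 9}},
]

# The data_stream_id used in DELETE requests is the top-level "id".
stream_ids = [stream["id"] for stream in streams]
```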

The following cURL request would then delete a data stream with an id of 1337:

curl -X DELETE "https://cdn.emnify.net/api/v1/data_stream/1337" \
  -H "accept: application/json" \
  -H "Authorization: Bearer AuthToken"


Further documentation and interactive examples can be found in the EMnify API specification for this operation at DELETE /data_stream/{data_stream_id}.


Chapter 7. Compatibility Notes

The EMnify Data Streamer is under active development and is regularly updated for performance and quality improvements so that users of the platform may gain rich streaming insights into usage and event data.

Versioning
There is no external versioning of the Data Streamer that developers need to track. Updates are therefore always performed on the service with the intent of preserving backwards compatibility. This means that existing JSON or CSV entities are preserved and not renamed when updates to the data streamer are performed.

Parsing S3 or Kinesis Artifacts
Users who have built custom integrations (in AWS or otherwise) which consume the JSON or CSV generated from S3 or Kinesis streams should expect that additional JSON or CSV data may be added in future. Mature and tested libraries designed for parsing or reading CSV and JSON data should be used for such custom integrations (which may run in lambdas or elsewhere) to ensure compatibility when additional fields or objects are added to data streams in future.


Chapter 8. Event Data Reference

8.1. Generic Event Data

The following table shows the properties of generic event data; these properties are included in all events:

Property Format Description

id Numeric A unique numerical identifier of this event. If multiple events with the same id are received (e.g. due to transmission errors), the receiver should treat them as duplicates.

timestamp Timestamp Date/time when this event happened

event_type Nested Object Type of the event, see Event Types for details

event_severity Nested Object Severity of the event, see Severity Levels for details

event_source Nested Object Source of the event, see Event Source for details

organisation Nested Object Organisation associated with this event, see Organisation Object for details

alert Boolean Event is a candidate to be alerted to a user

description String Human readable description of the event

8.2. Additional Properties

Event types relating specifically to SIMs, endpoints and users include the following additional properties:

Property Format Description

imsi Nested Object Details of IMSI, see IMSI Object for details (in case of a multi-IMSI configuration, multiple different IMSIs may be reported for the same SIM)

sim Nested Object Details of SIM, see SIM Object for details

endpoint Nested Object Details of Endpoint, see Endpoint Object for details

8.2.1. Additional Details by Event Type

The Data Streamer will send additional data, when available, depending on the event type. This data is added as a nested object called detail and contains information on the actually used Mobile Network Operator and country. See Details Object for more information.

Details Object

Property Format Description

id Numeric Unique identifier of the actual used MobileNetwork Operator

name String Name of the Mobile Network Operator

country Nested Object Country of Mobile Network Operator

country.id Numeric Unique identifier of the country

country.name String Name of country

country.country_code String Country code

country.mcc String Mobile Country Code (MCC)

country.iso_code String ISO code

pdp_context Nested Object PDP Context Details

volume Nested Object Volume consumed in PDP Context

volume.rx Number (up to 6 decimal places) Downstream Volume in MB

volume.tx Number (up to 6 decimal places) Upstream Volume in MB

volume.total Number (up to 6 decimal places) Total Volume in MB
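Because the detail object is only present for some event types, consumers should access it defensively rather than assuming it exists. A minimal sketch (field names taken from the Details Object table above):

```python
def operator_country(event: dict):
    """Return the country name of the actually used operator,
    or None when the event carries no detail object."""
    return event.get("detail", {}).get("country", {}).get("name")


# Event with a detail object, as in the Update Location example later on:
event = {
    "detail": {
        "id": 3,
        "name": "Vodafone",
        "country": {"id": 74, "name": "Germany", "mcc": "262"},
    }
}
print(operator_country(event))  # Germany
print(operator_country({}))     # None
```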


PDP Context

Property Format Description

pdp_context_id String Unique identifier of this PDP context

tunnel_created Timestamp Date/time when this PDP context was created

gtp_version String GTP Version (either 1 or 2)

ggsn_control_plane_ip_address String IP Address of GGSN/PGW Control Plane

ggsn_data_plane_ip_address String IP Address of GGSN/PGW Data Plane

sgsn_control_plane_ip_address String IP Address of SGSN/SGW Control Plane

sgsn_data_plane_ip_address String IP Address of SGSN/SGW Data Plane

region String Region where Data Plane is located

breakout_ip String IP Address used for Internet Breakout

apn String Access Point Name (APN)

nsapi Integer Network Service Access Point Identifier (NSAPI)

ue_ip_address String IP address assigned to Endpoint

imeisv String IMEISV

mcc String Mobile Country Code (MCC)

mnc String Mobile Network Code (MNC)

lac Integer Location Area Code (LAC)

sac Integer Service Area code (SAC)

rac Integer Routing Area code (RAC)

ci Integer Cell Identification (CI)

rat_type Integer Radio Access Type (RAT). Integers 1-6 correspond to the following values:

1 3G

2 2G

3 WLAN

4 GAN

5 HSPA+

6 4G
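The rat_type mapping above can be captured as a small lookup table; unknown values should be handled gracefully, since future updates may extend the range. A sketch:

```python
# Mapping taken from the rat_type table above
RAT_TYPES = {1: "3G", 2: "2G", 3: "WLAN", 4: "GAN", 5: "HSPA+", 6: "4G"}


def rat_name(rat_type: int) -> str:
    """Translate the numeric RAT type into its name, with a fallback."""
    return RAT_TYPES.get(rat_type, f"unknown ({rat_type})")


print(rat_name(6))  # 4G
```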

Organisation Object

Property Format Description

id Numeric Unique identifier of this organisation

name String Name of Organisation

User Object

Property Format Description

id Numeric Unique identifier of this user

username String Username e.g. email address

name String Realname of user

IMSI Object

Property Format Description

id Numeric Unique identifier of this IMSI

imsi String International mobile subscriber identity (IMSI)


Property Format Description

import_date Timestamp Date/Time this IMSI was provisioned

SIM Object

Property Format Description

id Numeric Unique identifier of this SIM

iccid String Integrated Circuit Card identifier (ICCID) without checksum digit

msisdn String MSISDN

production_date Timestamp Date/Time this SIM chip was produced

Endpoint Object

Property Format Description

id Numeric Unique identifier of this Endpoint

name String Configured name of this endpoint

ip_address String IP Address assigned to this Endpoint

tags String Tags assigned to this Endpoint

imei String International mobile equipment identity (IMEI)

8.3. Event Types

Id Description

0 Generic

1 Update location

2 Update GPRS location

3 Create PDP Context

4 Update PDP Context

5 Delete PDP Context

6 User authentication failed

7 Application authentication failed

8 SIM activation

9 SIM suspension

10 SIM deletion

11 Endpoint blocked

12 Organisation blocked

13 Support Access

14 Multi-factor Authentication

15 Purge Location

16 Purge GPRS location

17 Self-Signup

18 Threshold reached

19 Quota used up

8.4. Severity Levels

Id Description

0 INFO

1 WARN


2 CRITICAL

8.5. Event Source

Id Description

0 Network

1 Policy Control

2 API
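The three id tables in sections 8.3 to 8.5 can be combined into lookup dictionaries to render a one-line summary of any event. The summary format is an illustrative choice, not part of the API:

```python
# Mappings taken from the Severity Levels and Event Source tables above
EVENT_SEVERITIES = {0: "INFO", 1: "WARN", 2: "CRITICAL"}
EVENT_SOURCES = {0: "Network", 1: "Policy Control", 2: "API"}


def summarize(event: dict) -> str:
    """Render a one-line summary from the nested severity/source objects."""
    severity = EVENT_SEVERITIES.get(event["event_severity"]["id"], "?")
    source = EVENT_SOURCES.get(event["event_source"]["id"], "?")
    return f"[{severity}/{source}] {event['description']}"


event = {
    "event_severity": {"id": 1},
    "event_source": {"id": 2},
    "description": "User authentication failed",
}
print(summarize(event))  # [WARN/API] User authentication failed
```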


Chapter 9. Usage Data

9.1. Usage Data Properties

Property Format Description

id Numeric Unique identifier of this transaction

cost Number Cost calculation of reported traffic volume. May use up to 6 decimal places.

currency.id Numeric Unique identifier of currency of indicated cost

currency.code ISO 4217 Currency Code

start_timestamp UTC Timestamp Start time of traffic measurement

end_timestamp UTC Timestamp End time of traffic measurement

volume.rx Number Downstream traffic (MB) received by the endpoint. May use up to 6 decimal places.

volume.tx Number Upstream traffic (MB) sent by the endpoint. May use up to 6 decimal places.

volume.total Number Total traffic consumed. May use up to 6 decimal places.

imsi 15 digit numeric string Currently used IMSI

endpoint.id Numeric Unique identifier of endpoint

endpoint.name String The user-defined name set for this endpoint

endpoint.ip_address Numeric The IP address assigned to this endpoint

endpoint.tags String User-defined tags (if any) set for this endpoint

endpoint.imei Numeric The IMEI of the endpoint hardware

sim.id Numeric Unique identifier of SIM

sim.iccid 19 digit numeric string ICCID of SIM

sim.msisdn Numeric MSISDN of the associated SIM

sim.production_date Timestamp The production date of the associated SIM

organisation.id Numeric Unique identifier of organisation

organisation.name String Name of organisation

operator.id Numeric Unique identifier of visited operator

operator.mnc Numeric Mobile Network Code of the visited operator

operator.name String Name of that mobile operator

operator.country.id Numeric Unique identifier of visited country

operator.country.mcc Numeric Mobile Country Code of the visited operator

operator.country.name String Name of visited country

tariff.id Numeric Unique identifier of applied tariff

tariff.name String Name of Tariff

tariff.ratezone.id Numeric Unique identifier of applied ratezone

tariff.ratezone.name String Name of Ratezone

traffic_type.id Numeric Unique identifier of traffic type

traffic_type.name String Name of traffic type

9.2. Usage Data JSON Sample


{
  "sim": {
    "production_date": "2019-01-22 13:39:15",
    "msisdn": "423111111111111",
    "id": 1337,
    "iccid": "1111111111111111111"
  },
  "traffic_type": {
    "id": 5,
    "name": "Data"
  },
  "cost": 0.0010142,
  "volume": {
    "total": 0.005071,
    "rx": 0.002418,
    "tx": 0.002653
  },
  "imsi_id": 3506619,
  "currency": {
    "symbol": "€",
    "code": "EUR",
    "id": 1
  },
  "start_timestamp": "2019-11-06 15:50:58",
  "imsi": "111111111111111",
  "operator": {
    "mnc": "01",
    "id": 2,
    "country": {
      "mcc": "262",
      "id": 74,
      "name": "Germany"
    },
    "name": "Telekom"
  },
  "endpoint": {
    "ip_address": "10.10.10.1",
    "imei": "1111111111111111",
    "id": 9635467,
    "tags": null,
    "name": "My Endpoint"
  },
  "organisation": {
    "id": 1337,
    "name": "My Org"
  },
  "id": 85579,
  "tariff": {
    "id": 166,
    "ratezone": {
      "id": 276,
      "name": "Zone I"
    },
    "name": "Global Roaming I - 1kb"
  },
  "end_timestamp": "2019-11-06 16:06:42"
}
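A quick consistency check on the sample above: volume.total is the sum of volume.rx and volume.tx (all in MB, up to 6 decimal places). A sketch using the values from the sample:

```python
import json

# Abbreviated usage record with the volume figures from the sample above
usage = json.loads("""{
  "cost": 0.0010142,
  "volume": {"total": 0.005071, "rx": 0.002418, "tx": 0.002653}
}""")

volume = usage["volume"]
# rx + tx should reproduce total once rounded to 6 decimal places
assert round(volume["rx"] + volume["tx"], 6) == volume["total"]
print("volume totals are consistent")
```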

9.3. Usage Data S3 Example

The following example shows a .csv file delivered to S3, with a header and truncated to a single line of usage data.

"id","event_start_timestamp","event_stop_timestamp","organisation_id","organisation_name","endpoint_id","sim_id","iccid","imsi","operator_id","operator_name","country_id","operator_country_name","traffic_type_id","traffic_type_description","volume","volume_tx","volume_rx","cost","currency_id","currency_code","currency_symbol","ratezone_tariff_id","ratezone_tariff_name","ratezone_id","ratezone_name","endpoint_name","endpoint_ip_address","endpoint_tags","endpoint_imei","msisdn_msisdn","sim_production_date","operator_mncs","country_mcc"
"8486773751","2019-10-30 15:28:46","2019-10-30 15:37:36","1337","My Org","9635467","1606552","8988303000000590821","295090088888888","5","Telefonica O2","74","Germany","5","Data","0.082581","0.014796","0.067785","0.0165162000","1","EUR","€","166","Global Roaming I - 1kb","276","Zone I","My Device","10.192.200.28",,"8672900288888888","423663918888888","2019-01-22 13:39:15","03,07,08","262"
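CSV files like the one above are straightforward to consume with Python's standard csv module; DictReader keys each value by the header name, so columns appended in future files do not shift existing fields. The shortened header below is an illustrative subset of the real one:

```python
import csv
import io

# Abbreviated header and data row, quoted as delivered to S3
raw = (
    '"id","volume","volume_tx","volume_rx","cost","currency_code"\n'
    '"8486773751","0.082581","0.014796","0.067785","0.0165162000","EUR"\n'
)

row = next(csv.DictReader(io.StringIO(raw)))
total_mb = float(row["volume"])  # all volumes are reported in MB
print(row["id"], total_mb, row["currency_code"])
```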


Chapter 10. Example Events

The following section describes the properties contained in events.

10.1. User Authentication Failed

{
  "id": 201388127,
  "alert": false,
  "description": "Failed authentication request from '[email protected]', Reason: Invalid password from IP 9.9.9.9",
  "timestamp": "2017-10-26T07:42:00.000+0000",
  "event_type": {
    "id": 6,
    "description": "User authentication failed"
  },
  "event_source": {
    "id": 2,
    "description": "API"
  },
  "event_severity": {
    "id": 1,
    "description": "Warn"
  },
  "organisation": {
    "id": 839921,
    "name": "Demo Company"
  },
  "user": {
    "id": 84993,
    "username": "[email protected]",
    "name": "Scott Tiger"
  }
}

10.2. Update Location


{
  "id": 201370709,
  "alert": false,
  "description": "New location received from VLR for IMSI='90143012345678912345', now attached to VLR='491720013095'.",
  "timestamp": "2017-10-26T07:28:00.000+0000",
  "event_type": {
    "id": 1,
    "description": "Update location"
  },
  "event_source": {
    "id": 0,
    "description": "Network"
  },
  "event_severity": {
    "id": 0,
    "description": "Info"
  },
  "organisation": {
    "id": 839921,
    "name": "Demo Company"
  },
  "endpoint": {
    "id": 8638726,
    "name": "GPS Tracker",
    "ip_address": "100.96.234.249",
    "tags": null,
    "imei": "3577620833012201"
  },
  "imsi": {
    "id": 205672,
    "imsi": "90143012345678912345",
    "import_date": "2016-12-27T10:09:23.000+0000"
  },
  "sim": {
    "id": 274887,
    "iccid": "8988303001234567890",
    "production_date": "2016-12-27T10:09:23.000+0000"
  },
  "detail": {
    "id": 3,
    "name": "Vodafone",
    "country": {
      "id": 74,
      "name": "Germany",
      "country_code": "49",
      "mcc": "262",
      "iso_code": "de"
    },
    "tapcode": [
      {
        "id": 2,
        "tapcode": "DEUD2"
      }
    ],
    "mnc": [
      {
        "id": 3,
        "mnc": "02"
      }
    ]
  }
}

10.3. S3 CSV Example

The following example shows a .csv file delivered to S3, with a header and truncated to a single line of event data.

"id","timestamp","event_type_id","event_type_description","event_severity_id","event_severity_description","organisation_id","organisation_name","description","alert","event_source_id","event_source_name","endpoint_id","endpoint_name","endpoint_imei","endpoint_ip_address","endpoint_tags","sim_id","sim_iccid","msisdn_msisdn","sim_production_date","imsi_id","imsi_imsi","user_id","user_username","user_name"
"6357085288","2019-10-28 15:43:16","9","SIM suspension","0","INFO","1572","My org","Status of SIM changed from 'Activated' to 'Suspended'","0","2","API","9635467","JE-OnePlus2","1111111111111111","10.192.200.28",,"1606552","8988303000088888888","423663918888888","2019-01-22 13:39:15","3506615","295050900029773",,,


Chapter 11. Support

You can contact our support team via the following channels:

1. Chat: Please log in to the EUI and click the chat button in the bottom-right corner. A support representative will be available and can also transfer your chat to a technical expert as needed.

2. Email: You can reach us at [email protected] anytime; your request will be logged in our ticket system and processed by one of our technical experts. To facilitate a swift response, include the following information in your email:
◦ Your company name
◦ The ICCIDs of the SIMs in question
◦ The type of device you are using
◦ A short description of the issue
◦ A phone number where we can reach you

3. Phone: You can call us at +49-30-5557333555; make sure you have the ICCIDs of your SIMs available. We handle support requests according to their severity:
◦ Critical incidents will be handled 24/7, any day of the year. The best way to report critical problems is to submit them by email; you will receive a receipt with a Ticket ID assigned.
◦ Operational issues: Requests for information and requests for new configurations will be handled on business days in Germany, 09:00-17:00 CET.
