AWS.IoTAnalytics (aws-elixir v0.8.0)

AWS IoT Analytics allows you to collect large amounts of device data, process messages, and store them.

You can then query the data and run sophisticated analytics on it. AWS IoT Analytics enables advanced data exploration through integration with Jupyter Notebooks and data visualization through integration with Amazon QuickSight.

Traditional analytics and business intelligence tools are designed to process structured data. IoT data often comes from devices that record noisy processes (such as temperature, motion, or sound). As a result, the data from these devices can have significant gaps, corrupted messages, and false readings that must be cleaned up before analysis can occur. Also, IoT data is often meaningful only in the context of other data from external sources.

AWS IoT Analytics automates the steps required to analyze data from IoT devices. AWS IoT Analytics filters, transforms, and enriches IoT data before storing it in a time-series data store for analysis. You can set up the service to collect only the data you need from your devices, apply mathematical transforms to process the data, and enrich the data with device-specific metadata such as device type and location before storing it. Then, you can analyze your data by running queries using the built-in SQL query engine, or by performing more complex analytics and machine learning inference. AWS IoT Analytics includes pre-built models for common IoT use cases, so you can answer questions such as which devices are about to fail or which customers are at risk of abandoning their wearable devices.
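Every function in this module takes an AWS client struct as its first argument. A minimal setup sketch, assuming static credentials (the key values and region below are placeholders, and a live AWS account is required to actually run this):

```elixir
# Build a client from static credentials (placeholder values shown).
# aws-elixir also supports other credential mechanisms; see the AWS.Client docs.
client = AWS.Client.create("AKIA...", "SECRET...", "us-east-1")

# All calls in this module follow the same shape and return
# {:ok, result, http_response} on success or {:error, reason} on failure.
{:ok, _pipelines, _http} = AWS.IoTAnalytics.list_pipelines(client)
```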

Summary

Functions

Creates the content of a data set by applying a queryAction (a SQL query) or a containerAction (executing a containerized application).

Creates a data store, which is a repository for messages.

Deletes the content of the specified dataset.

Retrieves information about a dataset.

Retrieves the current settings of the AWS IoT Analytics logging options.

Retrieves information about a pipeline.

Retrieves the contents of a data set as presigned URIs.

Lists the tags (metadata) that you have assigned to the resource.

Sets or updates the AWS IoT Analytics logging options.

Simulates the results of running a pipeline activity on a message payload.

Retrieves a sample of messages from the specified channel ingested during the specified timeframe.

Starts the reprocessing of raw message data through the pipeline.

Adds to or modifies the tags of the given resource.

Removes the given tags (metadata) from the resource.

Functions

batch_put_message(client, input, options \\ [])

Sends messages to a channel.
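A sketch of sending a batch, assuming an existing channel named "my_channel" (a placeholder) and the Jason library for JSON encoding; per the BatchPutMessage API, each message payload is Base64-encoded:

```elixir
# Encode one device reading as Base64-encoded JSON.
payload =
  %{"temperature" => 21.5, "device_id" => "sensor-1"}
  |> Jason.encode!()
  |> Base.encode64()

input = %{
  "channelName" => "my_channel",
  "messages" => [
    %{"messageId" => "msg-0001", "payload" => payload}
  ]
}

# An empty error-entries list means every message was accepted.
case AWS.IoTAnalytics.batch_put_message(client, input) do
  {:ok, %{"batchPutMessageErrorEntries" => []}, _http} -> :ok
  {:ok, result, _http} -> {:partial_failure, result}
  {:error, reason} -> {:error, reason}
end
```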

cancel_pipeline_reprocessing(client, pipeline_name, reprocessing_id, input, options \\ [])

Cancels the reprocessing of data through the pipeline.

create_channel(client, input, options \\ [])

Creates a channel.

A channel collects data from an MQTT topic and archives the raw, unprocessed messages before publishing the data to a pipeline.
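A sketch of creating a channel with a finite retention period (the channel name is a placeholder):

```elixir
input = %{
  "channelName" => "my_channel",
  # Keep raw messages for 30 days; use %{"unlimited" => true} to keep them indefinitely.
  "retentionPeriod" => %{"numberOfDays" => 30}
}

{:ok, %{"channelArn" => _arn}, _http} =
  AWS.IoTAnalytics.create_channel(client, input)
```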

create_dataset(client, input, options \\ [])

Creates a dataset.

A dataset stores data retrieved from a data store by applying a queryAction (a SQL query) or a containerAction (executing a containerized application). This operation creates the skeleton of a dataset. The dataset can be populated manually by calling CreateDatasetContent or automatically according to a trigger you specify.
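A sketch of creating a SQL-backed dataset with a schedule trigger (the dataset, datastore, and cron expression are placeholder examples):

```elixir
input = %{
  "datasetName" => "daily_temps",
  "actions" => [
    %{
      "actionName" => "sql_action",
      "queryAction" => %{"sqlQuery" => "SELECT * FROM my_datastore"}
    }
  ],
  # Optional: refresh the content on a schedule instead of calling
  # create_dataset_content/4 manually.
  "triggers" => [
    %{"schedule" => %{"expression" => "cron(0 12 * * ? *)"}}
  ]
}

{:ok, _result, _http} = AWS.IoTAnalytics.create_dataset(client, input)
```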

create_dataset_content(client, dataset_name, input, options \\ [])

Creates the content of a data set by applying a queryAction (a SQL query) or a containerAction (executing a containerized application).

create_datastore(client, input, options \\ [])

Creates a data store, which is a repository for messages.

Only data stores that are used to save pipeline data can be configured with ParquetConfiguration.

create_pipeline(client, input, options \\ [])

Creates a pipeline.

A pipeline consumes messages from a channel and allows you to process the messages before storing them in a data store. You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array.
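A sketch of the required channel and datastore activities plus one optional math activity in between (all names and the formula are placeholder examples; each activity's "next" field points at the following activity's "name"):

```elixir
input = %{
  "pipelineName" => "my_pipeline",
  "pipelineActivities" => [
    # Required: a channel activity as the entry point...
    %{
      "channel" => %{
        "name" => "from_channel",
        "channelName" => "my_channel",
        "next" => "to_fahrenheit"
      }
    },
    # ...optionally up to 23 intermediate activities...
    %{
      "math" => %{
        "name" => "to_fahrenheit",
        "attribute" => "temp_f",
        "math" => "temperature * 1.8 + 32",
        "next" => "to_datastore"
      }
    },
    # ...and a datastore activity as the terminal step.
    %{
      "datastore" => %{
        "name" => "to_datastore",
        "datastoreName" => "my_datastore"
      }
    }
  ]
}

{:ok, _result, _http} = AWS.IoTAnalytics.create_pipeline(client, input)
```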

delete_channel(client, channel_name, input, options \\ [])

Deletes the specified channel.

delete_dataset(client, dataset_name, input, options \\ [])

Deletes the specified dataset.

You do not have to delete the content of the dataset before you perform this operation.

delete_dataset_content(client, dataset_name, input, options \\ [])

Deletes the content of the specified dataset.

delete_datastore(client, datastore_name, input, options \\ [])

Deletes the specified data store.

delete_pipeline(client, pipeline_name, input, options \\ [])

Deletes the specified pipeline.

describe_channel(client, channel_name, include_statistics \\ nil, options \\ [])

Retrieves information about a channel.

describe_dataset(client, dataset_name, options \\ [])

Retrieves information about a dataset.

describe_datastore(client, datastore_name, include_statistics \\ nil, options \\ [])

Retrieves information about a data store.

describe_logging_options(client, options \\ [])

Retrieves the current settings of the AWS IoT Analytics logging options.

describe_pipeline(client, pipeline_name, options \\ [])

Retrieves information about a pipeline.

get_dataset_content(client, dataset_name, version_id \\ nil, options \\ [])

Retrieves the contents of a data set as presigned URIs.

list_channels(client, max_results \\ nil, next_token \\ nil, options \\ [])

Retrieves a list of channels.

list_dataset_contents(client, dataset_name, max_results \\ nil, next_token \\ nil, scheduled_before \\ nil, scheduled_on_or_after \\ nil, options \\ [])

Lists information about data set contents that have been created.

list_datasets(client, max_results \\ nil, next_token \\ nil, options \\ [])

Retrieves information about data sets.

list_datastores(client, max_results \\ nil, next_token \\ nil, options \\ [])

Retrieves a list of data stores.

list_pipelines(client, max_results \\ nil, next_token \\ nil, options \\ [])

Retrieves a list of pipelines.

list_tags_for_resource(client, resource_arn, options \\ [])

Lists the tags (metadata) that you have assigned to the resource.

put_logging_options(client, input, options \\ [])

Sets or updates the AWS IoT Analytics logging options.

If you update the value of any loggingOptions field, it takes up to one minute for the change to take effect. Also, if you change the policy attached to the role you specified in the roleArn field (for example, to correct an invalid policy), it takes up to five minutes for that change to take effect.

run_pipeline_activity(client, input, options \\ [])

Simulates the results of running a pipeline activity on a message payload.
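A sketch of testing a single math activity against a sample payload before putting it in a pipeline (the activity and payload are placeholder examples, and Jason is assumed for JSON encoding):

```elixir
# Payloads, like in BatchPutMessage, are Base64-encoded message bodies.
payload =
  %{"temperature" => 20.0}
  |> Jason.encode!()
  |> Base.encode64()

input = %{
  "pipelineActivity" => %{
    "math" => %{
      "name" => "to_fahrenheit",
      "attribute" => "temp_f",
      "math" => "temperature * 1.8 + 32"
    }
  },
  "payloads" => [payload]
}

# On success, result["payloads"] holds the transformed,
# Base64-encoded messages.
{:ok, _result, _http} = AWS.IoTAnalytics.run_pipeline_activity(client, input)
```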

sample_channel_data(client, channel_name, end_time \\ nil, max_messages \\ nil, start_time \\ nil, options \\ [])

Retrieves a sample of messages from the specified channel ingested during the specified timeframe.

Up to 10 messages can be retrieved.

start_pipeline_reprocessing(client, pipeline_name, input, options \\ [])

Starts the reprocessing of raw message data through the pipeline.

tag_resource(client, input, options \\ [])

Adds to or modifies the tags of the given resource.

Tags are metadata that can be used to manage a resource.

untag_resource(client, input, options \\ [])

Removes the given tags (metadata) from the resource.

update_channel(client, channel_name, input, options \\ [])

Updates the settings of a channel.

update_dataset(client, dataset_name, input, options \\ [])

Updates the settings of a data set.

update_datastore(client, datastore_name, input, options \\ [])

Updates the settings of a data store.

update_pipeline(client, pipeline_name, input, options \\ [])

Updates the settings of a pipeline.

You must specify both a channel and a datastore activity and, optionally, as many as 23 additional activities in the pipelineActivities array.