Build your own Anomaly Detection ML Pipeline

This end-to-end ML pipeline detects anomalies by ingesting real-time, streaming data from various network edge field devices, performing transformation jobs to continuously run daily predictions/inferences, and retraining the ML models daily on the newly arriving time series data. Note that Random Cut Forest (RCF) is one of the algorithms for detecting anomalous data points within a data set and is designed to work with arbitrary-dimensional input.

(Architecture diagram: a Real-Time Device Telemetry Data Ingestion Pipeline, a Data Engineering Pipeline, and a Machine Learning DevOps Pipeline running in the AWS Cloud.)

1. Device telemetry data is ingested from the field devices on a near real-time basis by calls to the API via Amazon API Gateway. The requests are authenticated/authorized using Amazon Cognito.

2. Amazon Kinesis Data Firehose ingests the data in real time and invokes AWS Lambda to transform the data into Parquet format. Kinesis Data Firehose automatically scales to match the throughput of the data being ingested.

3. The telemetry data is aggregated on an hourly basis and re-partitioned by year, month, day, and hour using AWS Glue jobs. Additional steps such as transformations and feature engineering are performed for training the anomaly detection ML model, also using AWS Glue jobs. The training data set is stored in the Amazon S3 data lake.
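The Firehose transformation Lambda in step 2 follows a fixed record contract: base64-decode each record, transform it, and return it with its `recordId` and a `result` status. A minimal sketch, assuming a simple JSON telemetry payload with hypothetical `device_id`/`ts`/`metric` fields; the columnar Parquet conversion itself is usually delegated to Firehose record-format conversion or a separate pyarrow step rather than done inline:

```python
import base64
import json

def lambda_handler(event, context):
    """Firehose data-transformation handler: decode, normalize,
    and re-encode each delivery record. Field names below are
    illustrative, not part of the reference architecture."""
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        # Hypothetical normalization: enforce the fields the
        # downstream Glue/Parquet schema is assumed to expect.
        row = {
            "device_id": payload.get("device_id"),
            "ts": payload.get("ts"),
            "metric": float(payload.get("metric", 0.0)),
        }
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",  # "Dropped"/"ProcessingFailed" also allowed
            "data": base64.b64encode(
                (json.dumps(row) + "\n").encode()).decode(),
        })
    return {"records": output}
```

Records marked `ProcessingFailed` are retried or sent to the error prefix by Firehose, so the handler never needs to raise for a single bad record.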

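The hourly re-partitioning in step 3 amounts to writing each record under a Hive-style `year=/month=/day=/hour=` prefix. A small helper sketching that scheme (bucket and prefix are hypothetical; inside a Glue job the same keys would be supplied as `partitionKeys` when writing the dynamic frame):

```python
from datetime import datetime, timezone

def partition_prefix(epoch_seconds,
                     base="s3://my-data-lake/telemetry"):
    """Return the hourly Hive-style partition path that the Glue
    re-partitioning job is assumed to produce. The bucket/prefix
    in `base` is a placeholder, not a real resource."""
    t = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
    return (f"{base}/year={t.year}/month={t.month:02d}/"
            f"day={t.day:02d}/hour={t.hour:02d}/")
```

Partitioning on these four keys keeps hourly aggregation queries and incremental daily retraining reads cheap, since Athena/Glue can prune to the hours actually requested.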
4. The training code is checked into an AWS CodeCommit repo, which triggers a Machine Learning DevOps (MLOps) pipeline using AWS CodePipeline. CodePipeline builds the SageMaker training and inference containers, triggers the SageMaker training job using the specified training dataset, deploys the trained model in the testing environment, and, upon approval, deploys the model into production using SageMaker inference endpoints.

5. The ML models generated by the training jobs are registered in the SageMaker Model Repository. The deploy pipeline selects the best ML model to deploy using SageMaker hosting.
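In step 4, the build stage ultimately submits a `sagemaker:CreateTrainingJob` request. A sketch of how that request could be assembled, assuming the built-in Random Cut Forest algorithm mentioned in the overview; every name, ARN, URI, and instance choice below is illustrative:

```python
def training_job_request(job_name, image_uri, role_arn,
                         train_s3, output_s3):
    """Assemble the CreateTrainingJob request the pipeline's build
    stage would pass to boto3's sagemaker client. Hyperparameters
    shown are the core Random Cut Forest ones."""
    return {
        "TrainingJobName": job_name,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,
            "TrainingInputMode": "File",
        },
        "RoleArn": role_arn,
        "HyperParameters": {
            "feature_dim": "1",
            "num_trees": "50",
            "num_samples_per_tree": "256",
        },
        "InputDataConfig": [{
            "ChannelName": "train",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": train_s3,
                # RCF expects sharded distribution of training data.
                "S3DataDistributionType": "ShardedByS3Key",
            }},
        }],
        "OutputDataConfig": {"S3OutputPath": output_s3},
        "ResourceConfig": {
            "InstanceType": "ml.m5.xlarge",
            "InstanceCount": 1,
            "VolumeSizeInGB": 10,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }
```

The same dict would be passed as `boto3.client("sagemaker").create_training_job(**req)` from the CodeBuild step.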

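Step 5's "selects the best ML model" can be as simple as ranking registry entries by a recorded evaluation metric. A sketch under the assumption that each entry carries a flat `metrics` dict; the real Model Registry stores metrics in `ModelMetrics` report files on S3, so this shape is a deliberate simplification:

```python
def pick_best_model(packages, metric="test:f1", maximize=True):
    """Pick the model-package entry with the best recorded metric,
    mimicking how the deploy stage could choose a model to host.
    The entry shape and metric name are hypothetical."""
    scored = [p for p in packages if metric in p.get("metrics", {})]
    if not scored:
        return None  # nothing eligible to deploy
    return (max if maximize else min)(
        scored, key=lambda p: p["metrics"][metric])
```

Entries without the metric are skipped rather than treated as zero, so a model with missing evaluation output can never be promoted by accident.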
6. Classify whether the telemetry data is an anomaly or not via an HTTP(S) API using Amazon API Gateway and Lambda functions. The Lambda function invokes the SageMaker inference endpoint to predict the anomaly.

7. The inference quality is monitored using SageMaker Model Monitor. Requests with ambiguous prediction scores are sent for human re-labeling via Amazon CloudWatch Events, triggering the SageMaker A2I workflow using AWS Step Functions.

Reviewed for technical accuracy June 1, 2021. AWS Reference Architecture. © 2021, Amazon Web Services, Inc. or its affiliates. All rights reserved.
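The step-6 Lambda behind API Gateway forwards the payload to the SageMaker endpoint and thresholds the returned RCF score. A sketch with the SageMaker runtime client injected as a parameter for testability; the endpoint name and anomaly threshold are hypothetical:

```python
import json

def classify_handler(event, runtime,
                     endpoint_name="anomaly-rcf-endpoint",
                     threshold=3.0):
    """API-Gateway-backed handler body: call the SageMaker inference
    endpoint and flag an anomaly when the RCF score exceeds the
    cutoff. `runtime` is a boto3 sagemaker-runtime client (injected
    here so the logic can be exercised without AWS)."""
    body = json.loads(event["body"])
    resp = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        # RCF's JSON request shape: a list of feature vectors.
        Body=json.dumps({"instances": [{"features": body["features"]}]}),
    )
    score = json.loads(resp["Body"].read())["scores"][0]["score"]
    return {
        "statusCode": 200,
        "body": json.dumps({"score": score,
                            "anomaly": score > threshold}),
    }
```

Scores near the threshold are exactly the "ambiguous" predictions step 7 routes to the A2I human-review workflow.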