By the end of this chapter, your serverless application should receive device shadow messages from IoT Core, transform each message into the format used by your ML endpoint, and invoke that endpoint to return the roomOccupancy classification and the confidence score of the inference. The following steps walk you through creating this serverless function in AWS Lambda.
Create a new AWS Lambda function named classifyRoomOccupancy with a Python runtime, and use the following code for its handler:
import json
import boto3
import os

# Receives a device shadow Accepted document from the IoT Core rules engine.
# The event has a signature like {"state": {"reported": {"sound": 5}}}.
# See expectedAttributes for the full list of attributes expected in state.reported.
# Builds a CSV input to send to the SageMaker endpoint, whose name is stored in the
# environment variable SAGEMAKER_ENDPOINT.
#
# Returns the prediction and confidence score from the ML model endpoint.
def lambda_handler(event, context):
    client = boto3.client('sagemaker-runtime')
    print('event received: {}'.format(event))
    # The order of attributes must match the order expected by the ML model endpoint,
    # i.e. the same order of columns used to train the model.
    expectedAttributes = ['sound', 'temperature', 'hvacStatus', 'roomOccupancy', 'timestamp']
    reported = event['state']['reported']
    reported['timestamp'] = event['timestamp']
    reportedAttributes = reported.keys()
    # Validate that the input event has all the expected attributes.
    if len(set(expectedAttributes) & set(reportedAttributes)) < len(expectedAttributes):
        return {
            'statusCode': 400,
            'body': 'Error: missing attributes from event. Expected: {}. Received: {}.'.format(
                ','.join(expectedAttributes), ','.join(reportedAttributes))
        }
    # Build the input CSV string to send to the ML model endpoint.
    reportedValues = []
    for attr in expectedAttributes:
        reportedValues.append(str(reported[attr]))
    input = ','.join(reportedValues)
    print('sending this input for inference: {}'.format(input))
    endpoint_name = os.environ['SAGEMAKER_ENDPOINT']
    content_type = "text/csv"
    accept = "application/json"
    payload = input
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType=content_type,
        Accept=accept,
        Body=payload
    )
    body = response['Body'].read()
    print('received this response from inference endpoint: {}'.format(body))
    return {
        'statusCode': 200,
        'body': json.loads(body)['predictions'][0]
    }
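To make the transformation concrete, here is a quick illustration (not part of the deployed function) of the CSV string the handler builds for the sample shadow document shown later in this chapter; note that the JSON value true arrives in Python as True and is serialized with Python's default str() formatting:

reported = {'sound': 20, 'temperature': 58.8, 'hvacStatus': 'HEATING',
            'roomOccupancy': True, 'timestamp': 1234567890}
expectedAttributes = ['sound', 'temperature', 'hvacStatus', 'roomOccupancy', 'timestamp']
# Join the values in the same order as expectedAttributes.
print(','.join(str(reported[attr]) for attr in expectedAttributes))
# Prints: 20,58.8,HEATING,True,1234567890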
In the function's configuration, add an environment variable with the Key SAGEMAKER_ENDPOINT, and for Value enter the name of your SageMaker endpoint. You named this resource in the last step of the previous chapter, and this module assumes the name is roomOccupancyEndpoint.
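If you prefer to set this environment variable from the AWS SDK rather than the console, a minimal boto3 sketch follows; it assumes the function name classifyRoomOccupancy and the endpoint name roomOccupancyEndpoint used throughout this module:

import boto3

lambda_client = boto3.client('lambda')

# Note: this call replaces the function's entire set of environment variables
# with the map passed in Environment.
lambda_client.update_function_configuration(
    FunctionName='classifyRoomOccupancy',
    Environment={'Variables': {'SAGEMAKER_ENDPOINT': 'roomOccupancyEndpoint'}}
)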
Your function also needs two permissions before the integration will work: the AWS IoT rules engine must be allowed to invoke this Lambda function on behalf of your rule thermostatRule, and the function's execution role must be allowed to call your SageMaker endpoint. For the execution role, create a policy that allows the sagemaker:InvokeEndpoint action, name it invokeSageMakerEndpoint, and choose Create policy. You can now close this new browser tab.
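If you would rather script this permission than click through the console, the following boto3 sketch attaches an equivalent inline policy; the execution role name here is an assumption, so substitute the role actually attached to your Lambda function:

import json
import boto3

iam = boto3.client('iam')

# Hypothetical role name -- replace with your Lambda function's execution role.
role_name = 'classifyRoomOccupancy-role'

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "sagemaker:InvokeEndpoint",
        "Resource": "*"  # For brevity; you can restrict this to your endpoint's ARN.
    }]
}

# Attach the permission as an inline policy named invokeSageMakerEndpoint.
iam.put_role_policy(
    RoleName=role_name,
    PolicyName='invokeSageMakerEndpoint',
    PolicyDocument=json.dumps(policy_document)
)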
These steps conclude configuration of your AWS Lambda function. When the Lambda function receives a device shadow update such as the following:
{
"state": {
"reported": {
"sound": 20,
"temperature": 58.8,
"hvacStatus": "HEATING",
"roomOccupancy": true
}
},
"timestamp": 1234567890
}
It will return this response after invoking the SageMaker endpoint:
{
"statusCode": 200,
"body": {
"predicted_label": "false",
"probability": "0.9999991655349731"
}
}
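Before wiring the rule to the function, you can optionally confirm the Lambda-to-SageMaker integration on its own by invoking the function directly with the sample document above; this sketch assumes the function name classifyRoomOccupancy:

import json
import boto3

lambda_client = boto3.client('lambda')

# The sample device shadow update document shown above.
event = {
    "state": {
        "reported": {
            "sound": 20,
            "temperature": 58.8,
            "hvacStatus": "HEATING",
            "roomOccupancy": True
        }
    },
    "timestamp": 1234567890
}

response = lambda_client.invoke(
    FunctionName='classifyRoomOccupancy',
    Payload=json.dumps(event).encode('utf-8')
)
# Prints the statusCode and body returned by the handler.
print(json.loads(response['Payload'].read()))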
The next step is to update your IoT Core rule (assumed name of thermostatRule) to use this Lambda function integration. Replace the rule's existing query statement, which currently looks like this:

SELECT CASE state.reported.sound > 10 WHEN true THEN true ELSE false END AS state.desired.roomOccupancy FROM '$aws/things/<<CLIENT_ID>>/shadow/update/accepted' WHERE state.reported.sound <> Null

with this new query statement, which passes the reported state through your Lambda function:

SELECT cast(get(get(aws_lambda("arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME", *), "body"), "predicted_label") AS Boolean) AS state.desired.roomOccupancy FROM '$aws/things/<<CLIENT_ID>>/shadow/update/accepted' WHERE state.reported.sound <> Null
In the new query statement, change REGION to the region code where your Lambda function is deployed (the code, such as us-west-2, and not the region name, such as Oregon); change ACCOUNT_ID to your 12-digit account id, without hyphens, which is also shown in the console header menu where your username is printed; and change FUNCTION_NAME to the name of the AWS Lambda function you created (assumed name is classifyRoomOccupancy). Don't forget to update the <<CLIENT_ID>> placeholder in the FROM topic as well. At this point, your IoT workflow is now consuming your trained machine learning model from its deployed endpoint to classify messages published by your smart thermostat as new roomOccupancy values!
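One caveat: the aws_lambda() function in a rule query only works when the AWS IoT rules engine has permission to invoke your Lambda function. If your setup has not already granted this, the sketch below adds the permission with boto3; REGION and ACCOUNT_ID are the same placeholders as in the query above, and the statement ID is an arbitrary label:

import boto3

lambda_client = boto3.client('lambda')

# Allow the AWS IoT rules engine to invoke the function on behalf of thermostatRule.
lambda_client.add_permission(
    FunctionName='classifyRoomOccupancy',
    StatementId='iot-rules-engine-invoke',
    Action='lambda:InvokeFunction',
    Principal='iot.amazonaws.com',
    SourceArn='arn:aws:iot:REGION:ACCOUNT_ID:rule/thermostatRule'
)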
Before moving on to the next chapter, you can validate that your serverless application is configured as intended:
Subscribe to the topic $aws/things/<<CLIENT_ID>>/shadow/update (replacing <<CLIENT_ID>> with your device's client ID) and you should see two kinds of messages there. The first is the payload published by your smart thermostat under the state.reported path. The other is the payload now being published by your thermostat rule, with the state.desired.roomOccupancy value determined by your ML model. If these are working as expected, you have completed this module and can move on to Conclusion.
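If your smart thermostat is not publishing at the moment, you can exercise the pipeline yourself by reporting a shadow state from the AWS SDK; this sketch assumes your thing name matches the <<CLIENT_ID>> used in the topics above:

import json
import boto3

iot_data = boto3.client('iot-data')

# Simulate the thermostat reporting a noisy, warm room.
shadow_update = {
    "state": {
        "reported": {
            "sound": 20,
            "temperature": 58.8,
            "hvacStatus": "HEATING",
            "roomOccupancy": True
        }
    }
}

# Replace the thing name with your device's <<CLIENT_ID>>.
iot_data.update_thing_shadow(
    thingName='<<CLIENT_ID>>',
    payload=json.dumps(shadow_update).encode('utf-8')
)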