Parsing a JSON array input in IoT Analytics

I receive multiple data records at once from my IoT devices as a JSON array on my channel. The received message looks like this:

[
    {
      "Field1": "Value1",
      "Field2": "Value2",
      "Field3": "Value3"
    },
    {
      "Field1": "AnotherValue1",
      "Field2": "AnotherValue2",
      "Field3": "AnotherValue3"
    }
]

I create my dataset with the following SQL query:

SELECT * FROM mydatastore

When I run the dataset, the result returned is:

array                                              __dt 
-----                                              -----
[{field1=Value1, field2=Value2, field3=Value3}]    2019-02-21 00:00:00.000

The result I want is:

Field1           Field2           Field3
------           ------           ------
Value1           Value2           Value3
AnotherValue1    AnotherValue2    AnotherValue3

How can I get IoT Analytics to create a new row in the data store for each element of the received JSON array?

amazon-web-services aws-iot aws-iot-analytics
1 Answer

How can I get IoT Analytics to create a new row in the data store for each element of the received JSON array?

The simplest way should be to use a Lambda Activity on the Pipeline and parse the individual JSON payloads into the desired structure. Exactly how depends somewhat on the "raw" structure of the messages being sent to the Channel.

So, for example, we could send the data to the Channel via the CLI with batch-put-message, like this:

aws iotanalytics batch-put-message --channel-name sample_channel --messages '[{"messageId": "message1", "payload": "{\"array\": [{\"Field1\": \"Value1\", \"Field2\": \"Value2\", \"Field3\": \"Value3\"},{\"Field1\": \"AnotherValue1\", \"Field2\": \"AnotherValue2\", \"Field3\": \"AnotherValue3\"}]}"}]'

The Channel will then contain a message with the following structure:

{
  "messageId": "message1",
  "payload": {
    "array": [
      {
        "Field1": "Value1",
        "Field2": "Value2",
        "Field3": "Value3"
      },
      {
        "Field1": "AnotherValue1",
        "Field2": "AnotherValue2",
        "Field3": "AnotherValue3"
      }
    ]
  }
}

If your Pipeline has a Lambda activity, the messages from the Channel are passed to the Lambda function in the event parameter.
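
For completeness, a Pipeline wiring the Channel, a Lambda activity and the data store together could be created roughly as follows. This is only a sketch: the pipeline and activity names are placeholders, sample_lambda is the function created below, mydatastore is the data store from the question, and batchSize controls how many messages are handed to the function per invocation:

aws iotanalytics create-pipeline --cli-input-json '{
    "pipelineName": "sample_pipeline",
    "pipelineActivities": [
        {"channel":   {"name": "channel_activity",   "channelName": "sample_channel", "next": "lambda_activity"}},
        {"lambda":    {"name": "lambda_activity",    "lambdaName": "sample_lambda",   "batchSize": 1, "next": "datastore_activity"}},
        {"datastore": {"name": "datastore_activity", "datastoreName": "mydatastore"}}
    ]
}'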

I created a simple Lambda function (using Python 3.7) with the AWS Lambda console inline editor and named it sample_lambda:

import json
import sys
import logging

# Configure logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)
streamHandler = logging.StreamHandler(stream=sys.stdout)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)


def lambda_handler(event, context):
    # This can be handy to see the raw structure of the incoming event
    # will log to the matching CloudWatch log:
    # /aws/lambda/<name_of_the_lambda>
    # logger.info("raw event: {}".format(event))

    parsed_rows = []

    # Depending on the batchSize setting of the Lambda Pipeline Activity,
    # you may receive multiple messages in a single event
    for message_payload in event:
        if 'array' in message_payload:
            for row in message_payload['array']:
                parsed = {}
                for key, value in row.items():
                    parsed[key] = value
                parsed_rows.append(parsed)

    return parsed_rows
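
A quick way to sanity-check the handler locally is to call it with an event shaped like a batch of the messages shown above. This test snippet is illustrative only and is not part of the deployed function:

sample_event = [
    {
        "array": [
            {"Field1": "Value1", "Field2": "Value2", "Field3": "Value3"},
            {"Field1": "AnotherValue1", "Field2": "AnotherValue2", "Field3": "AnotherValue3"}
        ]
    }
]

# Should print one flat dict per element of the incoming array
print(lambda_handler(sample_event, None))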

I added the appropriate permission so that IoT Analytics can invoke the Lambda function, via the CLI:

aws lambda add-permission --function-name sample_lambda --statement-id statm01 --principal iotanalytics.amazonaws.com --action lambda:InvokeFunction
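
With that permission in place, messages already stored in the Channel can be pushed through the Pipeline again. Assuming the Pipeline is named sample_pipeline, one way to trigger this is:

aws iotanalytics start-pipeline-reprocessing --pipeline-name sample_pipeline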

After reprocessing the Pipeline, the parsed rows land in the DataStore; running the DataSet, I get this result:

"array","field1","field2","field3","__dt"
,"Value1","Value2","Value3","2019-04-26 00:00:00.000"
,"AnotherValue1","AnotherValue2","AnotherValue3","2019-04-26 00:00:00.000"