Use Azure Data Lake Analytics to query AVRO data from IoT Hub

Recently a customer asked me how to read the blob data produced by the routing capability of Azure IoT Hub. To provide a complete answer, I put together a step-by-step guide that I am happy to share with you in the video below.

One of the common patterns of Internet of Things applications, often called the “cold path”, consists of storing all the data produced by IoT devices in the cloud for later processing. To make such an implementation trivial, Azure IoT Hub supports routing messages coming from devices directly to cloud storage services. IoT Hub can also apply simple rules, based on both message properties and the message body, to route messages to the custom endpoints of your choice. IoT Hub writes the blob content in AVRO format, which contains both the message body and the message properties. While AVRO is great for data/message preservation, it can be challenging to query and process. Here is a suggested solution for processing this data.
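For reference, each blob IoT Hub writes is a sequence of small Avro records: an envelope with the enqueued time, the application properties, the system properties, and the raw body bytes. The sketch below, based on the schema used in the public documentation sample, declares that shape as a U-SQL variable so it can be passed to an extractor later. Treat the field names, the record namespace, and the sample input path as assumptions to verify against your own blobs; the wildcards simply stand in for the partition and date/time folders IoT Hub creates by default.

// Shape of the Avro records IoT Hub writes when routing to blob storage.
// Field names follow the public documentation sample; verify against your own blobs.
DECLARE @avroSchema string = @"
{
    ""type"": ""record"",
    ""name"": ""Message"",
    ""namespace"": ""Microsoft.Azure.Devices"",
    ""fields"": [
        { ""name"": ""EnqueuedTimeUtc"",  ""type"": ""string"" },
        { ""name"": ""Properties"",       ""type"": { ""type"": ""map"", ""values"": ""string"" } },
        { ""name"": ""SystemProperties"", ""type"": { ""type"": ""map"", ""values"": ""string"" } },
        { ""name"": ""Body"",             ""type"": [""null"", ""bytes""] }
    ]
}";

// Hypothetical input location: adjust the container name and wildcards to match
// IoT Hub's default {iothub}/{partition}/{YYYY}/{MM}/{DD}/{HH}/{mm} blob naming.
DECLARE @inputPath string = "/myiothub/{*}/{*}/{*}/{*}/{*}/{*}";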

Many big data patterns can be used to process non-relational data files in custom file formats. Focusing on cost and deployment simplicity, Azure Data Lake Analytics (ADLA) is one of the few “pay per query” big data services. With ADLA, we don’t have to set up virtual machines, databases, networks, or storage accounts. Using U-SQL, the query language for ADLA, together with an AVRO “extractor”, we can parse, transform, and query our IoT Hub data.
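To make that concrete, here is a minimal U-SQL sketch of the extract step, continuing from the declarations above. It assumes the Avro and JSON format assemblies from the Azure U-SQL samples repository (Avro, Microsoft.Analytics.Samples.Formats, Newtonsoft.Json) have already been registered in the ADLA catalog with CREATE ASSEMBLY; the namespace and class names come from that sample code, so check them against the version you register.

// Sample-format assemblies (from the Azure U-SQL samples repo), registered beforehand.
REFERENCE ASSEMBLY [Newtonsoft.Json];
REFERENCE ASSEMBLY [Microsoft.Analytics.Samples.Formats];
REFERENCE ASSEMBLY [Avro];

USING Microsoft.Analytics.Samples.Formats.ApacheAvro;

// Parse every Avro blob under @inputPath into a rowset, keeping the enqueued
// time and the raw message body; @avroSchema describes the file layout.
@messages =
    EXTRACT EnqueuedTimeUtc string,
            Body byte[]
    FROM @inputPath
    USING new AvroExtractor(@avroSchema);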

The following video walks through the process of transforming, querying, and exporting data from Azure IoT Hub to a standard file format. This process could also be adapted to place the data in other repositories or relational stores.
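As a rough illustration of that last step, the snippet below continues the script: it decodes the Avro Body bytes back into the JSON string the device sent, pulls a couple of values out of it with the JsonTuple helper from the same samples library, and writes the result to CSV. The temperature and humidity fields are placeholders for whatever your devices actually report.

USING Microsoft.Analytics.Samples.Formats.Json;

// Decode the raw body bytes back into the JSON string the device originally sent.
@json =
    SELECT EnqueuedTimeUtc,
           Encoding.UTF8.GetString(Body) AS JsonBody
    FROM @messages;

// Pull individual telemetry values out of the JSON payload.
// "temperature" and "humidity" are placeholders for your own message fields.
@fields =
    SELECT EnqueuedTimeUtc,
           JsonFunctions.JsonTuple(JsonBody) AS MessageMap
    FROM @json;

@telemetry =
    SELECT EnqueuedTimeUtc,
           MessageMap["temperature"] AS Temperature,
           MessageMap["humidity"] AS Humidity
    FROM @fields;

// Export to a standard, easily consumable file format.
OUTPUT @telemetry
TO "/output/iothub-telemetry.csv"
USING Outputters.Csv(outputHeader : true);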

You can read the detailed step-by-step process in our documentation.

