Accelerated AI with Azure Machine Learning service on Azure Data Box Edge

FPGA acceleration at the edge

Along with the general availability of Azure Data Box Edge that was announced today, we are announcing the preview of Azure Machine Learning hardware accelerated models on Data Box Edge. In real-world applications, much of the world's data is generated and consumed at the edge. For example, images and videos collected from factories, retail stores, or hospitals are used for manufacturing defect analysis, inventory out-of-stock detection, and diagnostics. Applying machine learning models to the data on Data Box Edge provides lower latency and savings on bandwidth costs, while enabling real-time insights and speed to action for critical business decisions.

Azure Machine Learning service is already a generally available, end-to-end, enterprise-grade, and compliant data science platform. It enables data scientists to simplify and accelerate building, training, and deploying machine learning models, all from your favorite Python environment using the latest open-source frameworks, such as PyTorch, TensorFlow, and scikit-learn. These models can run today on CPUs and GPUs, and this preview expands that to field-programmable gate arrays (FPGAs) on Data Box Edge.

What is in this preview?

This preview enhances Azure Machine Learning service by enabling you to train a TensorFlow model for image classification scenarios, containerize the model in a Docker container, and then deploy the container to a Data Box Edge device with Azure IoT Hub. Today we support ResNet 50, ResNet 152, DenseNet-121, and VGG-16. The model is accelerated by the ONNX runtime on an Intel Arria 10 FPGA that is included with every Data Box Edge.
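As an illustrative sketch of the registration step in that workflow, the snippet below registers a trained TensorFlow model with an Azure Machine Learning workspace using the azureml-core Python SDK. It assumes an existing workspace config.json and a saved model on disk; the model name, path, and description are placeholders rather than the preview's exact commands.

from azureml.core import Workspace
from azureml.core.model import Model

# Load workspace details from a local config.json downloaded from the Azure portal.
ws = Workspace.from_config()

# Register the trained TensorFlow model so it can later be containerized and
# pushed to the Data Box Edge device. Path and names below are illustrative.
model = Model.register(
    workspace=ws,
    model_path="./outputs/resnet50",      # directory holding the trained model files
    model_name="resnet50-defects",
    description="ResNet 50 classifier for FPGA-accelerated scoring on Data Box Edge",
)
print(model.name, model.version)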

Why does this matter?

Over the years, AI has been infused into our everyday lives and into industry. Smart home assistants understand what we say, and social media services can tag who is in the pictures we upload. Most, if not all, of this is powered by deep neural networks (DNNs), sophisticated algorithms that process unstructured data such as images, speech, and text. DNNs are also computationally expensive: for example, it takes almost 8 billion calculations to analyze one image using ResNet 50, a popular DNN.

There are many hardware options for running DNNs today, most commonly CPUs and GPUs. Azure Machine Learning service brings customers cutting-edge innovation that originated in Microsoft Research (featured in this recent Fast Company article) for running DNNs on reconfigurable hardware called FPGAs. By integrating this capability and the ONNX runtime into Azure Machine Learning service, we see significant improvements in model latency.
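To make the ONNX runtime's role concrete, here is a generic, CPU-only illustration of scoring an image with onnxruntime in Python. It is not the preview's exact pipeline, and the model file name and input shape are assumptions; on Data Box Edge, the runtime dispatches the accelerated model to the Intel Arria 10 FPGA rather than the default CPU provider used here.

import numpy as np
import onnxruntime as ort

# Load an ONNX model; without extra configuration this uses the CPU execution provider.
sess = ort.InferenceSession("resnet50.onnx")    # file name is a placeholder
input_name = sess.get_inputs()[0].name          # discover the model's input tensor name

# Stand-in for a preprocessed 224x224 RGB image in NCHW layout.
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)

scores = sess.run(None, {input_name: batch})[0]
print("Predicted class:", int(np.argmax(scores)))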

Bringing it together

Azure Machine Learning service now brings the power of accelerated AI models directly to Data Box Edge. Let’s take the example of a manufacturing assembly line scenario, where cameras are photographing products at various stages of development.

The pictures are sent from the manufacturing line to the Data Box Edge inside your factory, where AI models that were trained, containerized, and deployed to the FPGA using Azure Machine Learning service are available. Data Box Edge is registered with Azure IoT Hub, so you can control which models are deployed. Now you have everything you need to process incoming pictures in near real time to detect manufacturing defects. This enables the machines and assembly line managers to make time-sensitive decisions about the products, improving product quality and decreasing downstream production costs.
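As a hedged illustration of that last step, a client on the factory network might send each camera frame to the scoring endpoint exposed by the container running on the device. The host name, port, route, and response shape below are assumptions made for the sake of the example, not a documented interface.

import json
import requests

# Hypothetical scoring endpoint exposed by the deployed container on the Data Box Edge device.
EDGE_SCORING_URL = "http://databoxedge.local:5001/score"

with open("frame_0001.jpg", "rb") as f:
    image_bytes = f.read()

# Keep the timeout tight: the point of scoring at the edge is near-real-time feedback.
resp = requests.post(
    EDGE_SCORING_URL,
    data=image_bytes,
    headers={"Content-Type": "application/octet-stream"},
    timeout=2,
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))   # e.g. predicted class and confidence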

Join the preview

Azure Machine Learning service is generally available today. To join the preview for containerization of hardware accelerated AI models, fill out the request form and get support on our forum.

