
Data integration with ADLS Gen2 and Azure Data Explorer using Data Factory


Microsoft announced the general availability of Azure Data Lake Storage (ADLS) Gen2 and Azure Data Explorer in early February, giving Azure unmatched price performance and security as one of the best clouds for analytics. Azure Data Factory (ADF) is a fully managed data integration service that empowers you to copy data from more than 80 data sources with a simple drag-and-drop experience, and to operationalize and manage ETL/ELT flows with flexible control flow, rich monitoring, and continuous integration and continuous delivery (CI/CD) capabilities. In this blog post, we’re excited to update you on the latest integration in Azure Data Factory with ADLS Gen2 and Azure Data Explorer. You can now meet the advanced needs of your analytics workloads by leveraging these services.

Ingest and transform data with ADLS Gen2

Azure Data Lake Storage is a no-compromises data lake platform that combines the rich feature set of advanced data lake solutions with the economics, global scale, and enterprise grade security of Azure Blob Storage. Our recent post provides you with a comprehensive insider view on this powerful service.

Azure Data Factory has supported ADLS Gen2 as a preview connector since the ADLS Gen2 limited public preview. The connector has now reached general availability along with ADLS Gen2. Moreover, with ADF, you can now:

  • Ingest data from over 80 data sources, located on-premises and in the cloud, into ADLS Gen2 with great performance (for a quick one-off ingestion outside ADF, see the sketch after this list).
  • Orchestrate data transformation using Databricks Notebook, Apache Spark in Python, and Spark JAR against data stored in ADLS Gen2.
  • Orchestrate data transformation using HDInsight with ADLS Gen2 as the primary store and script store, on either a bring-your-own or an on-demand cluster.
  • Egress data from ADLS Gen2 to a data warehouse for reporting.
  • Leverage Azure’s Role Based Access Control (RBAC) and Portable Operating System Interface (POSIX) compliant access control lists (ACLs) that restrict access to only authorized accounts.
  • Invoke control flow operations like Lookup and GetMetadata against ADLS Gen2.
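
Outside of ADF, a quick way to sanity-check an ADLS Gen2 ingestion path is AzCopy v10, which supports the Gen2 (dfs) endpoint. A minimal sketch, assuming a hypothetical storage account "mydatalake" with a file system "raw" and an Azure AD identity that has data access:

# sign in with your Azure AD identity (one-time)
azcopy login

# recursively upload a local folder into the "raw" file system of the ADLS Gen2 account
azcopy copy "/data/sales" "https://mydatalake.dfs.core.windows.net/raw/sales" --recursive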

Get started today

Populate Azure Data Explorer for real-time analysis

Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. It helps you handle the many data streams emitted by modern software and is designed for analyzing large volumes of diverse data.

Bringing data into Azure Data Explorer is the first challenge customers often face when adopting the service. Complementary to Azure Data Explorer’s native support for continuous data ingestion from event streams, Azure Data Factory enables you to ingest batch data from a broad set of data stores in a codeless manner. With simple drag-and-drop features in ADF, you can now:

  • Ingest data from over 80 data sources - on-premises and cloud-based, structured, semi-structured, and unstructured - into Azure Data Explorer for real-time analysis.
  • Egress data from Azure Data Explorer based on a Kusto Query Language (KQL) query.
  • Invoke Lookup activities against Azure Data Explorer for control flow operations.

Get started

 

We will keep adding new features in ADF to tighten the integration with ADLS Gen2 and Azure Data Explorer. Stay tuned and let us know your feedback!

Expanded Jobs functionality in Azure IoT Central


Since announcing the Jobs feature at the Azure IoT Central general availability launch, we have been working to improve your device management workflow with additional Jobs functionality, and we are excited to share the results. Today, you can copy an existing job you’ve created, save a job to continue working on later, stop or resume a running job, and download a job details report once your job has completed running. These additions make managing your devices at scale much easier.

In order to copy a job you’ve created, simply select a job from your main jobs list and select “Copy”. This will open a copy of the job where you can optionally update any part of the job configuration. If any changes have been made to your device set since its creation, your copied job will reflect those changes for you to edit.

Copy an existing job created in Azure IoT Central

While you are editing your job, you now have the option to save the job to continue working on later by selecting “Save”. This saved job will appear on your main jobs list with a status of “Saved” and you can open it again at any time to continue editing.

Save a job to continue working on later in Azure IoT Central

Once you have chosen to run your job, you can select the “Stop” button to stop the job from executing any further. You can open a stopped job from your list and select “Run” again at any time you’d like.

Whether your job has been stopped or has completed running, you can select “Download Device Report” near your device list to download a .csv file that lists the device ID, the time the job was completed or stopped, the status of the device, and the error message (if applicable). This can be used to troubleshoot devices or as a sorting tool.

Download a job details report once your job has completed running

We are continually working on improving your device management experience to make managing devices at scale easier than ever. If you have any suggestions for the device management or Jobs functionalities you would find useful in your workflow, please leave us feedback.

Learn more about how to run a job in Azure IoT Central.

AI, Machine Learning and Data Science Roundup: March 2019


A monthly roundup of news about Artificial Intelligence, Machine Learning and Data Science. This is an eclectic collection of interesting blog posts, software announcements and data applications from Microsoft and elsewhere that I've noted over the past month or so.

Open Source AI, ML & Data Science News

TensorFlow Privacy: a Python library for training machine learning models with differential privacy, for use with sensitive data to generate models that don't learn details about specific people.

TensorFlow Federated, an open-source library for Federated Learning, enabling many participating clients to train shared ML models while keeping their data local.

R 3.5.3 has been released.

NNI, an open source AutoML toolkit for neural architecture search and hyper-parameter tuning, from Microsoft Research.

Industry News

OpenAI has published a paper describing GPT-2, an unsupervised language model that can generate paragraphs of coherent text that could be mistaken for human writing. Only a scaled-down version has been released, for fear of abuse.

Mozilla releases Common Voice, a public-domain dataset of 1,400 hours of recorded voice data in 18 languages.

GCP's Deep Learning Virtual Machine now incorporates RAPIDS, the open-source GPU-acceleration library for Python machine learning.

Google introduces GPipe, an open-source library for training large-scale deep neural networks on devices with otherwise insufficient resources.

Determined AI, platform-agnostic software for automated machine learning and infrastructure management, has been announced.

Microsoft News

Azure Machine Learning Service now incorporates RAPIDS, the open source NVIDIA library that provides accelerated GPU support for Pandas dataframes and scikit-learn machine learning algorithms.

ONNX Runtime adds the NVIDIA TensorRT execution provider, for improved inferencing support on NVIDIA GPU devices.

The Intel Optimized Data Science Virtual Machine, providing up to a 10x performance increase for CPU-based deep learning workloads, is now available on Azure Marketplace.

New Python features in VS Code include validation of breakpoint targets, a new test explorer, and the ability to run selected code without the need to define code cells.

New enhancements for Cognitive Services Computer Vision: bounding boxes for detected objects, and more types of objects detected (including thousands of brand logos).

Azure Cognitive Services containers for Text Analytics and Language Understanding are now supported on edge devices with Azure IoT Edge.

MMLSpark 0.16 is released, with improvements to machine learning methods for Apache Spark including Azure Search integration and the Smart Adaptive Recommender.

Microsoft Research releases MT-DNN as a PyTorch package, an architecture for natural language understanding that combines elements of MTL and BERT.

Learning resources

Microsoft launches AI Business School, a collection of learning modules aimed at business leaders on AI strategy, technology and responsible impact.

A high-level article on how and why transfer learning works, and the connection with embeddings.

The Quartz AI Studio, a task-oriented resource for data journalists, provides useful guides for text and image analysis for general practitioners as well.

Causal Inference: The Mixtape, a freely-licensed book by Scott Cunningham that comes with its own playlist.

An architecture for using R at scale on GCP, by Mark Edmondson.

The Python Data Science Handbook (by Jake VanderPlas) in Jupyter Notebook form. Also available in Azure Notebooks and Google Colab.

Datasets for Machine Learning: A list of the biggest datasets from across the web.

Applications

SPADE, a method for synthesizing photo-realistic images from a simple sketch.

Seeing AI now includes a feature to help vision-impaired users explore photographs by touch, and is now supported on iPad devices.

Google deploys a compact RNN transducer to mobile phones that can transcribe speech on-device and streams output letter-by-letter, and a quasi-recurrent neural network for handwriting transcription.

NVIDIA introduces the Kaldi ASR Framework for high-speed speech transcription.

Find previous editions of the monthly AI roundup here.

Easily Choose Your Route with Bing Maps Traffic Coloring


There is an old saying that you don’t know where you are going until you get there. With Bing Maps new route coloring feature, you will know right away where the delays will be along your selected route so you can change your route, your plans or your destination based on the route ahead!

For example, if you are leaving Redmond Town Center for Westlake Center in Seattle, you can see the delays on WA-520 W and I-90 W before you decide which route to take. Also, with our new route labels showing the travel mode, distance and time of each route, you can easily compare and toggle between the different routes quickly on the map.

Bing Maps Traffic Coloring

While blue means no traffic delays, the orange and red colors highlight moderate to heavy traffic delays on the route. These are calculated based on a combination of current traffic updates and predictions from historic data depending on the length of the route.

Traffic coloring not only helps you select the best route for your trip, but can also be very useful when there are major traffic delays due to inclement weather, big events, accidents, or road construction nearby. For example, if there is an MLB or NFL game in town, you can avoid the most impacted roads near the event and choose an option that offers the least delays.

In addition, if you need to take a ferry as part of your route, Bing Maps visualizes the ferry segments using dashes to differentiate that part of the trip. The image below illustrates the route between Redmond and Discovery Bay in Washington State. Bing Maps highlights the ferry segment between Edmonds and Kingston with a dashed line.

Bing Maps Traffic Coloring Ferry Route

The Bing Maps Routing and Traffic Team is constantly working to make navigation and route planning easier for our users. To try out the traffic coloring feature, go to https://www.bing.com/maps.

 - Bing Maps Team

Getting Started with .NET Core and Docker and the Microsoft Container Registry


It's super easy to get started with .NET Core and/or ASP.NET Core with Docker. If you have Docker installed you don't need to install anything to try out .NET Core, of course.

To run a little .NET Core console app:

docker run --rm mcr.microsoft.com/dotnet/core/samples:dotnetapp

And the result:

latest: Pulling from dotnet/core/samples

Hello from .NET Core!
...SNIP...

**Environment**
Platform: .NET Core
OS: Linux 4.9.125-linuxkit #1 SMP Fri Sep 7 08:20:28 UTC 2018

To run a quick little ASP.NET Core website just:

docker run -it --rm -p 8000:80 --name aspnetcore_sample mcr.microsoft.com/dotnet/core/samples:aspnetapp

And here it is running on localhost:8000

Simple ASP.NET Core app under Docker

You can also host ASP.NET Core images with Docker over HTTPS with this image, or run ASP.NET Core apps in Windows Containers.
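
A rough sketch of the HTTPS variant, following the dotnet-docker HTTPS sample (the certificate password and paths below are placeholders, not fixed values):

# create and trust a local dev certificate
dotnet dev-certs https -ep ${HOME}/.aspnet/https/aspnetapp.pfx -p mypassword
dotnet dev-certs https --trust

# run the sample with the certificate mounted into the container
docker run --rm -it -p 8000:80 -p 8001:443 -e ASPNETCORE_URLS="https://+;http://+" -e ASPNETCORE_HTTPS_PORT=8001 -e ASPNETCORE_Kestrel__Certificates__Default__Password="mypassword" -e ASPNETCORE_Kestrel__Certificates__Default__Path=/https/aspnetapp.pfx -v ${HOME}/.aspnet/https:/https/ mcr.microsoft.com/dotnet/core/samples:aspnetapp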

Note that Microsoft teams are now publishing container images to the MCR (Microsoft Container Registry) so they can use the Azure CDN and pull faster when they are closer to you globally. The images start at MCR and then can be syndicated to other container registries.

The new repos follow. When you "docker pull", you can use tag strings for .NET Core, and they work across any supported .NET Core version:

  • SDK: docker pull mcr.microsoft.com/dotnet/core/sdk:2.1
  • ASP.NET Core Runtime: docker pull mcr.microsoft.com/dotnet/core/aspnet:2.1
  • .NET Core Runtime: docker pull mcr.microsoft.com/dotnet/core/runtime:2.1
  • .NET Core Runtime Dependencies: docker pull mcr.microsoft.com/dotnet/core/runtime-deps:2.1

For example, I can run the .NET Core 3.0 SDK and mess around with it like this:

docker run -it mcr.microsoft.com/dotnet/core/sdk:3.0 
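
For instance, once inside the SDK container you can scaffold and run a throwaway app (a hypothetical session; the path is arbitrary):

# inside the container
dotnet new console -o /tmp/hello
dotnet run --project /tmp/hello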

I've been using Docker to run my unit tests on my podcast site within a container locally. Then I volume mount and dump the test results out in a local folder and inspect them with Visual Studio.

docker build --pull --target testrunner -t podcast:test .

docker run --rm -v c:\github\hanselminutes-core\TestResults:/app/hanselminutes.core.tests/TestResults podcast:test

I can then either host the Docker container in Azure App Service for Containers, or run it as a little one-off, per-second-billed instance with Azure Container Instances (ACI).
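
A minimal ACI sketch, assuming the test image was pushed to a hypothetical Azure Container Registry "myregistry" (resource group, names, and credentials are placeholders):

az container create --resource-group myRG --name podcast --image myregistry.azurecr.io/podcast:latest --registry-login-server myregistry.azurecr.io --registry-username <user> --registry-password <password> --ports 80 --cpu 1 --memory 1.5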

Have you been using .NET Core in Docker? How has it been going for you?


Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.



© 2018 Scott Hanselman. All rights reserved.
     

Building serverless microservices in Azure – sample architecture


Distributed applications take full advantage of living in the cloud to run globally, avoid bottlenecks, and always be available for users worldwide. This requires not only the right infrastructure to deploy into, but also support for the decoupled architecture such an application needs, versus the traditional monolithic approach. This is why most cloud-native applications use a microservices architecture that helps achieve this at global scale.

Microservices benefits slide including independent modules, isolated points of failure, autonomous scalability, tech flexibility, and faster value delivery.

The benefits of using a microservices architecture are maximized when those applications are built in the cloud, with a wide range of managed services that make it easier to realize the microservices promise. With those services managing infrastructure and scaling for you, and improving critical processes like deployment and monitoring, you can maximize the amount of value delivered per cycle.

Services to build microservices in Azure like Service Fabric, Azure Kubernetes Service, Azure Functions, and API Management.

There are different patterns you might want to explore and each of them fits a specific scenario. Today we’re focusing on how building serverless microservices is a great fit for event-driven scenarios, and how you can use the Azure Serverless platform.

Building serverless, event-driven microservices

Taking an event-driven approach to building microservices-based applications, when it fits the scenario and the problem to solve, can help mitigate some problems of a more traditional approach:

  • Scaling compute resources: With the automated and flexible scaling based on actual demand that’s provided by a serverless platform, you don’t need to worry about how the scaling happens or how to handle it on the code of your application.
  • Operations dependency: When deploying a microservices-based solution, there is usually a strong dependency on the operations teams for allocating infrastructure resources for deployment and execution, both initially and with each incremental change. Taking a serverless approach by using fully managed services removes that necessity, since all the underlying infrastructure is managed for you by the platform.
  • Costs for hosting: With a traditional deployment, the cost is determined by how much you have to pay for each hosting node, and usually implies an over allocation of resources, resulting in increased hosting expenditure. With an event-driven approach, using services with consumption-based pricing models means the price is determined by the number of requests or operations, and the costs for hosting are better adjusted to the real usage of the solution (and are usually lower).
  • Services discovery: Managing service integration, communication, and interactions is a common problem in distributed applications. Since each service performs a very specific action according to the single responsibility principle, more often than not a service will need to communicate with others to achieve its goal. The real challenge is keeping these connections as simple as possible while keeping services totally decoupled. With an event-driven approach, you can take advantage of both of the following:
    • A centralized, unified way for services to communicate via events using a pub-sub model, fully managed with Azure Event Grid (see the CLI sketch after this list).
    • An integrated programming model based on triggers to automatically respond to those events and bindings to connect and integrate different services seamlessly, such as the experience offered by Azure Functions and Logic Apps for event-driven compute.
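
As an illustrative sketch only (resource names and the webhook URL are hypothetical), wiring up a custom Event Grid topic and subscribing a handler endpoint with the Azure CLI looks roughly like this:

# create a custom topic that services publish events to
az eventgrid topic create --resource-group myRG --name rides-topic --location westus2

# subscribe a handler (for example, an HTTP-triggered Azure Function) to the topic
topicid=$(az eventgrid topic show --resource-group myRG --name rides-topic --query id --output tsv)
az eventgrid event-subscription create --name trip-created-handler --source-resource-id $topicid --endpoint https://myfuncapp.azurewebsites.net/api/TripCreatedHandler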

Sample architecture for serverless microservices

In the sample architecture for a rideshare application for a fictitious company named Relecloud, you can learn more about the architectural design of a microservices-based application. The sample uses fully managed services from the Azure Serverless platform to build the main building blocks of microservices solutions such as:

  1. API Gateway: Using API Management to expose the endpoints of the backend services, so the client application can consume them securely. This also helps with decoupling the client side from the backend, since you can easily manage changes on where the services are actually hosted from the gateway without affecting the client application.
  2. Entry points: The public facing APIs that the client application will be using, powered by Azure Functions responding to HTTP requests.
  3. Workflow orchestrator: Middle-tier service to interconnect the public facing APIs with the actual backend services that are tied to the data stores and other critical components, orchestrating the work of these services based on actions on the client side.
  4. Async queue: Messaging service to handle services intercommunication and pass along information and data between the different services, represented by Azure Event Grid. By using an event-driven approach, we’re also favoring service decoupling, since the information exchange has a fire-and-forget approach, with services pushing events and handlers subscribing to those events for processing.
  5. Backend services: The services that are directly operating with the data layer and other components of the solution, isolated from the rest and easily replaceable if needed (e.g. changing the type of database used to store data) without affecting the rest of the application and interactions.

Sample architecture overview.

Next steps

- Register for this webinar to learn how to develop microservices-based applications with a serverless architecture using fully managed Azure services.

- Browse the Relecloud Rideshare sample architecture to get step-by-step guidance on how to build the sample application and detailed information of the solution design.

- Sign up for an Azure free account if you don’t have one already, and start building serverless applications today.

Larger, more powerful Managed Disks for Azure Virtual Machines


Today, we are excited to announce the general availability of larger and more powerful Azure Managed Disk sizes of up to 32 TiB on Premium SSD, Standard SSD, and Standard HDD disk offerings. In addition, we support disk sizes up to 64 TiB on Ultra Disks in preview.

We are also increasing the performance scale targets for Premium SSD to 20,000 IOPS and 900 MB/sec. With the general availability (GA) of larger disk sizes, Azure now offers a broad range of disk sizes for your production workload needs, with unmatched scale and performance.

Some of the benefits of using larger disk sizes are:

  • Grow your virtual machine (VM) disks’ capacity independently of the VM size you are using. You can attach more disk capacity per VM without having to upgrade to larger VM sizes or leverage multiple VMs. For instance, you can now attach two 32 TiB data disks on the smallest B1 Azure VM to achieve a total disk capacity of 64 TiB. Our largest VMs can now support up to 2 PiB of disk storage on a single VM. This provides a cost-effective solution for migrating data hosted on on-premises disks or NAS, as well as backup data, to Azure.
  • Lift and shift a new range of workloads to Azure with the higher performance scale limits. With Premium SSDs offering up to 20,000 IOPS and 900 MB/sec, you can accelerate a wide variety of transactional workloads and exceed the on-premises runtime performance of data analytics applications. Moreover, the higher scalability limits on Standard SSDs make them a perfect fit for hosting big data workloads and scale-out file servers that are extremely throughput intensive.
  • Simplify your deployments and service management, and reduce VM maintenance costs, by avoiding the need to stripe multiple disks to achieve larger capacity or higher performance. You can still stripe multiple large disks to achieve even higher capacity and performance.

For more information on the new disk SKUs and the scalability targets, please see the section below. To achieve the expected disk IOPS and bandwidth, we recommend that you review our guidance on how to optimize your disk performance.

Premium SSD disks

Premium SSDs are ideal for enterprise applications like Dynamics AX, Dynamics CRM, and database workloads like SQL Server, Cassandra, and MongoDB that require consistent high performance and low latency.

Table displaying Premium SSD disks type, size, IOPS per disk, and throughput per disk.

Standard SSD disks

Standard SSDs are suitable for web servers, low IOPS application servers, big data, and enterprise applications that need consistent performance at lower IOPS levels. Table displaying Standard SSD disks type, size, IOPS per disk, and throughput per disk.

Standard HDD disks

Standard HDDs based on magnetic drives offer the most cost-effective solution for Dev/Test and backup scenarios. Table displaying Standard HDD disks type, size, IOPS per disk, and throughput per disk.

Note:

  1. To achieve the target performance of Standard SSD and Standard HDD new disk SKUs, you should use these recommended VM series.
  2. P60/70/80 and E60/70/80 disks now have higher performance targets than scale limits offered in preview. If you deployed these disk SKUs in preview, you can follow the guidelines to make sure your existing disks are updated for the higher GA performance.

Getting started

You can create new Managed Disks or extend your existing disks to larger sizes using the Azure portal, PowerShell, or CLI today! The newly introduced sizes are generally available in all regions in Azure Public Cloud, and support for national clouds including Azure US Government and China 21Vianet will be available in the coming weeks. Larger disk sizes are only supported on Managed Disks. If you are not using Managed Disks yet, convert from Unmanaged to Managed Disks now and take advantage of the unique capabilities.

We have also added support for the following scenarios:

  • Azure portal experience for new disk sizes up to 32 TiB
  • Resize your existing Managed Disks up to 32 TiB using PowerShell and CLI (a CLI sketch follows this list)
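
As a sketch with the Azure CLI (resource names are placeholders; resizing a disk that is attached to a running VM typically requires deallocating the VM first):

# create a new 32 TiB (32,767 GiB) Premium SSD data disk
az disk create --resource-group myRG --name myDataDisk --size-gb 32767 --sku Premium_LRS

# grow an existing managed disk to 32 TiB
az disk update --resource-group myRG --name myDataDisk --size-gb 32767

# attach the data disk to a VM
az vm disk attach --resource-group myRG --vm-name myVM --name myDataDisk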

Our next step is to enable the preview of Azure Backup for larger disk sizes, providing full coverage for enterprise backup scenarios by the end of May 2019. Similarly, Azure Site Recovery support for on-premises-to-Azure and Azure-to-Azure disaster recovery will be extended to all disk sizes soon.

Visit our service website to explore the Azure Disks portfolio. To learn about pricing, you can visit the Managed Disks pricing page.

General feedback

We look forward to hearing your feedback on the new disk sizes. Please email us at AzureDisks@microsoft.com.


Azure.Source – Volume 75


Preview | Generally available | News & updates | GTC 2019 | Technical content | Azure shows | Events | Customers, partners, and industries

Now in preview

Windows Virtual Desktop now in public preview on Azure

The public preview of the Windows Virtual Desktop service is now available on Azure. Customers can now access the only service that delivers simplified management, multi-session Windows 10, optimizations for Office 365 ProPlus, and support for Windows Server Remote Desktop Services (RDS) desktops and apps. With Windows Virtual Desktop, you can deploy and scale your Windows desktops and apps on Azure in minutes, while enjoying built-in security and compliance. Access to Windows Virtual Desktop is available through applicable RDS and Windows Enterprise licenses.

Thumbnail from What is Windows Virtual Desktop? by Microsoft Mechanics

Azure Data Studio: An Open Source GUI Editor for Postgres

Support for PostgreSQL in Azure Data Studio is now available in preview. Azure Data Studio is a cross-platform modern editor focused on data development. It's available for Linux, macOS, and Windows. We're also introducing a corresponding preview PostgreSQL extension in Visual Studio Code (VS Code). Both Azure Data Studio and Visual Studio Code are open source and extensible - two things that PostgreSQL itself is based on. If your primary use case is data, choose Azure Data Studio to manage multiple database connections, explore database object hierarchy, set up dashboards, and more.

Azure Container Registry virtual network and Firewall rules preview support

Azure Container Registry (ACR) now supports limiting public endpoint access. Customers can now limit registry access within an Azure Virtual Network (VNet), as well as whitelist IP addresses and ranges for on-premises services. VNet and Firewall rules are supported with virtual machines (VMs) and Azure Kubernetes Service (AKS), and are available for public preview in all 25 public cloud regions. General availability (GA) will follow based on usage and feedback.

Also available in preview

Now generally available

Azure Backup for SQL Server in Azure Virtual Machines now generally available

Azure Backup for SQL Server Virtual Machines (VMs) is now generally available. This enterprise-scale, zero-infrastructure solution eliminates the need to deploy and manage backup infrastructure, while providing a simple and consistent experience to centrally manage and monitor backups on standalone SQL instances and Always On Availability Groups. Built into Azure, the solution combines the core cloud promises of simplicity, scalability, security, and cost effectiveness with inherent SQL backup capabilities that are leveraged by using native APIs to yield high-fidelity backups and restores.

Thumbnail from How to back up SQL Server running in Azure VMs with Azure Backup

Also generally available

News and updates

Microsoft’s Azure Cosmos DB is named a leader in the Forrester Wave: Big Data NoSQL

Forrester has named Microsoft a Leader in The Forrester Wave™: Big Data NoSQL for the first quarter of 2019, based on its evaluation of Azure Cosmos DB, validating the service's exceptional market momentum and customer satisfaction. According to Forrester, “half of global data and analytics technology decision makers have either implemented or are implementing NoSQL platforms, taking advantage of the benefits of a flexible database that serves a broad range of use cases.” We are committed to making Azure Cosmos DB the best globally distributed database for all businesses and modern applications. With Azure Cosmos DB, we believe that you will be able to write amazingly powerful, intelligent, modern apps and transform the world.

Marketing graphic for Azure Cosmos DB

March 2019 changes to Azure Monitor Availability Testing

Azure Monitor Availability Testing allows you to monitor the availability and responsiveness of any HTTP or HTTPS endpoint that is accessible from the public internet. At the end of this month we are deploying some major changes to improve performance and reliability, as well as to allow us to make more improvements for the future. This post highlights and describes some of the changes needed to ensure that your tests continue running without any interruption.

Data integration with ADLS Gen2 and Azure Data Explorer using Data Factory

Introducing the latest integration in Azure Data Factory. Azure Data Lake Storage Gen2 is a data lake platform that combines advanced data lake solutions with the economics, global scale, and enterprise-grade security of Azure Blob Storage. Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data, and Azure Data Factory is a fully managed data integration service that operationalizes and manages ETL/ELT flows with flexible control flow, rich monitoring, and continuous integration and continuous delivery (CI/CD) capabilities. You can now meet the advanced needs of your analytics workloads with unmatched price performance and the security of one of the best clouds for analytics.

Additional news and updates

News from NVIDIA GPU Technology Conference

Over the years, Microsoft and NVIDIA have helped customers run demanding applications on GPUs in the cloud. Last week at NVIDIA GPU Technology Conference 2019 (GTC 2019) in San Jose, Microsoft made several announcements on our collaboration with NVIDIA to help developers and data scientists deliver innovation faster.

Microsoft and NVIDIA extend video analytics to the intelligent edge

Microsoft and NVIDIA are partnering on a new approach for intelligent video analytics at the edge to transform raw, high-bandwidth videos into lightweight telemetry. This delivers real-time performance and reduces compute costs for users. With this latest collaboration, NVIDIA DeepStream and Azure IoT Edge extend the AI-enhanced video analytics pipeline to where footage is captured, securely and at scale. Now, our customers can get the best of both worlds—accelerated video analytics at the edge with NVIDIA GPUs and secure connectivity and powerful device management with Azure IoT Edge and Azure IoT Hub.

Edge appliance and Azure cloud diagram

Microsoft and NVIDIA bring GPU-accelerated machine learning to more developers

GPUs have become an indispensable tool for doing machine learning (ML) at scale. Our collaboration with NVIDIA marks another milestone in our venture to help developers and data scientists deliver innovation faster. We are committed to accelerating the productivity of all machine learning practitioners regardless of their choice of framework, tool, and application. Two of the integrations that Microsoft and NVIDIA have built together to unlock industry-leading GPU acceleration for more developers and data scientists are covered in the next two posts.

Azure Machine Learning service now supports NVIDIA’s RAPIDS

Azure Machine Learning service is the first major cloud ML service to support NVIDIA’s RAPIDS, a suite of software libraries for accelerating traditional machine learning pipelines with NVIDIA GPUs. With RAPIDS on Azure Machine Learning service, users can accelerate the entire machine learning pipeline, including data processing, training and inferencing, with GPUs from the NC_v3, NC_v2, ND or ND_v2 families. Azure Machine Learning service users are able to use RAPIDS in the same way they currently use other machine learning frameworks, and can use RAPIDS in conjunction with Pandas, Scikit-learn, PyTorch, TensorFlow, etc.

ONNX Runtime integration with NVIDIA TensorRT in preview

Announcing the open-source preview of the NVIDIA TensorRT execution provider in ONNX Runtime, another step towards open and interoperable AI that enables developers to easily leverage industry-leading GPU acceleration regardless of their choice of framework. Developers can now tap into the power of TensorRT through ONNX Runtime to accelerate inferencing of ONNX models, which can be exported or converted from PyTorch, TensorFlow, and many other popular frameworks.

Technical content

Reducing security alert fatigue using machine learning in Azure Sentinel

Alert fatigue is real. Security analysts face a huge burden of triage as they not only have to sift through a sea of alerts, but to also correlate alerts from different products manually. Machine learning (ML) in Azure Sentinel is built-in right from the beginning and focuses on reducing alert fatigue while offering ML toolkits tailored to the security community; including ML innovations aimed at making security analysts, security data scientists, and engineers productive.

Screenshot of Fusion and two composite alerts

Breaking the wall between data scientists and app developers with Azure DevOps

Data scientists are used to developing and training machine learning models for their favorite Python notebook or an integrated development environment (IDE). The app developer is focused on the application lifecycle – building, maintaining, and continuously updating the larger business application. As AI is infused into more business-critical applications, it is increasingly clear that we need to collaborate closely to build and deploy AI-powered applications more efficiently. Together, Azure Machine Learning and Azure DevOps enable data scientists and app developers to collaborate more efficiently while continuing to use the tools and languages that are already familiar and comfortable.

Step-By-Step: Getting Started with Azure Machine Learning

In this comprehensive guide, Anthony Bartolo explains how to set up a prediction model using Azure Machine Learning Studio. The example comes from a real-life hackfest with Toyota Canada and predicts the pricing of vehicles.

Intro to Azure Container Instances

Aaron Powell covers using Azure Container Instances to run containers in a really simple approach in this handy introductory guide. It walks through a hello world demo and then some advanced scenarios on using ACR and connecting to Azure resources (such as a SQL server).

Getting started with Azure Monitor Dynamic Thresholds

This overview from Sonia Cuff discusses the new Dynamic Thresholds capability in Azure Monitor, where machine learning sets the alert threshold levels when you are monitoring metrics (e.g., CPU percentage use in a VM or HTTP request time in an application).

How to Deploy a Static Website Into Azure Blob Storage with Azure DevOps Pipeline

In this third video and blog post of Frank Boucher’s CI/CD series, he creates a release pipeline and explains how to deploy an ARM template to create or update Azure resources and deploy a static website into blob storage. The series explains how to build a continuous integration and continuous deployment system using Azure DevOps Pipelines to deploy a static website into Azure Blob Storage.

Thumbnail from How to Deploy a Static Website Into Azure Blob Storage with Azure DevOps Pipeline - part 3

Fixing Azure Functions and Azure Storage Emulator 5.8 issue

If you happen to run into an error after updating Azure Functions to the latest version, Maxime Rouille's hotfix is for you. He not only explains what to do, but why this might happen and what his hot fix does.

Using Key Vault with Your Mobile App the Right Way

You know you need to keep your app’s secrets safe and follow best practices - but how? In this post, Matt Soucoup uses a Xamarin app to walk through how and why to use Azure Key Vault, Active Directory, and Azure Functions to keep application secrets off of your mobile app and in Key Vault - without tons of extra work for you.

How do You Structure Your Code When Moving Your API from Express to Serverless Functions?

There are a lot of articles showing how to use serverless functions for a variety of purposes. A lot of them cover how to get started, and they are very useful. But what do you do when you want to organize them a bit more as you do for your Node.js Express APIs? There's a lot to talk about on this topic, but in this post, John focuses specifically on one way you can organize your code.

Securely monitoring your Azure Database for PostgreSQL Query Store

Long-running queries may interfere with overall database performance and are likely stuck on some background process, which means that from time to time you need to investigate whether any queries are running indefinitely on your databases. See how you can set up alerting on query performance-related metrics using Azure Functions and Azure Key Vault.

Expanded Jobs functionality in Azure IoT Central

We have improved device management workflow through additional jobs functionalities that make managing your devices at scale much easier. In this brief post, learn to copy an existing job, save a job to continue working on later, stop or resume a running job, and download a job details report once your job has completed running.

Screenshot showing an example of download details of Jobs in Azure IoT Central

Azure Stack IaaS – part five

Self-service is core to Infrastructure-as-a-Service (IaaS). Azure's IaaS gives the owner of the subscription everything they need to create virtual machines (VMs) and other resources on their own, without involving an administrator. This post shows a few examples of Azure and Azure Stack self-service management of VMs.

Additional technical content

Azure shows

Episode 271 - Azure Stack - Tales from the field | The Azure Podcast

Azure Stack experts from Microsoft Services, Heyko Oelrichs and Rathish Ravikumar, give us an update on Azure Stack and some valuable tips and tricks based on their real-world experiences deploying it for customers.


Read the transcript

One Dev Question: What new HoloLens and Azure products were released in Barcelona? | One Dev Minute

In this episode of One Dev Question, Alex Kipman discusses Microsoft Mixed Reality, featuring announcements from Mobile World Congress.

Data Driven Styling with Azure Maps | Internet of Things Show

Ricky Brundritt, PM in the Azure Maps team, walks us through data-driven styling with Azure Maps. Data-driven styling allows you to dynamically style layers at render time on the GPU using properties on your data. This provides huge performance benefits and allows large datasets to be rendered on the map. Data-driven style expressions can greatly reduce the amount of code you would otherwise need to write to define this type of business logic using if-statements and map event monitoring.

New Microsoft Azure HPC Goodness | Tuesdays with Corey

Corey Sanders and Evan Burness (Principal Program Manager on the Azure Compute team) sat down to talk about new things in the High Performance Computing space in Azure.

Get ready for Global Azure Bootcamp 2019 | Azure Friday

Global Azure Bootcamp is a free, one-day, local event that takes place globally. It's an annual event run by the Azure community. This year, Global Azure Bootcamp is on Saturday, April 27, 2019. Event locations range from New Zealand to Hawaii and chances are good that you can find a location near you. You may even organize your own local event and receive sponsorship so long as you register by Friday, March 29, 2019. Join in to receive sponsorship, Azure passes, Azure for Students, lunch, and a set of content to use for the day.

How to use Azure Monitor Application Insights to record custom events | Azure Tips and Tricks

Learn how to use Azure Monitor Application Insights to make your application logging smarter with Custom Event Tracking.

Thumbnail from How to use Azure Monitor Application Insights to record custom events by Azure Tips and Tricks

How to create an Azure Kubernetes Service cluster in the Azure Portal | Azure Portal Series

The Azure Portal enables you to get started quickly with Kubernetes and containers. In this video, learn how to easily create an Azure Kubernetes Service cluster.

Thumbnail from How to create an Azure Kubernetes Service cluster in the Azure Portal by Azure Portal Series

Phil Haack on DevOps at GitHub | Azure DevOps Podcast

Jeffrey Palermo and Phil Haack take a deep dive into DevOps at GitHub. They talk about Phil's role as Director of Engineering; how GitHub, as a company, grew while Phil worked there; the inner workings of how the GitHub website ran; and details about how various protocols, continuous integration, automated testing, and deployment worked at GitHub.

Episode 3 - ExpressRoute, Networking & Hybridity, Oh My! | AzureABILITY

AzureABILITY host Louis Berman discusses ExpressRoute, networking and hybridity with Microsoft's Bryan Woodworth, who specializes in networking, connectivity, high availability, and routing for hybrid workloads in Azure.


Read the transcript

Events

Microsoft Azure for the Gaming Industry

Cloud computing is increasingly important for today’s global gaming ecosystem, empowering developers of any size to reach gamers in any part of the world. In this wrap-up post from GDC 2019, learn how Azure and PlayFab are a powerful combination for game developers. Azure brings reliability, global scale, and enterprise-level security, while PlayFab provides Game Stack with managed game services, real-time analytics, and comprehensive LiveOps capabilities.

The Value of IoT-Enabled Intelligent Manufacturing

Learn how you can apply insights from real-world use cases of IoT-enabled intelligent manufacturing when you attend the Manufacturing IoT webinar on March 28th: Go from Reaction to Prediction – IoT in Manufacturing. In addition, you'll learn how you can use IoT solutions to move from a reactive to predictive model. For additional hands-on, actionable insights around intelligent edge and intelligent cloud IoT solutions, join us on April 19th for the Houston Solution Builder Conference.

Customers, partners, and industries

Power IoT and time-series workloads with TimescaleDB for Azure Database for PostgreSQL

Announcing a partnership with Timescale that introduces support for TimescaleDB on Azure Database for PostgreSQL for customers building IoT and time-series workloads. TimescaleDB allows you to scale for fast ingest and complex queries while natively supporting full SQL and has a proven track record of being deployed in production in a variety of industries including oil & gas, financial services, and manufacturing. The partnership reinforces our commitment to supporting the open-source community to provide our users with the most innovative technologies PostgreSQL has to offer.


This Week in Azure - 22 March 2019 | A Cloud Guru - Azure This Week

Lars is away this week and so Alex Mackey covers Azure Portal improvements, the Azure mobile app, Azure SQL Data Warehouse Workload Importance and Microsoft Game Stack.

Thumbnail from This Week in Azure - 22 March 2019 by A Cloud Guru - Azure This Week

Incrementally copy new files by LastModifiedDate with Azure Data Factory


Azure Data Factory (ADF) is the fully managed data integration service for analytics workloads in Azure. Using ADF, you can load the lake from more than 80 data sources on-premises and in the cloud, use a rich set of transform activities to prep, cleanse, and process the data with Azure analytics engines, and land the curated data into a data warehouse to derive innovative analytics and insights.

When you start to build an end-to-end data integration flow, the first challenge is to extract data from different data stores, where incrementally (or delta) loading data after an initial full load is widely used at this stage. Now, ADF provides a new capability to incrementally copy only new or changed files, selected by LastModifiedDate, from a file-based store. By using this new feature, you do not need to partition the data into time-based folder or file names. New or changed files are automatically selected by their LastModifiedDate metadata and copied to the destination store.

The feature is available when loading data from Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Amazon S3, File System, SFTP, and HDFS.

The resources for this feature are as follows:

1. You can visit our tutorial, “Incrementally copy new and changed files based on LastModifiedDate by using the Copy Data tool,” to build your first pipeline that incrementally copies new and changed files, based on their LastModifiedDate, from Azure Blob storage to Azure Blob storage by using the Copy Data tool.

2. You can also leverage our template from the template gallery, “Copy new and changed files by LastModifiedDate with Azure Data Factory,” to reduce your time to solution while retaining the flexibility to build a pipeline that incrementally copies new and changed files based on their LastModifiedDate.

3. You can also build the feature from scratch in the ADF UI.

4. You can write code to use the feature via the ADF SDK.

You are encouraged to give these additions a try and provide us with feedback. We hope you find them helpful in your scenarios. Please post your questions on Azure Data Factory forum or share your thoughts with us on Data Factory feedback site.

Azure Storage support for Azure Active Directory based access control generally available


We are pleased to share the general availability of Azure Active Directory (AD) based access control for Azure Storage Blobs and Queues. Enterprises can now grant specific data access permissions to users and service identities from their Azure AD tenant using Azure’s Role-based access control (RBAC).  Administrators can then track individual user and service access to data using Storage Analytics logs. Storage accounts can be configured to be more secure by removing the need for most users to have access to powerful storage account access keys.

By leveraging Azure AD to authenticate users and services, enterprises gain access to the full array of capabilities that Azure AD provides, including features like two-factor authentication, conditional access, identity protection, and more. Azure AD Privileged Identity Management (PIM) can also be used to assign roles “just-in-time” and reduce the security risk of standing administrative access.

In addition, developers can use Managed identities for Azure resources to deploy secure Azure Storage applications without having to manage application secrets.

When Azure AD authentication is combined with the new Azure Data Lake Storage Gen2 capabilities, users can also take advantage of granular file and folder access control using POSIX-style access permissions and access control lists (ACLs).

RBAC for Azure Resources can be used to grant access to broad sets of resources across a subscription, a resource group, or to individual resources like a storage account and blob container. Role assignments can be made through the Azure portal or through tools like Azure PowerShell, Azure CLI, or Azure Resource Manager templates.

Assign a role to a user

Azure AD authentication is available from the standard Azure Storage tools including the Azure portal, Azure CLI, Azure PowerShell, Azure Storage Explorer, and AzCopy.

Browse Azure Storage blobs using the Azure portal

$ az login
Note, we have launched a browser for you to login. For old experience with device code, use "az login --use-device-code"
You have logged in. Now let us find all the subscriptions to which you have access...
[
  {
    "cloudName": "AzureCloud",
    "id": "XXXXXXXX-YYYY-ZZZZ-AAAA-BBBBBBBBBBBB",
    "isDefault": true,
    "name": "My Subscription",
    "state": "Enabled",
    "tenantId": "00000000-0000-0000-0000-000000000000",
    "user": {
      "name": "cbrooks@microsoft.com",
      "type": "user"
    }
  }
]
$ export AZURE_STORAGE_AUTH_MODE="login"
$ az storage blob list --account-name mysalesdata --container-name mycontainer --query [].name
[
  "salesdata.csv"
]​

We encourage you to use Azure AD to grant users access to data, and to limit user access to the storage account access keys. A typical pattern for this would be to grant users the "Reader" role to make the storage account visible to them in the portal, along with the "Storage Blob Data Reader" role to grant read access to blob data. Users who need to create or modify blobs can be granted the "Storage Blob Data Contributor" role instead.
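
A sketch of that pattern with the Azure CLI (the user, subscription ID, and account names are placeholders):

scope="/subscriptions/<subscription-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/mysalesdata"

# make the storage account visible in the portal
az role assignment create --assignee user@contoso.com --role "Reader" --scope $scope

# grant read access to blob data
az role assignment create --assignee user@contoso.com --role "Storage Blob Data Reader" --scope $scope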

Developers are encouraged to evaluate Managed Identities for Azure resources to authenticate applications in Azure or Azure AD service principals for apps running outside Azure.

Azure AD access control for Azure Storage is available now for production use in all Azure cloud environments.

Azure Premium Block Blob Storage is now generally available


As enterprises accelerate cloud adoption and increasingly deploy performance-sensitive cloud-native applications, we are excited to announce the general availability of Azure Premium Blob Storage. Premium Blob Storage is a new performance tier in Azure Blob Storage for block blobs and append blobs, complementing the existing Hot, Cool, and Archive access tiers. Premium Blob Storage is ideal for workloads that require very fast response times and/or high transaction rates, such as IoT, telemetry, AI, and scenarios with humans in the loop such as interactive video editing, web content, online transactions, and more.

Premium Blob Storage provides lower and more consistent storage latency, with low response times for both read and write operations across a range of object sizes, and is especially good at handling smaller blob sizes. Your application should be deployed to compute instances in the same Azure region as the storage account to realize low latency end-to-end. For more details on performance, see “Premium Block Blob Storage - a new level of performance.”


Figure 1 - Latency comparison of Premium and Standard Blob Storage

Premium Blob Storage is available with Locally-Redundant Storage (LRS) and comes with High-Throughput Block Blobs (HTBB), which provides very high and instantaneous write throughput when ingesting block blobs larger than 256KB.

Pricing and region availability

Premium Blob Storage has a higher data storage cost, but a lower transaction cost, compared to data stored in the regular Hot tier. This makes it cost-effective, and potentially less expensive overall, for workloads with high transaction rates. Check out the pricing page for more details.

Premium Blob Storage is initially available in US East, US East 2, US Central, US West, US West 2, North Europe, West Europe, Japan East, Australia East, Korea Central, and Southeast Asia regions with more regions to come. Stay up to date on region availability through the Azure global infrastructure page.

Platform interoperability

At present, data stored in Premium cannot be tiered to Hot, Cool, or Archive access tiers. We are working on supporting object tiering in the future. To move data, you can synchronously copy blobs using the new PutBlockFromURL API (sample code) or AzCopy v10, which supports this API. PutBlockFromURL synchronously copies data server side, which means all data movement happens inside Azure Storage.
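
For example, a server-side copy from a standard account into a Premium Blob Storage account with AzCopy v10 looks roughly like this (account and container names are placeholders, and the SAS tokens are elided):

azcopy copy "https://mystandardaccount.blob.core.windows.net/mycontainer?<SAS>" "https://mypremiumaccount.blob.core.windows.net/mycontainer?<SAS>" --recursive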

In addition, Storage Analytics Logging, Static website, and Lifecycle Management (preview) are not currently available with Premium Blob Storage.

Next steps

To get started with Premium Blob Storage you provision a Block Blob storage account in your subscription and start creating containers and blobs using the existing Blob Service REST API and/or any existing tools such as AzCopy or Azure Storage Explorer.
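
A minimal sketch with the Azure CLI (the account name, resource group, and region are placeholders); Premium Blob Storage corresponds to a storage account of kind BlockBlobStorage on a premium SKU:

az storage account create --name mypremiumaccount --resource-group myRG --location westus2 --kind BlockBlobStorage --sku Premium_LRS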

We are very excited about being able to deliver Azure Premium Blob Storage with low and consistent latency and look forward to hearing your feedback at premiumblobfeedback@microsoft.com. To learn more about Blob Storage please visit our product page.

Why IoT is not a technology solution—it’s a business play



Enterprise leaders understand how important the Internet of Things (IoT) will be to their companies—in fact, according to a report by McKinsey & Company, 92 percent of them believe IoT will help them innovate products and improve operations by 2020. However, like many business-enabling systems, IoT is not without its growing pains. Even early adopters have concerns about the cost, complexity, and security implications of applying IoT to their businesses.

These growing pains can make it daunting for organizations to pick an entry point for applying IoT to their business.

Many companies start by shifting their thinking. It’s easy to get lost in technology, opting for platforms with the newest bells and whistles, and then leaning on those capabilities to drive projects. But sustainable change doesn’t happen that way—it happens when you consider business needs first, and then look at how the technology can fulfill those needs better than current processes can.

You might find it helpful to start by thinking about how IoT can transform your business. You know connected products will be an important part of your business model, but before you start building them, you need to make sure you understand where the market is headed so you can align innovation with emerging needs. After all, the biggest wins come when you can use emerging technology to create a “platform” undergirding products and services that can be extended into new opportunity areas.

To help you plan your IoT journey, we’re rolling out a four-part blog series. In the upcoming posts, we’ll cover how to create an IoT business case, overcome capability gaps, and simplify execution; all advice to help you maximize your gains with IoT.

Let’s get started by exploring the mindset it takes to build IoT into your business model.

Make sure business sponsors are invested

With any business-enabling system, organizations instinctively engage in focused exploration before they leap in. The business and IT can work together to develop a business case, identifying and prioritizing areas that can be optimized and provide real value. Involving the right business decision-makers will ensure you have the sponsorship, budget, and commitment to apply IoT to new processes and systems, and to make the necessary course corrections as the implementation grows and scales. Put your business leaders at the center of the discussion and keep them there.

Seize early-mover advantage

Organizations that are early in and commit to developing mastery in game-changing technologies may only see incremental gains in the beginning. But their leadership position often becomes propulsive, eventually creating platform business advantages that allow them to outdistance competitors for the long term. Don’t believe it? Just look at the history of business process automation, operational optimization, ecommerce, cloud services, digital business, and other tech-fueled trends to see how this has played out.

Consider manufacturing. The industry was a leader in operational optimization, using Six Sigma and other methodologies to strip cost and waste out of processes. After years of these efforts, process improvements became a game of inches with ever-smaller benefits.

Enter IoT. Companies have used IoT to achieve significant gains with improving production and streamlining operations in both discrete and process manufacturing. IoT can help companies predict changing market demand, aligning output to real needs. In addition, many manufacturing companies use IoT to help drive throughput of costly production equipment. Sensors and advanced analytics predict when equipment needs preventive maintenance, eliminating costly downtime.

How companies are using IoT today

Johnson Controls makes building-automation solutions, enabling customers to fine-tune energy use, lowering costs and achieving sustainability goals. The company built out its connected platform in the cloud with the Microsoft Azure IoT Suite to be able to aggregate, manage, and make sense of the torrent of facility data it receives. Through its Smart Connected Chillers initiative, Johnson Controls was able to identify a problem with a customer’s chiller plant, take corrective action, and prevent unplanned downtime that would have cost the customer $300,000 an hour.

IoT can also enable new business models. Adobe used to run its business on one-time software sales, but the company pivoted to online subscription sales as part of its drive to create a digital business, according to Forbes. While the move seemed risky at the time (and initially hurt revenue), Adobe’s prescient move has enabled it to dominate digital marketing, creative services, and document management. Now, of course, the software industry is predominantly Software as a Service (SaaS). Adobe is building on its success by pushing even deeper into analytics. The Adobe Experience Cloud uses Microsoft Azure and Microsoft Dynamics 365 to provide businesses with a 360-degree customer view and the tools to create, deliver, and manage digital experiences on a globally scalable cloud platform.

When connections with customers are constant and always adding value—everyone wins.

Think about IoT as a business enabler

It’s helpful to constantly stress to your team that IoT is a new way to enable a business strategically. IoT isn’t a bolt-on series of technology applications to incrementally optimize what you have. That may seem at odds with the prevailing wisdom to optimize current services. Yes, organizations should seek efficiencies, but only after they have considered where the business is headed, how things need to change to support targeted growth, and whether current processes can be improved or need to be totally transformed. In the case of Johnson Controls, service optimization is a core part of the company’s value proposition to its customers.

Yes, IoT can reinvent your business. And yes, constant, restless innovation can be expected until IoT is fully embedded in your business in a way that’s organic and self-sustaining. Adobe has used its digital platform to extend further and further into its customers’ businesses, providing value they can measure.

If you haven’t started with IoT, now is the right time, because your competitors are grappling with these very issues and creating smart strategies to play the IoT long game. Disruption is here and will only get more pronounced as platform leaders rocket ahead.

As you plan your journey with IoT, there’s help. In forthcoming blogs, we’ll be looking at:

  • Building a business case for IoT—Why this is the moment to be thinking about IoT and developing a solid business case to capture market opportunity and future-proof your organization.
  • Paving the way for IoT—Understanding and addressing what gaps need to be overcome to achieve IoT’s promise.
  • Becoming an IoT leader now—It’s simpler than you think to start with IoT. You can use what you have and SaaS-ify your approach with technology that makes it easy to connect, monitor, and manage your IoT assets at scale.

Need inspiration? Watch this short video to hear insights from IoT leaders Henrik Fløe of Grundfos, Doug Weber from Rockwell Automation, Michael MacKenzie from Schneider Electric, and Alasdair Monk of The Weir Group. 

Clean up files by built-in delete activity in Azure Data Factory

Azure Data Factory (ADF) is a fully-managed data integration service in Azure that allows you to iteratively build, orchestrate, and monitor your Extract Transform Load (ETL) workflows. During the data integration process, you will need to periodically clean up files from your on-premises or cloud storage servers when they become out of date. For example, you may have a staging area or landing zone, an intermediate storage area used for data processing during your ETL process. The data staging area sits between the data source stores and the data destination store. Given that data in staging areas is transient by nature, you need to periodically clean it up after the ETL process has completed.

We are excited to share the ADF built-in delete activity, which can be part of your ETL workflow to delete undesired files without writing code. You can use ADF to delete folders or files from Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, File System, FTP Server, SFTP Server, and Amazon S3.

You can find the delete activity under the “Move & Transform” section in the ADF UI to get started.

1. You can choose to delete files or delete an entire folder. The deleted file and folder names can be logged in a CSV file.

2. The file or folder name to be deleted can be parameterized, giving you the flexibility to control the behavior of the delete activity in your data integration flow.

3. You can delete only expired files rather than all the files in a folder. For example, you may want to delete only the files that were last modified more than 30 days ago (see the sketch after this list).

4. You can start from the ADF template gallery to quickly deploy common use cases involving the delete activity.
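
To make the retention scenario concrete, here is a minimal Python sketch of the same clean-up logic, implemented directly against Azure Blob Storage with the azure-storage-blob SDK. The connection string, container, folder prefix, and 30-day cutoff are illustrative assumptions; in ADF itself you configure this declaratively on the delete activity rather than writing code.

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobServiceClient

# Hypothetical connection string and staging container.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
staging = service.get_container_client("staging")

# Delete files last modified more than 30 days ago, mirroring the
# "expired files only" option on the delete activity.
cutoff = datetime.now(timezone.utc) - timedelta(days=30)
for blob in staging.list_blobs(name_starts_with="landing/"):
    if blob.last_modified < cutoff:
        staging.delete_blob(blob.name)
        print(f"Deleted {blob.name}")
```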

You are encouraged to give these additions a try and provide us with feedback. We hope you find them helpful in your scenarios. Please post your questions on Azure Data Factory forum or share your thoughts with us on Data Factory feedback site.

Enabling customers’ hybrid strategy with new Microsoft innovation

Customers who are taking a hybrid cloud approach are seeing real business value – I see this in organizations across the globe. The ability for customers to embrace both public cloud and local datacenter, plus edge capability, is enabling customers to improve their IT agility and maximize efficiency. The benefit of a hybrid approach is also what continues to bring customers to Azure, the one cloud that has been uniquely built for hybrid. We haven’t slowed our investment in enabling a hybrid strategy, particularly as this evolves into the new application pattern of using intelligent cloud and intelligent edge.

Before I dive into what’s new, I want to take a moment to share why Microsoft is so passionate about enabling a hybrid approach. It stems from a deep understanding of our customers and their businesses over the past several decades. We want every organization on the planet to benefit from cloud innovation. Fundamentally, hybrid enables every organization to participate in this technology transformation. Beyond this, we see the leading experiences enabled by tapping into both the intelligent cloud and intelligent edge, creating optimized experiences for literally every use case. 

Today, I’m pleased to share some new products and updates to our Azure hybrid portfolio that ultimately help address a wider range of customer needs.

Expanding the Azure Stack family

Since Azure Stack became globally available in 2017, it has captivated customers as a unique offering to build and run cloud-native applications with consistent Azure services on-premises, including in disconnected locations. Today, Azure Stack is available in 92 countries with 8 announced hardware partners, and customers like Airbus Defense & Space, KPMG Norway, and iMOKO are using it to enable hybrid and edge scenarios. We continue to expand Azure Stack offerings to meet a broader set of customer needs, so customers can also run virtualized applications in their own datacenters.

Introducing Azure Stack HCI Solutions. When considering their full application portfolio, customers want to upgrade a set of existing applications to run on modern hyperconverged infrastructure (HCI) for efficiency and performance gains. And, while this is often an approach for “legacy” applications, in moving to HCI, customers can now also benefit from a hybrid cloud HCI solution. We are bringing our existing HCI technology into the Azure Stack family for customers to run virtualized applications on-premises with direct access to Azure management services such as backup and disaster recovery. Azure Stack HCI solutions feature the same software-defined compute, storage, and networking technology as the existing Azure Stack, and include simplified cloud access via the Azure hybrid services in Windows Admin Center. Azure Stack HCI is the newest solution to join the broad set of hybrid capabilities from Azure. You can learn more about Azure Stack HCI in Arpan Shah’s blog.

More customer options for Azure Stack. The Azure Stack ecosystem is growing in more ways than one today. Dell EMC Tactical Microsoft Azure Stack is now generally available, bringing Azure-consistent cloud to operating environments where network connectivity is limited and/or mobility and high portability are required – in remote and rugged circumstances. This new offering enables Azure Stack to support an even wider range of use-cases, while still delivering a consistent hybrid cloud approach.

Delivering more innovation at the edge

Hybrid cloud is evolving from being only the integration of a datacenter with the public cloud, to becoming units of computing available at the edge, including even the world’s most remote destinations, working in concert with public cloud. You can read more about this new era we call intelligent cloud and intelligent edge in an earlier blog. What’s compelling about the intelligent edge is many of the same patterns and principles for hybrid applications apply to edge applications. That means investments our customers make today in a hybrid cloud, and the skillsets developed from running these hybrid applications, position them to take advantage of edge computing and tools that can yield even bigger benefits in the future. Toward the goal of helping customers tap into this potential, I am very excited to announce some new edge capabilities.

Azure Data Box Edge is now available. Today marks the general availability of Azure Data Box Edge. We previewed the Data Box Edge appliance with edge compute and network data transfer capabilities last September. Data Box Edge provides a cloud-managed compute platform for containers at the edge, enabling customers to process data at the edge and accelerate machine learning workloads, powered by Azure Machine Learning and an Intel Arria 10 FPGA. Data Box Edge also enables customers to transfer data over the internet to Azure in real time for deeper analytics, model retraining at cloud scale, or long-term storage, as does the Azure Data Box Gateway virtual appliance that is also available today. You can read more about both, and how customers like Cree and Esri are already using Data Box Edge, in Dean Paron’s blog.

This week: Join me for a Hybrid Cloud Virtual Event

You can learn more about Azure hybrid cloud, and in the meantime, I hope you’ll join me later this week as I host a virtual event focused on hybrid cloud for enterprises. I’m looking forward to talking with customers, analysts and technical leaders about how they’re approaching hybrid cloud – including what’s working and what they’ve learned. We’ll cover topics like application innovation, security and governance, migration, edge computing – and more. The event is Thursday, March 28, 2019 at 8:00 AM Pacific Time, and you can register here.


Announcing Azure Stack HCI: A new member of the Azure Stack family

It has been inspiring to watch how customers use Azure Stack to innovate and drive digital transformation across cloud boundaries. In her blog post today, Julia White shares examples of how customers are using Azure Stack to innovate on-premises using Azure services. Azure Stack shipped in 2017, and it is the only solution in the market today for customers to run cloud applications using consistent IaaS and PaaS services across public cloud, on-premises, and in disconnected environments. While customers love the fact that they can run cloud applications on-premises with Azure Stack, we understand that most customers also run important parts of their organization on traditional virtualized applications. Now we have a new option to deliver cloud efficiency and innovation for these workloads as well.

Today, I am pleased to announce Azure Stack HCI solutions are available for customers who want to run virtualized applications on modern hyperconverged infrastructure (HCI) to lower costs and improve performance. Azure Stack HCI solutions feature the same software-defined compute, storage, and networking software as Azure Stack, and can integrate with Azure for hybrid capabilities such as cloud-based backup, site recovery, monitoring, and more.

Adopting hybrid cloud is a journey and it is important to have a strategy that takes into account different workloads, skillsets, and tools. Microsoft is the only leading cloud vendor that delivers a comprehensive set of hybrid cloud solutions, so customers can use the right tool for the job without compromise. 

Choose the right option for each workload

Azure Stack HCI: Use existing skills, gain hyperconverged efficiency, and connect to Azure

Azure Stack HCI solutions are designed to run virtualized applications on-premises in a familiar way, with simplified access to Azure for hybrid cloud scenarios. This is a perfect solution for IT to leverage existing skills to run virtualized applications on new hyperconverged infrastructure while taking advantage of cloud services and building cloud skills.

Customers that deploy Azure Stack HCI solutions get amazing price/performance with Hyper-V and Storage Spaces Direct running on the most current industry-standard x86 hardware. Azure Stack HCI solutions include support for the latest hardware technologies like NVMe drives, persistent memory, and remote-direct memory access (RDMA) networking.

IT admins can also use Windows Admin Center for simplified integration with Azure hybrid services to seamlessly connect to Azure for:

  • Azure Site Recovery for high availability and disaster recovery as a service (DRaaS).
  • Azure Monitor, a centralized hub to track what’s happening across your applications, network, and infrastructure – with advanced analytics powered by AI.
  • Cloud Witness, to use Azure as the lightweight tie breaker for cluster quorum.
  • Azure Backup for offsite data protection and to protect against ransomware.
  • Azure Update Management for update assessment and update deployments for Windows VMs running in Azure and on-premises.
  • Azure Network Adapter to connect resources on-premises with your VMs in Azure via a point-to-site VPN.
  • Azure Security Center for threat detection and monitoring for VMs running in Azure and on-premises (coming soon).

Buy from your choice of hardware partners

Azure Stack HCI solutions are available today from 15 partners offering Microsoft-validated hardware systems to ensure optimal performance and reliability. Your preferred Microsoft partner gets you up and running without lengthy design and build time and offers a single point of contact for implementation and support services. 

How to buy Azure Stack HCI solutions

Visit our website to find more than 70 Azure Stack HCI solutions currently available from these Microsoft partners: ASUS, Axellio, bluechip, DataON, Dell EMC, Fujitsu, HPE, Hitachi, Huawei, Lenovo, NEC, primeLine Solutions, QCT, SecureGUARD, and Supermicro.

Learn more

We know that a great hybrid cloud strategy is one that meets you where you are, delivering cloud benefits to all workloads wherever they reside. Check out these resources to learn more about Azure Stack HCI and our other Microsoft hybrid offerings:

FAQ

What do Azure Stack and Azure Stack HCI solutions have in common?

Azure Stack HCI solutions feature the same Hyper-V based software-defined compute, storage, and networking technologies as Azure Stack. Both offerings meet rigorous testing and validation criteria to ensure reliability and compatibility with the underlying hardware platform.

How are they different?

With Azure Stack, you can run Azure IaaS and PaaS services on-premises to consistently build and run cloud applications anywhere.

Azure Stack HCI is a better solution to run virtualized workloads in a familiar way – but with hyperconverged efficiency – and connect to Azure for hybrid scenarios such as cloud backup, cloud-based monitoring, etc.

Why is Microsoft bringing its HCI offering to the Azure Stack family?

Microsoft’s hyperconverged technology is already the foundation of Azure Stack.

Many Microsoft customers have complex IT environments and our goal is to provide solutions that meet them where they are with the right technology for the right business need. Azure Stack HCI is an evolution of Windows Server Software-Defined (WSSD) solutions previously available from our hardware partners. We brought it into the Azure Stack family because we have started to offer new options to connect seamlessly with Azure for infrastructure management services.

Will I be able to upgrade from Azure Stack HCI to Azure Stack?

No, but customers can migrate their workloads from Azure Stack HCI to Azure Stack or Azure.

How do I buy Azure Stack HCI solutions?

Follow these steps:

  1. Buy a Microsoft-validated hardware system from your preferred hardware partner.
  2. Install Windows Server 2019 Datacenter edition and Windows Admin Center for management and the ability to connect to Azure for cloud services.
  3. Optionally, use your Azure account to attach management and security services to your workloads.

How does the cost of Azure Stack HCI compare to Azure Stack?

This depends on many factors.

Azure Stack is sold as a fully integrated system including services and support. It can be purchased as a system you manage, or as a fully managed service from our partners. In addition to the base system, the Azure services that run on Azure Stack or Azure are sold on a pay-as-you-use basis.

Azure Stack HCI solutions follow the traditional purchasing model. Validated hardware can be purchased from Azure Stack HCI partners, and software (Windows Server 2019 Datacenter edition with software-defined datacenter capabilities and Windows Admin Center) can be purchased from various existing channels. For the Azure services that you can use with Windows Admin Center, you pay with an Azure subscription.

We recommend working with your Microsoft partner or account team for pricing details.

What is the future roadmap for Azure Stack HCI solutions?

We’re excited to hear customer feedback and will take that into account as we prioritize future investments.

Blob storage interface on Data Box is now generally available

The blob storage interface on the Data Box has been in preview since September 2018, and we are happy to announce that it's now generally available. This is in addition to the server message block (SMB) and network file system (NFS) interfaces already generally available on the Data Box.

The blob storage interface allows you to copy data into the Data Box via REST. In essence, this interface makes the Data Box appear like an Azure storage account. Applications that write to Azure blob storage can be configured to work with the Azure Data Box in exactly the same way. 

This enables very interesting scenarios, especially for big data workloads. Migrating large HDFS stores to Azure as part of an Apache Hadoop® migration is a popular ask. Using the blob storage interface of the Data Box, you can now point common copy tools like DistCp directly at the Data Box and access it as though it were another HDFS file system. Since most Hadoop installations come pre-loaded with the Azure Storage driver, you will most likely not have to make changes to your existing infrastructure to use this capability. Another key benefit of migrating via the blob storage interface is that you can choose to preserve metadata. For more details on migrating HDFS workloads, please review the Using Azure Data Box to migrate from an on-premises HDFS store documentation.
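
Because the device surfaces a standard blob endpoint, ordinary Azure Storage SDK code can target it unchanged. The following is a minimal sketch using the azure-storage-blob Python package; the endpoint URL, account name, access key, and file names are placeholder assumptions for the values shown in the Data Box local web UI.

```python
from azure.storage.blob import BlobServiceClient

# Hypothetical Data Box blob endpoint; the device behaves like a
# storage account, so the standard SDK needs no code changes.
service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.mydatabox.microsoftdatabox.com",
    credential={
        "account_name": "mystorageaccount",
        "account_key": "<device-access-key>",
    },
)
container = service.get_container_client("hdfs-migration")

# Upload one file; at scale you would point a tool like DistCp
# at the same endpoint instead.
with open("part-00000.parquet", "rb") as data:
    container.upload_blob(name="warehouse/part-00000.parquet", data=data)
```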

Blob storage on the Data Box enables partner solutions using native Azure blob storage to write directly to the Data Box. With this capability, partners like Veeam, Rubrik, and DefendX were able to utilize the Data Box to assist customers moving data to Azure.

For a full list of supported partners please visit the Data Box partner page.

For more details on using blob storage with Data Box, please see our official documentation for Azure Data Box Blob Storage requirements and a tutorial on copying data via Azure Data Box Blob Storage REST APIs.

Accelerated AI with Azure Machine Learning service on Azure Data Box Edge

FPGA acceleration at the edge

Along with the general availability of Azure Data Box Edge that was announced today, we are announcing the preview of Azure Machine Learning hardware accelerated models on Data Box Edge. The majority of the world’s data in real-world applications is used at the edge. For example, images and videos collected from factories, retail stores, or hospitals are used for manufacturing defect analysis, inventory out-of-stock detection, and diagnostics. Applying machine learning models to the data on Data Box Edge provides lower latency and savings on bandwidth costs, while enabling real-time insights and speed to action for critical business decisions.

Azure Machine Learning service is already a generally available, end-to-end, enterprise-grade, and compliant data science platform. Azure Machine Learning service enables data scientists to simplify and accelerate the building, training, and deployment of machine learning models. All these capabilities are accessed from your favorite Python environment using the latest open-source frameworks, such as PyTorch, TensorFlow, and scikit-learn. These models can run today on CPUs and GPUs, but this preview expands that to field programmable gate arrays (FPGA) on Data Box Edge.

What is in this preview?

This preview enhances Azure Machine Learning service by enabling you to train a TensorFlow model for image classification scenarios, containerize the model in a Docker container, and then deploy the container to a Data Box Edge device with Azure IoT Hub. Today we support ResNet 50, ResNet 152, DenseNet-121, and VGG-16. The model is accelerated by the ONNX runtime on an Intel Arria 10 FPGA that is included with every Data Box Edge.
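
To make the scoring step concrete, here is a minimal sketch of running one of these image-classification models with the ONNX runtime from Python. The model file name, input shape, and random-pixel input are illustrative assumptions; on Data Box Edge the same runtime dispatches to the FPGA rather than the default CPU execution provider used here.

```python
import numpy as np
import onnxruntime as ort

# "resnet50.onnx" is a placeholder for an exported model file.
session = ort.InferenceSession("resnet50.onnx")
input_name = session.get_inputs()[0].name

# One 224x224 RGB image in NCHW layout; real code would load and
# normalize a camera frame instead of random pixels.
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
scores = session.run(None, {input_name: image})[0]
print("Predicted class index:", int(np.argmax(scores)))
```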

Why does this matter?

Over the years, AI has been infused in our everyday lives and in industry. Smart home assistants understand what we say, and social media services can tag who’s in the picture we uploaded. Most, if not all, of this is powered by deep neural networks (DNNs), which are sophisticated algorithms that process unstructured data such as images, speech, and text. DNNs are also computationally expensive. For example, it takes almost 8 billion calculations to analyze one image using ResNet 50, a popular DNN.
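
A rough back-of-the-envelope calculation using that ~8 billion-operation figure shows why hardware throughput dominates inference latency. The sustained throughputs below are illustrative assumptions only, not measured benchmarks for any particular device:

```python
# ~8e9 operations for one ResNet-50 image (figure from the text).
ops_per_image = 8e9

# Illustrative sustained throughputs in operations per second.
devices = {
    "CPU (~0.2 TOPS)": 0.2e12,
    "FPGA (~2 TOPS)": 2e12,
    "GPU (~10 TOPS)": 10e12,
}
for name, ops_per_second in devices.items():
    latency_ms = ops_per_image / ops_per_second * 1e3
    print(f"{name}: {latency_ms:.0f} ms per image")
```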

There are many hardware options to run DNNs today, most commonly on CPUs and GPUs. Azure Machine Learning service brings customers the cutting-edge innovation that originated in Microsoft Research (featured in this recent Fast Company article), to run DNNs on reconfigurable hardware called FPGAs. By integrating this capability and the ONNX runtime in Azure Machine Learning service, we see vast improvements in the latencies of models.

Bringing it together

Azure Machine Learning service now brings the power of accelerated AI models directly to Data Box Edge. Let’s take the example of a manufacturing assembly line scenario, where cameras are photographing products at various stages of development.

The pictures are sent from the manufacturing line to Data Box Edge inside your factory, where AI models that were trained, containerized, and deployed to the FPGA using Azure Machine Learning service are available. Data Box Edge is registered with Azure IoT Hub, so you can control which models you want deployed. Now you have everything you need to process incoming pictures in near real-time to detect manufacturing defects. This enables the machines and assembly-line managers to make time-sensitive decisions about the products, improving product quality and decreasing downstream production costs.

Join the preview

Azure Machine Learning service is already generally available today. To join the preview for containerization of hardware accelerated AI models, fill out the request form and get support on our forum.

Azure Data Box Family meets customers at the edge

Today I am pleased to announce the general availability of Azure Data Box Edge and the Azure Data Box Gateway. You can get these products today in the Azure portal.

Compute at the edge

We’ve heard your need to bring Azure compute power closer to you – a trend increasingly referred to as edge computing. Data Box Edge answers that call and is an on-premises anchor point for Azure. Data Box Edge can be racked alongside your existing enterprise hardware or live in non-traditional environments from factory floors to retail aisles. With Data Box Edge, there's no hardware to buy; you sign up and pay-as-you-go just like any other Azure service, and the hardware is included.

This 1U rack-mountable appliance from Microsoft brings you the following:

  • Local Compute – Run containerized applications at your location. Use these to interact with your local systems or to pre-process your data before it transfers to Azure.
  • Network Storage Gateway – Automatically transfer data between the local appliance and your Azure Storage account. Data Box Edge caches the hottest data locally and speaks file and object protocols to your on-premises applications.
  • Azure Machine Learning utilizing an Intel Arria 10 FPGA - Use the on-board Field Programmable Gate Array (FPGA) to accelerate inferencing of your data, then transfer it to the cloud to re-train and improve your models. Learn more about the Azure Machine Learning announcement.
  • Cloud managed – Easily order your device and manage these capabilities for your fleet from the cloud using the Azure Portal.

Since announcing Preview at Ignite 2018 just a few months ago, it has been amazing to see how our customers across different industries are using Data Box Edge to unlock some innovative scenarios:

 

Kroger

Sunrise Technology, a wholly owned division of The Kroger Co., plans to use Data Box Edge to enhance the Retail as a Service (RaaS) platform for Kroger and the retail industry to enable the features announced at NRF 2019: Retail's Big Show, including personalized, never-before-seen shopping experiences like at-shelf product recommendations, guided shopping and more. The live video analytics on Data Box Edge can help store employees identify and address out-of-stocks quickly and enhance their productivity. Such smart experiences will help retailers provide their customers with more personalized, rewarding experiences.

Esri

Esri, a leader in location intelligence, is exploring how Data Box Edge can help those responding to disasters in disconnected environments. Data Box Edge will allow teams in the field to collect imagery captured from the air or ground and turn it into actionable information that provides updated maps. The teams in the field can use updated maps to coordinate response efforts even when completely disconnected from the command center. This is critical in improving the response effectiveness in situations like wildfires and hurricanes.

Data Box Gateway – Hardware not required

Data Box Edge comes with a built-in storage gateway. If you don’t need the Data Box Edge hardware or edge compute, then the Data Box Gateway is also available as a standalone virtual appliance that can be deployed anywhere within your infrastructure.

You can provision it in your hypervisor, using either Hyper-V or VMware, and manage it through the Azure Portal. Server message block (SMB) or network file system (NFS) shares will be set up on your local network. Data landing on these shares will automatically upload to your Azure Storage account, supporting Block Blob, Page Blob, or Azure Files. We’ll handle the network retries and optimize network bandwidth for you. Multiple network interfaces mean the appliance can either sit on your local network or in a DMZ, giving your systems access to Azure Storage without having to open network connections to Azure.
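
Once a share is set up, landing data on the gateway is just a file copy. Here is a minimal sketch, assuming a hypothetical SMB share name and path exposed by the gateway:

```python
import shutil

# Hypothetical gateway SMB share; anything written here is uploaded
# to the mapped Azure Storage account by the gateway automatically.
destination = r"\\mydataboxgw\blobshare\backups"
shutil.copy("backup-2019-03-25.vhdx", destination)
```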

Whether you use the storage gateway inside of Data Box Edge or deploy the Data Box Gateway virtual appliance, the storage gateway capabilities are the same.

 

More solutions from the Data Box family

In addition to Data Box Edge and Data Box Gateway, we also offer three sizes of Data Box for offline data transfer:

  • Data Box – a ruggedized 100 TB transport appliance
  • Data Box Disk – a smaller, more nimble transport option with individual 8 TB disks and up to 40 TB per order
  • Data Box Heavy (preview) – a bigger version of Data Box that can scale to 1 PB.

All Data Box offline transport products are available to order through the Azure Portal. We ship them to you and then you fill them up and ship them back to our data center for upload and processing. To make Data Box useful for even more customers, we’re enabling partners to write directly to Data Box with little required change to their software via our new REST API feature which has just reached general availability – Blob Storage on Data Box!

Get started

Thank you for partnering with us on our journey to bring Azure to the edge. We are excited to see how you use these new products to harness the power of edge computing for your business. Here’s how you can get started:

  • Order Data Box Edge or the Data Box Gateway today via the Azure portal.
  • Review server hardware specs on the Data Box Edge datasheet.
  • Learn more about our family of Azure Data Box products.

New updates to Azure AI expand AI capabilities for developers

As companies increasingly look to transform their businesses with AI, we continue to add improvements to Azure AI to make it easy for developers and data scientists to deploy, manage, and secure AI functions directly into their applications with a focus on the following solution areas:

  1. Leveraging machine learning to build and train predictive models that improve business productivity with Azure Machine Learning.
  2. Applying an AI-powered search experience and indexing technologies that quickly find information and glean insights with Azure Search.
  3. Building applications that integrate pre-built and custom AI capabilities like vision, speech, language, search, and knowledge to deliver more engaging and personalized experiences with our Azure Cognitive Services and Azure Bot Service.

Today, we’re pleased to share several updates to Azure Cognitive Services that continue to make Azure the best place to build AI. We’re introducing a preview of the new Anomaly Detector service, which uses AI to identify problems so companies can minimize loss and customer impact. We are also announcing the general availability of Custom Vision to more accurately identify objects in images.

From speech recognition, translation, and text-to-speech to image and object detection, Azure Cognitive Services makes it easy for developers to add intelligent capabilities to their applications in any scenario. To date, more than a million developers have discovered and tried Cognitive Services to accelerate breakthrough experiences in their applications.

Anomaly detection as an AI service

Anomaly Detector is a new Cognitive Service that lets you detect unusual patterns or rare events in your data that could translate to identifying problems like credit card fraud.

Today, over 200 teams across Azure and other core Microsoft products rely on Anomaly Detector to boost the reliability of their systems by detecting irregularities in real-time and accelerating troubleshooting. Through a single API, developers can easily embed anomaly detection capabilities into their applications to ensure high data accuracy, and automatically surface incidents as soon as they happen.
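
As a sketch of that single API, here is a minimal Python call to the service’s REST endpoint using the requests library. The endpoint, key, and series values are placeholder assumptions, and the exact route may differ by API version; the service expects a minimum history window, so twelve hourly points are supplied below.

```python
import requests

# Hypothetical resource endpoint and subscription key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
url = f"{endpoint}/anomalydetector/v1.0/timeseries/last/detect"
headers = {"Ocp-Apim-Subscription-Key": "<your-key>"}

# Illustrative hourly metric values, with a spike in the last point.
values = [32.1, 31.8, 33.0, 32.4, 31.9, 32.7, 33.1, 32.2, 31.7, 32.9, 32.5, 95.0]
body = {
    "granularity": "hourly",
    "series": [
        {"timestamp": f"2019-03-25T{hour:02d}:00:00Z", "value": v}
        for hour, v in enumerate(values)
    ],
}

response = requests.post(url, headers=headers, json=body)
print("Last point is an anomaly:", response.json().get("isAnomaly"))
```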

Common use case scenarios include identifying business incidents and text errors, monitoring IoT device traffic, detecting fraud, responding to changing markets, and more. For instance, content providers can use Anomaly Detector to automatically scan video performance data specific to a customer’s KPIs, helping to identify problems in an instant. Alternatively, video streaming platforms can apply Anomaly Detector across millions of video data sets to track metrics. A missed second in video performance can translate to significant revenue loss for content providers that monetize on their platform.

Custom Vision: automated machine learning for images

With the general availability of Custom Vision, organizations can also transform their business operations by quickly and accurately identifying objects in images.

Powered by machine learning, Custom Vision makes it easy and fast for developers to build, deploy, and improve custom image classifiers to quickly recognize content in imagery. Developers can train their own classifier to recognize what matters most in their scenarios, or export these custom classifiers to run them offline and in real time on iOS (in CoreML), Android (in TensorFlow), and many other devices on the edge. The exported models are optimized for the constraints of a mobile device providing incredible throughput while still maintaining high accuracy.

Today, Custom Vision can be used for a variety of business scenarios. Minsur, the largest tin mine in the western hemisphere, located in Peru, applies Custom Vision to create a sustainable mining practice by ensuring that water used in the mineral extraction process is properly treated for reuse on agriculture and livestock by detecting treatment foam levels. They used a combination of Cognitive Services Custom Vision and Azure video analytics to replace a highly manual process so that employees can focus on more strategic projects within the operation.

Screenshot of the Custom Vision platform, where you can train the model to detect unique objects in an image, such as your brand’s logo.

Starting today, Custom Vision delivers the following improvements:

  • High quality models – Custom Vision features advanced training with a new machine learning backend for improved performance, especially on challenging datasets and fine-grained classification. With advanced training, you can specify a compute time budget, and Custom Vision will experimentally identify the best training and augmentation settings.
  • Iterate with ease – Custom Vision makes it simple for developers to integrate computer vision capabilities into applications with the 3.0 REST APIs and SDKs (see the sketch after this list). The end-to-end pipeline is designed to support the iterative improvement of models, so you can quickly train a model, prototype in real-world conditions, and use the resulting data to improve the model, which gets models to production quality faster.
  • Train in the cloud, run anywhere – The exported models are optimized for the constraints of a mobile device, providing incredible throughput while still maintaining high accuracy. Now, you can also export classifiers to support ARM devices such as the Raspberry Pi 3 and the Vision AI Dev Kit.
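
As a hedged illustration of the SDK surface, the sketch below uses the azure-cognitiveservices-vision-customvision Python package (3.x-style constructor) to create a project, tag one image, and start a training iteration. The endpoint, training key, project name, tag, and image file are all placeholder assumptions.

```python
from azure.cognitiveservices.vision.customvision.training import (
    CustomVisionTrainingClient,
)
from msrest.authentication import ApiKeyCredentials

# Hypothetical endpoint and training key for a Custom Vision resource.
endpoint = "https://southcentralus.api.cognitive.microsoft.com"
credentials = ApiKeyCredentials(in_headers={"Training-key": "<training-key>"})
trainer = CustomVisionTrainingClient(endpoint, credentials)

# Create a classification project with one tag and one labeled image.
project = trainer.create_project("shelf-audit")
logo_tag = trainer.create_tag(project.id, "brand-logo")
with open("shelf-photo.jpg", "rb") as image:
    trainer.create_images_from_data(
        project.id, image.read(), tag_ids=[logo_tag.id]
    )

# Start a training iteration; poll iteration.status until "Completed".
iteration = trainer.train_project(project.id)
print("Training iteration:", iteration.id, iteration.status)
```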

For more information, visit the Custom Vision Service Release Notes.

Get started today

Today’s milestones illustrate our commitment to make the Azure AI platform suitable for every business scenario, with enterprise-grade tools that simplify application development, and industry leading security and compliance for protecting customers’ data.

To get started building vision and search intelligent apps, please visit the Cognitive Services site.
