
Deploy a FHIR sandbox in Azure


This blog post was authored by Michael Hansen, Senior Program Manager, Microsoft Azure.

In connection with HIMSS 2019, we announced the Azure API for FHIR, which provides our customers with an enterprise grade, managed FHIR® API in Azure. Since then, we have been busy improving the service with new configuration options and features. Some of the features we have been working on include authentication configuration and the SMART on FHIR Azure Active Directory Proxy, which enable the so-called SMART on FHIR EHR launch with the Azure API for FHIR.

We have developed a sandbox environment that illustrates how the service and the configuration options are used. In this blog post, we focus on how to deploy the sandbox in Azure. Later blog posts will dive into some of the technical details of the various configuration options.

The Azure API for FHIR team maintains a GitHub repository with sample applications, kept up to date by the product engineering team to ensure it works with the latest features of the Azure API for FHIR. The repository contains a patient dashboard application, an Azure Function that loads patient data generated with Synthea, and example templates for SMART on FHIR applications:

Diagram displaying SMART on FHIR applications

Deployment instructions

The repository contains fully automated PowerShell scripts that you can use to deploy the sandbox scenario. The deployment script will create Azure Active Directory application registrations and a test user. If you do not want to create these Azure Active Directory objects in the tenant associated with your Azure subscription, we recommend you create a separate Azure Active Directory tenant to use for data plane access control.

The deployment script is written for PowerShell and uses the AzureAd PowerShell module. If you don’t have access to PowerShell on your computer, you can use the Azure Cloud Shell. In the cloud shell, you can deploy the sandbox environment with:

# Clone source code repository
cd $HOME
git clone https://github.com/Microsoft/fhir-server-samples
cd fhir-server-samples/deploy/scripts

# Log in to Azure AD:
Connect-AzureAd -TenantDomain <mytenantdomain>.onmicrosoft.com

# Connect to Azure Subscription
Login-AzureRmAccount

# Select subscription
Select-AzureRmSubscription -SubscriptionName "Name of your subscription"

# Deploy Sandbox
.\Create-FhirServerSamplesEnvironment.ps1 -EnvironmentName <NameOfEnvironment> -EnvironmentLocation westus2 -AdminPassword $(ConvertTo-SecureString -AsPlainText -Force "MySuperSecretPassword")

It will take around 5 minutes to deploy the environment. The deployment script will create a resource group with the same name as the environment. In there, you will find all the resources associated with the sandbox.

Loading synthetic data

The environment resource group will contain a storage account with a container named "fhirimport." If you upload Synthea patient bundles to this container, they will be ingested into the FHIR server.
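If you have generated patient bundles with Synthea locally, one way to upload them is with the Azure PowerShell storage cmdlets. This is only a sketch: it assumes a single storage account in the environment resource group and that your Synthea output sits in the default output folder.

# Find the storage account created by the deployment (assumes one storage account in the resource group)
$storageAccount = Get-AzureRmStorageAccount -ResourceGroupName "<NameOfEnvironment>" | Select-Object -First 1

# Upload every Synthea-generated FHIR bundle to the import container
Get-ChildItem .\synthea\output\fhir\*.json | ForEach-Object {
    Set-AzureStorageBlobContent -File $_.FullName -Container "fhirimport" -Context $storageAccount.Context
}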

Using the patient dashboard

There are two versions of the patient dashboard, located at:

  • https://<NameOfEnvironment>dash.azurewebsites.net: This is an ASP.NET patient dashboard. The GitHub repository contains the source code for this patient dashboard.
  • https://<NameOfEnvironment>js.azurewebsites.net: This is a single page JavaScript application. The source code is also in the GitHub repository.

When you navigate to either of those URLs, you will be prompted to log in. The administrator user is created by the deployment script and will have the username <NameOfEnvironment>-admin@<mytenantdomain>.onmicrosoft.com and the password you chose during deployment. If you have uploaded some patients using the Synthea uploader, you should be able to display a list of patients. The image below shows the view in the JavaScript dashboard.

JavaScript dashboard

You can click details on a specific patient to get more information:

Details for a specific patient in the JavaScript dashboard

You can also use the links for the SMART on FHIR applications to launch the growth chart application for this patient:

SMART on FHIR applications showing the growth chart application of a patient

The sandbox provides other useful tools. As an example, the "About me" link will provide you with details about the FHIR endpoint, including a token that can be used to access the FHIR API using tools like Postman.
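For example, once you have copied the token from the "About me" page, a quick way to exercise the FHIR API without Postman is a single Invoke-RestMethod call. The endpoint pattern below is an assumption; use the FHIR server URL shown on the "About me" page:

# Hypothetical values: paste the token and FHIR endpoint from the "About me" page
$token   = "<token from the About me page>"
$fhirUrl = "https://<NameOfEnvironment>.azurehealthcareapis.com"

# List patients on the FHIR server
Invoke-RestMethod -Uri "$fhirUrl/Patient" -Headers @{ Authorization = "Bearer $token" }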

Deleting the sandbox

When you are done exploring the Azure API for FHIR and the FHIR sandbox, it is easily deleted with:

.\Delete-FhirServerSamplesEnvironment.ps1 -EnvironmentName <NameOfEnvironment>

FHIR® is the registered trademark of HL7 and is used with the permission of HL7.



Azure Stack IaaS – part seven


If you do it often, automate it

In the virtualization days, before cloud and self-service, it took a while to get all the approvals, credentials, virtual LANs (VLANs), logical unit numbers (LUNs), etc. It took so long that the actual creation part was easy. When cloud came along with self-service, not only was it easier to create a virtual machine (VM) without relying on others, but it changed our thinking about whether VMs were precious or disposable. At the same time, developers were moving to a world of continuous delivery to serve their customers, who expect apps with a constant stream of new features. This is the reason all real clouds provide automation APIs to quickly create VMs and other resources, including the infrastructure they rely on. This is often called "Infrastructure as code." Azure's API is governed by the Azure Resource Manager (ARM). When you set up Azure Stack, you get your own private instance of ARM.

In this blog post I will cover the automation options in your Cloud IaaS toolkit.

Azure portal

The best place to learn this is from first completing a VM deployment through the portal. Azure Stack provides the same portal as Azure. When you get to the last confirmation page, click the Download Template link. This will show you the template that will be used to deploy the VM you just specified on the previous screens.


As you look through the template – which is in JSON format – you can see how the virtual machine, network, and storage are defined. You'll see an OS profile and hardware profile for your VM. Using this template, you could deploy this same VM over and over. This is perfect if you're helping your developers with Continuous Integration and Delivery (CI/CD).
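For illustration, here is a heavily trimmed sketch of the shape you will see. This is not a deployable template; the names, VM size, and API version are placeholders, and pieces such as the network interface and storage profile are omitted:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "adminUsername": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Compute/virtualMachines",
      "apiVersion": "2017-03-30",
      "name": "demo-vm",
      "location": "[resourceGroup().location]",
      "properties": {
        "hardwareProfile": { "vmSize": "Standard_D2_v2" },
        "osProfile": {
          "computerName": "demo-vm",
          "adminUsername": "[parameters('adminUsername')]"
        }
      }
    }
  ]
}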


One thing to note about an ARM template is that it is not a procedural script like you might get with standard automation tools. The template creates the resources as a single deployment transaction: ARM will either reach the goal state or the deployment will fail. If it fails, you have a chance to fix the failure condition and move forward, or delete the deployment and start over. This way you don't get something other than what you specified.

Once you save your template, you can redeploy it in the portal, using the Template deployment item in the marketplace:


Learn more:
Azure Resource Manager overview
Quickstart: Create templates using the Azure Portal
Deploy resources in the portal using a custom template

Visual Studio and Visual Studio Code

Visual Studio can be scary for an infrastructure person, but if you can master some simple things, you can really help your team down the road of automation. The biggest thing that Visual Studio provides is real-time feedback on whether you're authoring the ARM JSON template with correct syntax. You don't get this in Notepad. Get started by creating a new Azure Resource Group project:


A great way to start authoring a template is to use Quickstart templates. Azure Stack has a number of Quickstart templates you can use. When you start your new project in Select Azure Template, pick Azure Stack Quickstart from the drop-down list.


Visual Studio not only helps you author the template, it allows you to deploy the template directly to Azure Stack. When you sign into Visual Studio, all of your subscriptions in both Azure and Azure Stack show up as deployment targets. Since I use a number of Azure Stack environments, I have lots of deployment options:


Another way to create these templates is in Visual Studio Code. The ARM template is a JSON file, so you need a good lightweight authoring tool to work on the JSON file. Visual Studio Code (VS Code) is a great option.

After installing VS Code, you need to add the Azure Resource Manager Tools extension. This extension adds features that simplify template authoring, helping you manage variables, parameters, and resource blocks, with formatting and color coding. Check it out in the image below:


Learn more:
Install Visual Studio and Connect to Azure Stack
Deploy templates to Azure Stack using Visual Studio
Create a template in Visual Studio
Create a template in Visual Studio Code

PowerShell and Azure command-line interface

The Portal and Visual Studio are both deployment options for your template. PowerShell and the Azure command-line interface (CLI) are two other options. Since Azure Stack is your own private instance of the Azure Resource Manager, you need to connect to your unique instance.

To connect to Azure Stack with PowerShell, use the Add-AzureRMEnvironment cmdlet, specifying the ARM endpoint for your environment. Depending on how your Azure Stack environment has been set up, you will authenticate using either your Azure Active Directory credentials or your organization's Active Directory Federation Services (referred to in the documentation as AD FS).
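As a minimal sketch, the connection flow looks like this. The ARM endpoint shown is the default for an Azure Stack Development Kit; substitute the endpoint your operator provides:

# Register your Azure Stack instance as a PowerShell environment
Add-AzureRMEnvironment -Name "AzureStackUser" -ArmEndpoint "https://management.local.azurestack.external"

# Sign in against that environment (Azure AD shown; AD FS deployments use a different flow)
Connect-AzureRmAccount -EnvironmentName "AzureStackUser"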

Azure CLI is Microsoft's cross-platform command-line experience for managing Azure resources. It works on Mac, Linux, and Windows. In Azure you can run the CLI in Cloud Shell. You can use Azure CLI to connect to Azure Stack and deploy IaaS templates. Just like PowerShell, you need to first connect to your Azure Stack’s unique ARM endpoint. For CLI you use the az cloud register command. To deploy your ARM template you use the az group deployment create command.
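A minimal sketch of that flow follows. The endpoint is again the ASDK default, and the resource group, location, and template file names are placeholders:

# Register and select your Azure Stack instance as a cloud
az cloud register -n AzureStackUser --endpoint-resource-manager "https://management.local.azurestack.external"
az cloud set -n AzureStackUser
az cloud update --profile 2018-03-01-hybrid   # API profile supported by Azure Stack
az login

# Deploy an ARM template into a resource group
az group create -n demo-rg -l local
az group deployment create -g demo-rg --template-file azuredeploy.json --parameters azuredeploy.parameters.json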


Please note: We have not implemented Cloud Shell on Azure Stack yet. Cloud Shell allows you to run the Azure CLI directly inside your browser. If you would like to see this, make sure you put in a vote on Azure Cloud Shell UserVoice.

Learn more:
Install PowerShell for Azure Stack
Connect to Azure Stack with PowerShell
Deploy in Azure Stack with PowerShell
Use Azure CLI on Azure Stack
Deploy Azure Resource Manager Templates with Azure CLI

The cloud is for automation

The more people use clouds like Azure and Azure Stack, the less they use the portal and the more they use automation. There is an automation option in Azure and Azure Stack for all your needs. Say goodbye to virtualization and say hello to Cloud IaaS with your automation toolkit.

In this blog series

We hope you come back to read future posts in this blog series. Here are some of our past and upcoming topics:

Best practices in migrating SAP applications to Azure – part 3


This is the third blog in a three-part blog post series on best practices in migrating SAP to Azure.

BWoH and BW/4HANA on Azure

For many SAP customers, the compelling event driving their migration of SAP HANA to the cloud has come from two factors:

  • End of life of first-generation SAP HANA appliances, causing customers to re-evaluate their platform.
  • The desire to take advantage of the early value proposition of SAP Business Warehouse (BW) on HANA in a flexible TDI model over traditional databases and later BW/4HANA.

As a result, numerous initial migrations of SAP HANA to Microsoft Azure have focused on SAP BW to take advantage of SAP HANA’s in-memory capability for the BW workload. This means migration of the BW application to utilize SAP HANA at the database layer, and eventually the more involved migration of BW on HANA to BW/4HANA.

The SAP Database Migration Option (DMO) with System Move option of SUM, used as part of the migration, allows customers to perform the migration in a single step from the source system on-premises to the target system residing in Microsoft Azure, minimizing overall downtime.

As with the S/4HANA example my colleague Marshal used in "Best practices in migrating SAP applications to Azure – part 2," the move of traditional BW on AnyDB from a customer datacenter to BW on HANA, and eventually to BW/4HANA, can be accomplished in two steps. The hyper-scale and global reach of Microsoft Azure can provide you the flexibility to complete the migration while minimizing your capital investments.

SAP BW on HANA and BW/4HANA migrations

Figure 1: SAP BW on HANA and BW/4HANA migrations

Step one

The first step is a migration of SAP BW on AnyDB to SAP BW on HANA, running in Microsoft Azure. My colleague Bartosz Jarkowski has written an excellent blog describing the Database Migration Option (DMO) of SAP’s Software Update Manager (SUM).

This migration step can be accomplished using one of the HANA-certified SKUs listed in the certified and supported SAP HANA hardware directory – IaaS platforms, as the target server running SAP HANA.

Microsoft Azure today offers the most choice of HANA-certified SKUs of any hyper-scale cloud provider, ranging from 192 GiB to 20 TiB RAM, including 12 dedicated hardware options across OLTP and OLAP scenarios.

If customers opt for a SAP HANA OLAP scale-out configuration, Azure offers certified options up to 60 TiB in TDIv4 and up to 120 TB with a TDIv5 certification.

Many of our SAP HANA customers on Azure have already taken step one of the journey to BW/4HANA by executing the DMO migration to BW on HANA in Azure. This is often a scale-out scenario, sometimes on virtual machines and sometimes on the HANA Large Instance dedicated hardware.

This initial step has allowed customers to ready their cloud infrastructure such as network, virtual machines, and storage. It also helps enable operational capabilities and support models on Microsoft Azure such as backup/restore, data tiering, high availability, and disaster recovery.

Step two

Once customers have SAP BW on HANA running in Azure, the real benefits of hyper-scale cloud flexibility start to be realized. The BW/4HANA migration exercise can be tested iteratively, and in parallel if needed, while you pay only for what is used; the test environment can be thrown away once you have the migration recipe perfected and are ready to execute against production.

Run SAP HANA best on Microsoft Azure

The SAP BW on HANA and BW/4HANA scenarios, running on Microsoft Azure, are enabled through a combination of IaaS services, such as:

  • High-performing Azure Virtual Machines
  • Azure Premium Storage
  • Azure Accelerated Networking

These highly flexible, performant virtual machines (VMs) offer customers the ability to quickly spin up SAP HANA workloads and deploy infrastructure for n+0 scale-out configurations. Additionally, Azure offers customers the ability to scale OLAP workloads through the scale-out scenario on dedicated, purpose-built SAP HANA Large Instance hardware.

In these HANA Large Instance configurations, customers would install SAP HANA in an “N+1” configuration, which usually means:

  • 1 master node
  • N worker nodes
  • 1 standby node (optional)

While there are scale-out configurations with 50-plus worker nodes, practically speaking the majority of scale-out configurations will be in the three to 15 node range.

What does this look like on Microsoft Azure?

Adhering to both Microsoft and SAP’s guidance to ensure a performant and available infrastructure, it is important that we design the compute, network, and storage for SAP HANA, taking advantage of the Azure infrastructure at our disposal in the VM scenario:

  • Availability Sets
  • Accelerated Networking
  • Azure Virtual Networks

In the HANA Large Instance scenario, the use of dedicated, shared NFS storage and snapshots allows us to scale out with a standby node, ensuring availability for the BW system with minimal additional overhead.

Networking

SAP HANA scale-out configurations using NFS require three distinct network zones to isolate SAP HANA client traffic, SAP HANA intranode communication, and SAP HANA storage communication (NFS), ensuring predictable performance. This is illustrated below in a diagram of a sample 8-node SAP HANA system with a standby node (1+6+1).

SAP HANA scale-out client, intranode and storage network zones

Figure 2: SAP HANA scale-out client, intranode and storage network zones

We need to follow SAP recommended TDI practices when designing for SAP HANA:

  • Adequate and performant storage in proportion to the allocated RAM.
  • Correct configuration of NFS shared storage to ensure proper failover operations in case of n+1 configuration with HANA Large Instances.
  • Distinct network zones to isolate network traffic into client and intra-node zones.

These pre-defined HANA Large Instance compute SKUs, including the correct storage configuration according to the SAP Tailored Data Center Integration v4 specification (TDIv4), have been architected in advance and are ready for customers to use as needed. These SKUs can be adapted to specific needs resulting from customer-specific sizing.

Storage

SAP HANA Large Instance scale-out configurations in the TDI model use high-performance NFS shared storage to enable the standby node to take over for a failed node.

There is a minimum of four types of shared filesystems set up for SAP HANA installations in a TDI scale-out configuration of Azure HANA Large Instances:

Shared Filesystem            Description
/usr/sap/SID                 Low IO performance requirements
/hana/shared/SID             Low IO performance requirements
/hana/data/SID/mnt0000*      Medium / High IO performance requirements
/hana/log/SID/mnt0000*       Medium / High IO performance requirements

The number and names of the HANA data and log filesystems will match the number of scale-out nodes in use, as shown in the diagram below.

While there are other filesystems not depicted here necessary for proper SAP HANA operations in a scale-out configuration, these four filesystems are those required to offer highly performant IO characteristics and the ability to be shared across all the nodes in a scale-out configuration.

The example below illustrates eight Azure HANA Large Instance units in HANA scale-out, where each unit can see every other unit's provisioned HANA filesystems, most critically the data and log volumes. However, at any point in time each unit has read/write control only of its own data and log filesystems, enforced by NFS v4 locking.

SAP HANA scale-out filesystem on Microsoft Azure HANA Large Instances

Figure 3: SAP HANA scale-out filesystem on Microsoft Azure HANA Large Instances

In the event of a failure, the SAP HANA standby node takes over ownership of the failed node's storage and assumes that node's place in the running scale-out HANA system.

SAP HANA standby node taking ownership of a failed node's storage

Figure 4: SAP HANA standby node taking ownership of a failed node's storage

The SAP HANA standby node capability on Microsoft Azure deepens customer choice for performance paired with availability when choosing Microsoft Azure as the platform to run SAP HANA.

I would encourage you to stop by the Microsoft booth #729 if you will be at SAP SAPPHIRENOW 2019 to learn more about these solutions and to see a hands-on demo of our offerings. See you in Orlando!


Java on Visual Studio Code April Update


Welcome to the April update! Java 12 is now officially supported with Visual Studio Code. We'd also like to show you some new and helpful code actions now available, along with new features for the debugger, Maven, and Checkstyle.

Try these new features by installing the Java Extension Pack with Visual Studio Code. See below for more details!

Java 12 Support

Java is now updating at a faster pace, and we're following closely. Thanks to the upstream update from JDT, you can now build your project with Java 12 features in VS Code as well. To use experimental language features such as the new switch expressions, add the following settings to pom.xml:

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.8.0</version>
            <configuration>
                <source>12</source>
                <target>12</target>
                <compilerArgs>
                    <arg>--enable-preview</arg>
                </compilerArgs>
            </configuration>
        </plugin>
    </plugins>
</build>
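For reference, here is a tiny, illustrative example of the preview switch syntax this configuration unlocks; it compiles only when --enable-preview is passed as above:

// Demo.java – Java 12 preview switch expression (illustrative only)
public class Demo {
    public static void main(String[] args) {
        int day = 6;
        String kind = switch (day) {
            case 1, 2, 3, 4, 5 -> "weekday";
            case 6, 7          -> "weekend";
            default            -> "unknown";
        };
        System.out.println(kind);
    }
}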

Easier Getting Started

Although Java has been around for a long time, it still attracts new developers, and we'd like to make sure it's easy enough for anyone to start programming Java with VS Code. One of the improvements we made recently is to provide more detailed information to help you fix your environment settings.

What if you don't yet have a JDK installed? No problem. When the Java extension pack is loaded, we will automatically detect whether a JDK is present. If not, we will provide you links to download a reliable JDK of your choice.

Better yet, we're also working on a customized VS Code distribution which will help get you ready with everything needed for Java development with a single installer. To learn more, please sign up to receive the latest news and updates for Java on VS Code.

Performance Improvements

We've been benchmarking and profiling the performance of VS Code for Java on all platforms and major scenarios, including loading and editing. There have been several enhancements to improve performance in recent releases.

  • Improved editing performance when dealing with large numbers of source files opened in the editor
  • Optimized startup and load time with better server initialization and lazy downloading of Java sources

As we keep working to improve performance, importing a big Java project into Visual Studio Code can still take some time. In this case, it is helpful to show more progress details and let you know what's actually happening behind the scenes. Instead of just showing the percentage of progress, we now add detailed step information to the status, such as inspecting location, configuring project, updating Maven dependencies, refreshing workspace, and building workspace, to let you know the wait is meaningful.

More Code Actions

Code actions are key to your productivity, so we keep bringing more of them to Visual Studio Code.

Resolve ambiguous imports

To deal with ambiguous imports, you now have a dropdown list to pick the right one. The code line with the unresolved type is also presented to you to help you decide.

Generate hashCode() and equals()

Now hashCode() & equals() can be generated with default implementations. All the non-static member variables are listed, and you can customize the generated code using the check list.

There are two options for you to customize the generated code:

  • If you use Java 7+, you can set java.codeGeneration.hashCodeEquals.useJava7Objects to true to generate shorter code which calls Objects.hash and Objects.equals.
  • You can also set java.codeGeneration.hashCodeEquals.useInstanceof to check the object type instead of calling Object.getClass().
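Both options map to plain VS Code settings; as a sketch, in your user or workspace settings.json they would look like this:

{
    // Generate Objects.hash/Objects.equals-based implementations (Java 7+)
    "java.codeGeneration.hashCodeEquals.useJava7Objects": true,
    // Use instanceof checks instead of Object.getClass()
    "java.codeGeneration.hashCodeEquals.useInstanceof": true
}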

Generate toString()

Picking which fields to include in the toString() method and configuring its template are both supported with the Generate toString() code action.

Extract to local variable

To create a new, correctly typed variable from the return value of an expression, the new extract to local variable quick fix shows a quick fix bulb when the cursor stops at the brackets (). The keyboard shortcut is Ctrl + .

Override/Implement methods

With this new source action, all the candidates are presented to you in a checklist. Then you can decide what to override or implement.

Add static import

You can now convert static function calls to static imports.

Folding Range

Now you can expand or collapse sections of code to make your Java file easy to read. We’ve enabled a couple popular ways for you to specify which code elements should be considered as a region to fold.

Debugger Updates

The debugger is one of the most used extensions in the Java extension family, and we're excited to show you the improvements below.

Display Logical Structure of Collections

The debugger is now showing the logical structure of lists and maps, instead of the physical layout of the collections. If you prefer the physical layout view, you can go back by setting java.debug.settings.showLogicalStructure to false.

Highlight Exceptions in Editor

Exceptions are now highlighted with extra info in the editor window. Previously, you needed to hover over the exception to see details. Now the most important info is presented to you right where it occurs.

Go to Definition by Clicking Stack Trace in Debug Console

When there is an exception, you can now click on the stack trace to go to the definition of the function calls.

Other notable features include:

  • Auto-completion in debug console for types without source code
  • Auto shorten the command line when it gets too long.

Maven Updates

And a couple new features for Maven as well.

Debug Maven Plug-in Goal

Now you can easily configure your breakpoints and start debugging any Maven goal with just a couple of clicks.

Customize maven commands

Now you are able to specify your favorite commands in settings for future execution.

Show Dependency Tree

We also support showing dependencies in a tree view which allows you to inspect all dependencies in your project at a single place and check for potential issues.

One more thing: there's also a shortcut to add dependencies. When editing a POM file, there is a command to search for and add a dependency. Try Command Palette -> Maven: Add a dependency.

Sign up

If you'd like to follow the latest on Java for VS Code, please share your email with us using the form below. We will send out updates and tips every couple of weeks and invite you to test unreleased features and provide feedback early on.

Try it out

Please don't hesitate to give it a try! Your feedback and suggestions are very important to us and will help shape our product in the future. You may take this survey to share your thoughts!

Visual Studio Code is a fast and lightweight code editor with great Java support from many extensions



Business Applications Sessions at Build 2019


Developers from all over will be coming to Seattle to learn more about the new tools and technologies being created to help developers build the software and services that continue to fuel the ongoing transformation of industries and experiences. I have been involved with Build, and prior to that PDC, in various capacities for many years, and on May 6th, when Satya starts the day with the Vision Keynote, I will be just as excited as I have been in previous years. Build provides an amazing stage to highlight the incredibly broad range of building blocks that developers are leveraging, including what's new in the area I'm currently focused on: the Business Applications and Power Platform space. For those of you who will be attending Build and want to dive deeper into these areas, here are a few sessions worth checking out, including an end-to-end overview of this area that I'll be leading on Monday afternoon.

Future Business Apps in Power Platform, Dynamics and Office

Monday May 6, 2019 2:00pm – 3:00pm

https://mybuild.techcommunity.microsoft.com/sessions/77082

I will be leading a session with Orville McDonald and Justyna Lucznik to discuss how to create the latest in business applications. It will be an end-to-end session, starting with getting insights from your data using AI and Power BI, then using the Power Platform to connect data and create apps with no "cliff development," and finally getting deeper into the code to create a custom app.

AI in Power BI

Wednesday May 8, 2019 10:00am – 11:00am

https://mybuild.techcommunity.microsoft.com/sessions/77088

For the past few years I have seen many companies create AI solutions without first getting their data estate in order or getting the insights to determine how best to prioritize their scenarios. Power BI is a great way to do BI before AI, and now, with AI-infused experiences in Power BI, you can get deeper insights into your data.

Building an App on the Power Platform

Wednesday May 8, 2019 3:30pm – 4:30pm

https://mybuild.techcommunity.microsoft.com/sessions/77091

The Power Platform can help you accelerate the time it takes to get your app to market with a development platform that enables you to do a lot with no-code/low-code tooling. In this session, learn how to develop business apps, extend them with custom visuals, and then package and publish them on AppSource for distribution.

PowerApps Component Framework

Wednesday May 8, 2019 9:30am – 9:50am

https://mybuild.techcommunity.microsoft.com/sessions/77350

In this session you can learn more about the PowerApps component framework including developer tooling, how to build your first control and some examples of what others have built.

Embed Low-Code/No-Code Capabilities in your App with PowerApps and Flow

Tuesday May 7, 2019 8:30am – 9:30am

https://mybuild.techcommunity.microsoft.com/sessions/77090

Developing Business Applications via Microsoft Power Platform

Tuesday May 7, 2019 3:30pm – 4:30pm

https://mybuild.techcommunity.microsoft.com/sessions/77155

Building Azure Apps using the Common Data Service

Tuesday May 7, 2019 5:00pm – 6:00pm

https://mybuild.techcommunity.microsoft.com/sessions/77085

Building Data Connectors for the Microsoft Power Platform

Tuesday May 7, 2019 10:00am – 11:00am

https://mybuild.techcommunity.microsoft.com/sessions/77092

Dataflows and Data Interop with the Common Data Model

Wednesday May 8, 2019 8:30am – 9:30am

https://mybuild.techcommunity.microsoft.com/sessions/77087

Intelligent Process Automation with Microsoft Flow

Wednesday May 8, 2019 12:30pm – 1:30pm

https://mybuild.techcommunity.microsoft.com/sessions/77244

Sprinkle some DevOps on your PowerApps & Dynamics 365 Customer Engagement projects

Wednesday May 8, 2019 5:00pm – 6:00pm

https://mybuild.techcommunity.microsoft.com/sessions/77093

 

You can learn more about these Build sessions and find more in the session catalog. I hope that you enjoy your time at Build 2019, and we look forward to seeing what you create.

Cheers,

Guggs


Making AI real for every developer and every organization



AI is fueling the next wave of transformative innovations that will change the world. With Azure AI, our goal is to empower organizations to apply AI across the spectrum of their business to engage customers, empower employees, optimize operations and transform products. To make this a reality, we have three guiding investment principles:

  • Boost the productivity of developers and data scientists and empower them to build AI solutions faster.
  • Enable these AI solutions to be deployed at scale alongside existing systems and processes.
  • Ensure organizations can build with full confidence knowing that they own and control their data on a platform that adheres to some of the industry’s strictest privacy standards and has the most comprehensive compliance portfolio of any cloud provider.

These guiding principles enable us to fulfill our mission of empowering every developer and every organization to harness the potential of AI. With research centers that span the globe, from Redmond to Shanghai, we continue to achieve industry breakthroughs in areas such as vision, speech, language, advanced machine learning techniques, and specialized AI hardware. These innovations are now key components of several of our flagship products, like Office 365, Xbox, Bing and Dynamics 365. This is important because with Azure AI, customers can benefit from the latest innovations that have been thoroughly battle-tested in our own products.

We are honored and humbled by the tremendous adoption of Azure AI by customers. Organizations of all sizes in all industries are using Azure AI to transform their business by:

  • Using machine learning to build predictive models, optimizing business processes.
  • Utilizing advanced vision, speech, language, and decision-enabling capabilities to build AI powered apps and agents to deliver personalized and engaging experiences.
  • Applying knowledge mining to uncover latent insights from vast repositories of data.

Today, we are excited to announce a range of innovations across all of these areas. Let’s walk through them.

Machine learning

Azure Machine Learning service is designed to accelerate the end-to-end machine learning lifecycle. With Azure Machine Learning, developers and organizations can quickly and easily build, train, and deploy models anywhere from the intelligent cloud to the intelligent edge, as well as manage their models with integrated (CI/CD) tooling.

As we strive to enable developers, data scientists, and DevOps professionals across all skill levels to increase productivity, operationalize models at scale, and innovate faster, we are pleased to announce:

New capabilities to enhance productivity now in preview:

  • Automated machine learning user interface that enables business domain experts to train machine learning models with just a few clicks.
  • Zero-code, visual interface that enables users new to machine learning to build, train, and deploy models easily using drag and drop capabilities.
  • Azure Machine Learning notebooks that provide developers and data scientists a code-first machine learning experience.

New capabilities to enable operationalization of models at scale:

  • MLOps or DevOps for machine learning capabilities, including Azure DevOps integration that enables Azure DevOps to be used to manage the entire machine learning lifecycle including model reproducibility, validation, deployment, and retraining.
  • General availability of hardware accelerated models that run on FPGAs in Azure for extremely low latency and low-cost inferencing. Available in preview for Data Box Edge.
  • Model interpretability capabilities that enable customers to understand how a model works and why it makes certain predictions, removing the ‘black box’ aspect of ML models.

Our commitment to an open platform:

  • Contribution to the open source MLflow project, with native support for MLflow in Azure Machine Learning service (see the sketch after this list).
  • Support for ONNX Runtime for NVIDIA TensorRT and Intel nGraph for high speed inferencing on NVIDIA and Intel chipsets.
  • Preview of a new service, Azure Open Datasets, that helps customers improve machine learning model accuracy using rich, curated open data and reduce time normally spent on both data discovery and preparation.
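As a sketch of what the MLflow support looks like in practice, the snippet below routes standard MLflow tracking calls to an Azure Machine Learning workspace. It assumes the azureml-mlflow package is installed and a workspace config.json has been downloaded; the experiment name and logged values are placeholders:

import mlflow
from azureml.core import Workspace

# Point MLflow's tracking API at the Azure Machine Learning workspace
ws = Workspace.from_config()
mlflow.set_tracking_uri(ws.get_mlflow_tracking_uri())
mlflow.set_experiment("demo-experiment")

# Log parameters and metrics exactly as you would against a local MLflow server
with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)
    mlflow.log_metric("rmse", 0.87)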

It’s exciting to see customers such as British Petroleum, Walgreens Boots and Schneider Electric deploying machine learning solutions at scale using Azure Machine Learning.

“All the data scientists on our team enjoy using Azure Machine Learning, because it’s fully interoperable with all the other tools they use in their day-to-day work—no extra training is needed, and they get more done faster now,” - Matthieu Boujonnier, Analytics Application Architect and Data Scientist, Schneider Electric. Read the Schneider Electric case study.

Visit Azure Machine Learning to discover more.

AI apps and agents

The combination of Azure Cognitive Services and Azure Bot Service enables developers to easily infuse powerful AI capabilities into their apps and agents.

Azure Cognitive Services continues to be the most comprehensive portfolio in the market for developers who want to embed the ability to see, hear, respond, translate, reason and more into their apps. Today we’re making it even easier for developers to embed AI into their applications:

  • Introduction of a new Decision category.
    Services in this category provide users recommendations to enable informed and efficient decision-making. Services such as Content Moderator, the recently announced Anomaly Detector and a new service called Personalizer, available in preview, are part of this new category. Personalizer is built on reinforcement-learning and prioritizes relevant content and experiences for each user, improving app usability and engagement. Microsoft’s very own Xbox drove a 40 percent lift in user engagement on its home screen as a result of using Personalizer.
  • In Vision, we are announcing two new services available in preview. Ink Recognizer enables developers to combine the benefits of physical pen and paper with the best of digital by embedding digital ink recognition capabilities into their apps. Developers can build on top of it to make notes searchable and convert hand-written sketches into presentation-ready content in a matter of minutes. Additionally, the Computer Vision read capability, which extracts text from common file types including multi-page documents in PDF and TIFF formats, is now generally available.
  • In Speech, we are announcing preview of new advanced speech-to-text capability called conversation transcription that catalyzes meeting efficiency by transcribing conversations in real-time so participants can fully engage in the discussion, know who said what when, and quickly follow up on next steps. Neural text-to-speech capability and Speech Service Device SDK are also now generally available.
  • In Language, Language Understanding has a new analytics dashboard to evaluate the quality of language models. In addition, QnA Maker now supports multiturn dialogs. The Text Analytics named entity extraction capability is now generally available.
  • We have expanded the portfolio of Cognitive Services that can run locally through a Docker container and we’re pleased to preview container support for Anomaly Detector, Speech-to-Text, and Text-to-Speech.

Only Azure provides developers with the flexibility to embed these powerful AI services where needed. Visit Azure Cognitive Services to find out more.

Azure Bot Service, built on Microsoft Bot Framework, makes it easier to develop bots and intelligent agents. New enhancements include:

  • Adaptive dialogs enable developers to create more sophisticated, dynamic conversations.
  • Language generation package streamlines the creation of smart and dynamic bot responses.
  • Emulator now has improved fidelity for debugging channels.

LaLiga, the premier men's soccer league of Spain, creates solutions using Cognitive Services, Bot Service, and other Azure services. Its intelligent bot connects with 1.6 billion followers across various social networks. Delivering a world-class voice assistant is key to scoring brand love with fans:

“Our digital innovation platform built on Microsoft Azure helps us deliver the best possible fan experiences for the world’s best sports league.” Jose Carlos Franco, Head of Data and Analytics, LaLiga.

Knowledge mining

While organizations have seemingly unlimited access to information that can range from databases to PDFs to media files, there are still significant challenges in making that information usable and meaningful. With knowledge mining, you can leverage industry leading AI capabilities to easily unlock latent insights from all your content at scale.

We have two exciting announcements in this category:

  • The cognitive search capability of Azure Search is now generally available and up to 30 times faster than before. Azure Search is the only offering in the market with a single mechanism to apply AI enrichments to content. Using cognitive search and its built-in AI capabilities, customers can discover patterns and relationships in their content, understand sentiment, extract key phrases, and more, all without any data science expertise. In addition, a new knowledge store capability in preview enables developers to further leverage the insights and metadata they extract from the cognitive search pipeline. Developers can store the enriched metadata they create with cognitive search and apply it to a variety of scenarios such as Power BI visualizations, custom knowledge graphs, triggering actions within an application, or building machine learning models with newly labeled data.
  • The new Form Recognizer service applies advanced machine learning to accurately extract text, key-value pairs, and tables from documents. With just a few samples, it tailors its understanding to supplied documents, both on-premises and in the cloud. It can be used to build robotic process automation (RPA) solutions.

The Metropolitan Museum of Art is exploring how cognitive search understands nuances and relationships across their encyclopedic collection.

British Petroleum, Icertis, Howden, Chevron, UiPath, and others benefit from Azure Search and Form Recognizer to extract insights from their content and automate processes.

"We're excited to leverage Form Recognizer as a key document extraction capability on our Robotic Process Automation (RPA) platform and our open AI ecosystem. UiPath's and Microsoft's investments in AI are streamlining the process of unlocking key business data, making possible a new era of AI-driven business insights and knowledge management." – Mark Benyovszky, Director Artificial Intelligence, UiPath.

Continuing to innovate

We continue to invest to make Azure the best place for AI and we are most excited to see how customers are applying AI in their businesses. The opportunities are limitless, and we are looking forward to seeing what you create with Azure AI.

Additional resources

Ready to get started? Check out:

Intelligent edge innovation across data, IoT, and mixed reality



We are at an incredibly exciting technology inflection point. The virtually limitless computing power of the cloud, combined with increasingly connected and perceptive devices at the edge of the network, creates possibilities we could only have dreamed of just a few years ago – possibilities made up of millions of connected devices, infinite data, and the ability to create truly immersive multi-sense, multidevice experiences.

While much attention has been paid to cloud innovations, the advancements at the edge are becoming equally remarkable. And, of course, the experiences built using the cloud and edge together are what become truly transformative. The intelligent cloud and intelligent edge application pattern transforms the way we can interact with digital information and further blends the physical and digital worlds for greater societal benefit and customer innovation.

Today’s announcements put us a few steps closer toward these outcomes:

  • Azure SQL Database Edge – A small footprint database engine optimized for the edge, with AI built-in
  • IoT Plug and Play – Connect IoT devices to the cloud without writing any code
  • HoloLens 2 Development Edition – All the tools needed to get started building mixed reality experiences

Azure SQL Database Edge

In our work to address the unique requirements for data security and analytics on the edge, we are excited to announce the preview of Azure SQL Database Edge, which brings to the edge the same performant, secure, and easy to manage SQL engine that our customers love in Azure SQL Database and SQL Server. This new offering also adds capabilities that are optimized for edge scenarios, including:

  • Support for Arm and x64-based interactive devices and edge gateways.
  • Low latency analytics on the edge by combining data streaming and time-series, with in-database machine learning and support for graph data. 
  • Industry leading security capabilities that Azure SQL Database offers to protect data at rest and in motion on edge devices and edge gateways. Manage security policies and updates from a central management portal such as Azure IoT Central.
  • Ability to develop applications once and deploy anywhere by having a common programming surface area across Azure SQL Database, SQL Server on-premises, and Azure SQL Database Edge.
  • Support for both cloud connected and fully disconnected edge scenarios. Gain cloud scale by connecting edge scenarios to Azure. Enable fully disconnected edge scenarios with local compute and storage.
  • Supports business intelligence with Microsoft Power BI and other BI tools.

Join the Early Adopter Program to access the preview and get started building your next intelligent edge solution.

HoloLens 2 Development Edition

When the intelligent cloud and intelligent edge are imbued with mixed reality and artificial intelligence, we have a framework for achieving amazing things and empowering even more people. Enabling intelligent cloud and intelligent edge solutions requires a new class of distributed, connected applications that will deliver break-through business outcomes. Now all the pieces are coming together.

From construction sites to factory floors, from operating rooms to classrooms, mixed reality powered by AI is changing how we work, learn, communicate, and get things done, moving us beyond the 2D worlds of the PC and smartphone into the third wave of computing.

To help catalyze this third wave of computing, we are expanding our Mixed Reality Developer Program. Over the past year, we have engaged with more than 20,000 mixed reality developers, a number we expect to more than triple in the coming 12 months. To serve the needs of a larger audience, we are investing in more meetups, programs, events, and hacks (including our Mixed Reality Dev Days that is happening now) with an increased focus on serving the mixed reality dev community on a global scale.   

In addition to serving today’s global mixed reality development community, we are expanding the program to educate and build the mixed reality developers of the future. To support developers in this journey, we’re excited to announce the HoloLens 2 Development Edition.

Available through the Mixed Reality Developer Program, the Development Edition brings together all the tools developers need to get started:

  • HoloLens 2 mixed reality device
  • $500.00 in Azure credits – To jump start your mixed reality development using Azure mixed reality services
  • 3-month free trials of Unity Pro and the Unity PiXYZ Plugin for CAD data

Bringing together HoloLens 2, Azure Mixed Reality Services, and Unity Pro, the most widely used real-time 3D development platform, makes it easy for developers to create professional mixed reality experiences using their industrial design data. The PiXYZ Plugin gives you the tools to create real-time experiences using Computer-Aided Design (CAD) or Building Information Modeling (BIM) data.

“Pairing HoloLens 2 with Unity’s real-time 3D development platform enables businesses to accelerate innovation, create immersive experiences, and engage with industrial customers in more interactive ways. The addition of Unity Pro and PiXYZ Plugin to HoloLens 2 Development Edition gives businesses the immediate ability to create real-time 2D, 3D, VR, and AR interactive experiences while allowing for the importing and preparation of design data to create real-time experiences.”

– Tim McDonough, GM of Industrial - Unity

Ready to jump in and start developing today? Join the Mixed Reality Developer Program.

Visit hololens.com to learn more about the HoloLens 2 Development Edition and sign up to stay updated about latest news, mixed reality toolkits, code samples, and open source projects.  

IoT Plug and Play

One of the biggest challenges in building IoT solutions is connecting millions of IoT devices to the cloud, due to the heterogeneous nature of devices today – different form factors, processing capabilities, operating systems, memory, and capabilities.

IoT Plug and Play offers a new, open modeling language to connect IoT devices to the cloud seamlessly. With IoT Plug and Play, developers can connect IoT devices to the cloud without having to write a single line of embedded code. IoT Plug and Play also enables device manufacturers to build smarter IoT devices that just work with the cloud.

In the past, Microsoft introduced the Plug and Play technology that allowed PC users to quickly connect peripherals without having to perform complex hardware and software configurations. Similarly, with IoT Plug and Play, Microsoft is simplifying IoT to accelerate adoption for enterprises who will be able to prototype and then move to full scale deployments much faster.

Today, we are announcing that cloud developers can find IoT Plug and Play enabled devices in our Azure IoT Device Catalog. The first wave includes dozens of devices from partners such as Compal, Kyocera, and STMicroelectronics.

Here is what a few partners in the IoT Plug and Play ecosystem have to say:

“IoT Plug and Play underlines our commitment to providing a streamlined and optimized deployment path for customers using the Microsoft Azure platform, a major advantage in today’s competitive and quickly evolving market.”

– Richard Brown, VP of International Marketing, VIA Technologies, Inc.

“At Askey, we’re committed to adding leading and innovative technology to the smart mobility industry and ecosystem. Thanks to Microsoft’s IoT Plug and Play, and the Azure Certified for IoT program, IoT device application development management will never be the same.”

– Robert Lin, CEO of Askey

“Thundercomm is excited to be cooperating with Microsoft Azure on end-to-end IoT solutions. As a highlight of our cooperation, Thundercomm TurboX Asset Tag makes it possible for enterprises to decide how to improve the utilization of their valued assets. With IoT Plug and Play, we will partner together with Microsoft to accelerate and simplify the integration and deployment of IoT devices.”

– Larry Geng, Chairman, Thundercomm

List of partners

Discover even more possibilities with Azure IoT. Check the Building IoT Solutions with Azure: A Developer’s Guide to get started.

These are just a few of the many new innovations we are bringing to the edge. With our $5 billion, 4-year investment into the intelligent edge, you can be confident there is much more to come.

Digitizing trust: Azure Blockchain Service simplifies blockchain development


Digital map of multi-party computation

In a rapidly globalizing digital world, business processes touch multiple organizations and great sums are spent managing workflows that cross trust boundaries. As digital transformation expands beyond the walls of one company and into processes shared with suppliers, partners, and customers, the importance of trust grows with it. Microsoft’s goal is to help companies thrive in this new era of secure multi-party computation by delivering open, scalable platforms, and services that any company from game publishers and grain processors, to payments ISVs and global shippers can use to digitally transform the processes they share with others.

Azure Blockchain Service: The foundation for blockchain applications in the cloud

Azure Blockchain Service is a fully-managed blockchain service that simplifies the formation, management, and governance of consortium blockchain networks so businesses can focus on workflow logic and application development. Today, we’re excited to announce that the public preview is now available.

With a few simple clicks, users can create and deploy a permissioned blockchain network and manage consortium policies using an intuitive interface in the Azure portal. Built-in governance enables developers to add new members, set permissions, monitor network health and activity, and execute governed, private interactions through integrations with Azure Active Directory.

This week, we also announced an exciting partnership with J.P. Morgan to make Quorum the first ledger available in Azure Blockchain Service. Because it’s built on the popular Ethereum protocol, which has the world’s largest blockchain developer community, Quorum is a natural choice. It integrates with a rich set of open-source tools while also supporting confidential transactions, something our enterprise customers require. Quorum customers like Starbucks, Louis Vuitton, and our own Xbox Finance team can now use Azure Blockchain Service to quickly expand their networks with lower costs, shifting their focus from infrastructure management to application development and business logic.

“We are incredibly proud of the success Quorum has had over the last four years as organizations around the world use Quorum to solve complex business and societal problems. We are delighted to partner alongside Microsoft as we continue to strengthen Quorum and expand capabilities and services on the platform.”

— Umar Farooq, Global Head of Blockchain at J.P. Morgan

We’re excited to offer customers an enterprise-grade Ethereum stack with Quorum, and look forward to adding new capabilities to Azure Blockchain Service in the coming months, including digital token management, improved application integration, and support for R3’s Corda Enterprise.

An application-driven approach

The ledger is just the foundation for new applications. After configuring the underlying blockchain network with Azure Blockchain Service, you need to codify your business logic using smart contracts. Until now, this has been cumbersome, requiring multiple command-line tools and offering limited developer IDE integration. Today we are releasing an extension for VS Code to address these issues. This extension allows you to create and compile Ethereum smart contracts, deploy them to either the public chain or a consortium network in Azure Blockchain Service, and manage their code using Azure DevOps.

Once your network is created and smart contract state machines are deployed, you must build an application in order for consortium participants to share business logic and data represented by the smart contracts. A key challenge has been integrating these applications with smart contracts so they either respond to smart contract updates or execute smart contract transactions. This connects business processes managed in other systems such as databases, CRM, and ERP systems with the ledger. Our new Azure Blockchain Dev Kit makes this easier than ever with connectors and templates for Logic Apps and Flow as well as integrations with serverless tools like Azure Functions.

You can learn more about how to build your first network, code your smart contracts, and interact with the ledger in the latest episodes of the web series Block Talk.

Embracing open communities

Over the past year, we have been preparing our Confidential Consortium Framework (CCF) for public release. CCF uses trusted execution environments (TEEs) such as SGX and VSM to enable ledgers that integrate with it to execute confidential transactions with the throughput and latency of a centralized database. Confidentiality and high performance are key requirements of our enterprise customers. We’re excited to announce that we have finished the first version of CCF, integrated with Quorum, and have made the source code available on Github.

Microsoft believes that the best way to bring blockchain to our customers is by partnering with the diverse and talented open source communities that are driving blockchain innovation today. We began this journey in 2015, partnering with the growing communities around Ethereum, R3 Corda, and Hyperledger to make those technologies available in Azure. Instead of building our own ledger, or creating a ledger alternative, we have worked to make open source technology developers love and work better with Azure. All of the tooling released this week allows developers to work against both consortium networks in Azure Blockchain Service and with public Ethereum.

“Microsoft has embraced the open community of blockchain developers and has brought the best of their cloud development tooling to the developers building the next wave of decentralized applications. With Azure Blockchain Service and Ethereum integrations for tools like VS Code, Microsoft is demonstrating its commitment to open blockchain development.”

— Vitalik Buterin, co-founder of Ethereum

Next steps

Learn more about Azure Blockchain Service and get started today:

Microsoft Q# Coding Contest – Winter 2019


Are you new to quantum computing and want to improve your skills? Have you done quantum programming before and are looking for a new challenge? Microsoft's Quantum team is excited to invite you to the second Microsoft Q# Coding Contest, organized in collaboration with Codeforces.com.

The contest will be held March 1 through March 4, 2019. It will offer the participants a selection of quantum programming problems of varying difficulty. In each problem, you’ll write Q# code to implement the described transformation on the qubits or to perform a more challenging task. The top 50 participants will win a Microsoft Quantum T-shirt.

This contest is the second one in the series started by the contest held in July 2018. The first contest offered problems on introductory topics in quantum computing: superposition, measurement, quantum oracles and simple algorithms. The second contest will take some of these topics to the next level and introduce some new ones.

For those eager to get a head start in the competition, the warmup round will be held February 22-25, 2019. It will feature a set of simpler problems and focus on getting participants familiar with the contest environment, the submission system, and the problem format. The warmup round is a great introduction, both for those new to Q# and for those looking to refresh their skills.

Another great way to prepare for the contest is to work your way through the Quantum Katas. They offer problems on a variety of topics in quantum programming, many of them similar to those used in the first contest. Most importantly, the katas allow you to test and debug your solutions locally, giving you immediate feedback on your code.

Q# can be used with Visual Studio, Visual Studio Code, or the command line on Windows, macOS, or Linux, providing an easy way to start with quantum programming. Any of these platforms can be used in the contest.

We hope to see you at the second global Microsoft Q# Coding Contest!

Mariia Mykhailova, Senior Software Engineer, Quantum
@tcnickolas
Mariia Mykhailova is a software engineer at the Quantum Architectures and Computation group at Microsoft. She focuses on developer outreach and education work for the Microsoft Quantum Development Kit. In her spare time she writes problems for programming competitions and creates puzzles.

 


Getting hands on with Visual Studio for Mac, containers, and serverless code in the cloud



This week we released the second alpha of Visual Studio for Mac version 7.2, as well as some hands-on labs to try out some of the new features.

Visual Studio for Mac 7.2 Alpha

The alpha version of our next Visual Studio for Mac release includes new features such as:

  • Docker Containers – Join the microservices revolution by testing and deploying your ASP.NET Core web apps to Docker containers. Visual Studio for Mac’s Docker support lets you easily deploy to a container as well as debug projects across multiple containers. Check out the hands-on-lab below to get started!
  • Xamarin Live Player – Get started building Xamarin mobile applications in minutes! Visit xamarin.com/live to learn how easy it is to try out mobile app development with your existing iOS or Android device and Visual Studio for Mac.
  • Azure Functions – Build and deploy serverless code in the cloud. Functions can be accessed by mobile or server apps, scheduled or triggered, and you only pay for the time they run. Follow the hands-on-lab below to write your first Azure Function.
  • IoT projects – Build, test, and deploy apps for Internet of Things devices. You can write IoT apps using C#, and deploy them to a Raspberry Pi, following our simple instructions.

To try out these features, download Visual Studio for Mac and switch to the alpha channel in the IDE.

More Hands-On Labs

Our latest hands-on labs for Visual Studio for Mac will help you get started with new features available in the 7.2 alpha. Visit the VS4Mac labs GitHub repo for past weeks’ projects using the Unity 3D game engine, Internet of Things devices, ASP.NET Core web sites, and Xamarin for mobile app development.

Today we’ve published two additional labs: one covering Docker container support and one covering Azure Functions projects.

Lab 5: Deploying ASP.NET Core to a Docker Container

Lab 3 demonstrated how to build, test, and debug an ASP.NET Core website on your Mac. This lab will show you how to run and debug an ASP.NET web site and web API in Docker containers, by completing these 4 tasks:

  1. Create a Docker-enabled ASP.NET Core web site
  2. Create a Docker-enabled ASP.NET Core web API
  3. Integrate two container apps
  4. Debug multi-container solutions

Follow the complete instructions to set up the two ASP.NET Core projects, make them work together, and debug them simultaneously.

Lab 6: Serverless computing with Azure Functions

“Serverless computing” is a new type of cloud feature where you can host a “function” without having to worry about setting up a server, or even an application, to run it in. Simply build and deploy your Azure Function, and it will be automatically hosted and scaled as required. You only pay for the time the function is running; it can respond to application requests, you can set up triggers, and it can access many different Azure services.

To build your first Azure Function and get started with serverless computing, follow these 5 steps:

  1. Create an Azure Functions project
  2. Create an Azure Storage account
  3. Create and Debug an Azure Function
  4. Work with function.json
  5. Work with Azure Tables

The hands-on-lab instructions will walk you through creating the Azure Functions project in Visual Studio for Mac, deploying it to Azure, and persisting data with Azure Tables. This feature is so new it is only available in the Alpha channel release of Visual Studio for Mac. You’ll need to install an extension for Azure Functions, which the instructions will help you with.
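The lab itself walks through a C# project in Visual Studio for Mac, but the serverless model is language-agnostic. As a rough illustration only (not part of the lab), here is what a minimal HTTP-triggered function body looks like in Python using the Azure Functions programming model; it would sit alongside a function.json file declaring the HTTP trigger and output bindings:

# __init__.py – minimal HTTP-triggered Azure Function (illustrative sketch, not the lab's C# project)
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("HTTP trigger received a request.")

    # Read a 'name' value from the query string or, failing that, from a JSON body.
    name = req.params.get("name")
    if not name:
        try:
            name = req.get_json().get("name")
        except ValueError:
            name = None

    if name:
        return func.HttpResponse(f"Hello, {name}!", status_code=200)
    return func.HttpResponse("Pass a 'name' on the query string or in the request body.", status_code=400)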

Get Started

Download Visual Studio for Mac today, and visit the VS4Mac labs repo on GitHub. Both of this week’s labs just scratch the surface of the capabilities being demonstrated: Docker support enables new testing and deployment options, and Azure Functions opens up a new, easier way to interact with powerful Azure services.

With the Visual Studio Community edition it is easy and free to get started. Check out the docs for more in-depth information on Visual Studio for Mac, and leave a comment below to suggest additional hands-on-labs you’d like to see.

Craig Dunn, Principal Program Manager
@conceptdev
Craig works on the Mobile Developer Tools documentation team, where he enjoys writing cross-platform code for iOS, Android, Mac, and Windows platforms with Visual Studio and Xamarin.


Committing with Confidence: Commit Time Code Quality Information Updated



In the earlier post “Committing with Confidence: Getting Code Quality Information at Commit Time”, we introduced the new Build and Code Analysis Results panel, which gives you a heads-up reminder at commit time of issues detected by any code analysis tool that puts results in the error list. This means you can take care of those issues before they propagate into your team’s CI/CD process, and commit with confidence.

The panel shows results both for live edit-time analysis (e.g. C#/VB Analyzers) and, via the Refresh Analysis button, for batch-style static analysis (e.g. C++ Static Analysis tools). It is supported on Visual Studio 2017 Enterprise. At present it supports code being committed to a Git Repo.

Focus on the issues that relate to your changed files

In response to your feedback, we have just released a new version of the Continuous Delivery Tools for Visual Studio extension which updates the panel to show issue counts only for files that are changed in the set of files you are committing, so that you can focus only on the issues related to your changes.

Commit Time Code Quality Information Updated - Team Explorer

The View Issues link will take you straight to a view of the Error List filtered to just the changed files too:

Commit Time Code Quality Information Updated - Focus on issues related to changed files

Please continue to share your feedback!

Please download and try the Continuous Delivery Tools for Visual Studio extension, try out the updated Build and Code Analysis Results panel and let us know what you think of the change, as well as the overall experience.

Tell us more about what you think by filling out a short survey.

We’re always looking for feedback on where to take this Microsoft DevLabs extension next; features from the extension have already been refined using that feedback and incorporated into the core Visual Studio product in updates. There’s a Slack channel and a team email alias vsdevops@microsoft.com where you can reach out to the team and others in the community sharing ideas on this topic.

Mark Wilson-Thomas, Senior Program Manager, Visual Studio IDE Team
@MarkPavWT
Mark is a Program Manager on the Visual Studio IDE team, where he’s been building developer tools for nearly 10 years. He currently looks after the Visual Studio Editor. Prior to that, he worked on tools for Office, SQL, WPF and Silverlight.

 


Remote Python Development in Visual Studio Code


Today at PyCon 2019, Microsoft’s Python and Visual Studio Code team announced remote development in Visual Studio Code, enabling developers to work in setups where their code and tools run remotely inside Docker containers, on remote SSH hosts, or in the Windows Subsystem for Linux (WSL), while still getting a rich and seamless user experience locally.

This support is currently available in Visual Studio Code Insiders builds, and is enabled by three new remote extensions in the Visual Studio Code marketplace:

  • Remote-Containers: develop in workspaces running inside of local Docker containers
  • Remote-SSH: develop in workspaces running on a remote machine over an SSH connection
  • Remote-WSL: develop in workspaces running inside of Windows Subsystem for Linux

To get started download the remote extension pack, check out the Visual Studio Code Remote documentation, and dive right in! Check out the video below for a quick tour and keep on reading to learn more!

The ability to work with WSL and remote Python interpreters has long been among the top requested features on our Python Extension GitHub page. We have heard from our Python users many different reasons why they need to work in remote workspaces: in the case of SSH, their code needs access to large amounts of data, compute, GPUs, or other resources; with Docker, they need to be able to create and switch between development environments with complex dependencies; and with WSL, they may need tools and packages that are only available in a Linux environment.

With remote development, we’ve enabled all of these scenarios with remote Python interpreters and more: Visual Studio Code’s UI runs on your local machine and connects to a remote server that hosts your extensions remotely. This enables features like auto-completions, debugging, the terminal, source control, and the extensions you install; almost everything in Visual Studio Code runs seamlessly on the remote machine as if it were your local development workspace.

Remote Docker Workspaces and Dev Containers

Docker containers are a popular way to create reproducible development environments without having to install complex dependencies on your local machine. This also allows new team members to reproduce your environment by installing Docker and opening your workspace in Visual Studio Code.

The “Remote – Containers” extension allows Visual Studio Code to work seamlessly in this development environment using the concept of dev containers. A dev container is defined by files in a .devcontainer folder and tells Visual Studio Code how to create a Docker environment for that workspace. You can use a Dockerfile to create a single container or a docker-compose.yml for running multiple containers.

To get started developing in a Docker container, run the Remote-Containers: Open Folder in Containers… command and then browse to a folder on your local machine. If a .devcontainer folder is found in the workspace root, Visual Studio Code will create the dev container using the existing dev container definition.

If no container definition exists, you will be prompted to create a new dev container for that workspace. We have built-in definitions for Python developers that let you get started with Python 2/3, Python3+Postgres, Miniconda, and Anaconda:

A list of pre-made python containers: Python 2, Python 3, Anaconda, Miniconda, Python3 + PostgreSQL

Once you open a dev container, Visual Studio Code creates the Docker image(s), installs a lightweight remote server, and voilà, you are now coding inside the dev container! The remote server allows Visual Studio Code to run extensions remotely, such that almost all functionality works just like it does in your local environment.

Your files are volume-mounted into the container so you can open files, and start editing code and get IntelliSense and auto-completions:

You can start debugging, set breakpoints and step through code:

You can run cells and view graphical output in the Python Interactive window:

… and when you open the terminal you are using the terminal inside of the container!

Instead of creating and using dev container definitions, you can also attach Visual Studio Code to any running Docker container by installing the Docker extension, right-clicking a container, and selecting Attach Visual Studio Code:

Attaching visual studio code to a container, using the Docker extension

When attaching to an existing container you may be missing some dependencies used by Visual Studio Code, so if you run into issues check our sample container definitions in our vscode-dev-containers repo for dependencies to add to your Dockerfile.

If you want to get back to the normal local view, you can run the Remote-Containers: Reopen Folder Locally command. All your changes are saved to your workspace via a volume mounted drive and so will be available in the local view.

Remote SSH Workspaces

Another common setup Python developers have is that their development environment is running on a remote machine, often because the remote machine has access to resources or data sets that are not available on the local machine. Like the docker scenario, you can use the “Remote-SSH” extension to open a remote workspace over an SSH connection.

To get started, first ensure that you can ssh to the remote machine from the command line, and then run Remote-SSH: New Window and enter the SSH host you wish to target:

If you configure SSH hosts, they will be readily available in the Remote SSH activity bar for easy connection:

Visual Studio Code will re-open and then install the remote server on the SSH host. You can then open a folder on the remote machine:

… and you can then get started editing and debugging code right away! In this case, none of your source code needs to be local for this to work; all the editing and debugging capabilities are provided by the remote server.

Remote WSL Workspaces

The Windows Subsystem for Linux allows you to run a native Linux bash shell on Windows. As in the other scenarios, the “Remote-WSL” extension allows the Visual Studio Code UI to run on your Windows desktop, while the Visual Studio Code remote server runs extensions inside of the Linux subsystem.

To get started with Remote WSL, run the Remote-WSL: New Window command:

It will then install the Visual Studio Code remote server and give you a new Visual Studio Code window running in a WSL context.

You can click on Open folder… and it will allow you to open a folder in the Linux file system:

Alternatively, you can open a folder in WSL directly from the wsl prompt by typing “code-insiders .”.

You can then create files, edit code, open a WSL terminal, and debug just like the other remote development environments:

Get Started Now

We are excited about the capabilities this unlocks in Visual Studio Code for Python developers. To get started, head over to the Visual Studio Code Remote docs and try out some of our sample apps:

Be sure to let us know of any issues or feedback on the remote extensions by filing issues on our vscode-remote-release GitHub page.



Availability of Microsoft R Open 3.5.2 and 3.5.3


It's taken a little bit longer than usual, but Microsoft R Open 3.5.2 (MRO) is now available for download for Windows and Linux. This update is based on R 3.5.2, and accordingly fixes a few minor bugs compared to MRO 3.5.1. The main change you will note is that new CRAN packages released since R 3.5.1 can now be used with this version of MRO.

Microsoft R Open 3.5.3, based on R 3.5.3, will be available next week, on May 10. Microsoft R Open 3.6.0, based on the recently-released R 3.6.0, is currently under development, and we'll make an announcement here when it's available too.

One thing to note: as of version 3.5.2, Microsoft R Open is no longer distributed for macOS systems. If you want to continue to take advantage of the Accelerate framework on macOS, it's not too difficult to tweak the CRAN binary for macOS to enable multi-threaded computing, and the additional open source bundled packages (like checkpoint and iterators) are available to install from CRAN or GitHub.

As always, we hope you find Microsoft R Open useful, and if you have any comments or questions please visit the Microsoft R Open forum. You can follow the development of Microsoft R Open at the MRO GitHub repository. To download Microsoft R Open, simply follow the link below.

MRAN: Download Microsoft R Open

LaLiga entertains millions with Azure-based conversational AI


For LaLiga, keeping fans entertained and engaged is a top priority. And when it comes to fans, the Spanish football league has them in droves, with approximately 1.6 billion social media followers around the world. So any time it introduces a new feature, forum, or app for fans, instant global popularity is almost guaranteed. And while this is great news for LaLiga, it also poses technical challenges—nobody wants systems crashing or going unresponsive when millions of people are trying out a fun new app.

When LaLiga chose to develop a personal digital assistant running on Microsoft Azure, its developers took careful steps to ensure optimal performance in the face of huge user volume in multiple languages across a variety of voice platforms. Specifically, the league used Azure to build a conversational AI solution capable of accommodating the quirks of languages and nicknames to deliver a great experience across multiple channels and handle a global volume of millions of users.

Along the way, some valuable lessons emerged for tackling a deployment of this scope and scale.

Accommodating the quirks of languages and nicknames

The LaLiga virtual assistant has launched for Google Assistant and Skype, and it will eventually support 11 platforms. The assistant was created with Azure Bot Service and the Microsoft Bot Framework, and it incorporates Azure Cognitive Services and a variety of other Azure tools. The main engine for the assistant takes advantage of the scalability and flexibility of Azure App Service, a platform as a service (PaaS) offering, to streamline development. LaLiga used it multiple times to accelerate the development of the bot logic, image service, Google Assistant connector, Alexa connector, data loaders, cache management, and two Azure functions for live data and proactive messages.

A chart of the LaLiga virtual assistant architecture

Figure 1. An overview of the LaLiga virtual assistant architecture

Fans can ask the assistant questions using natural language, and the system parses this input to determine user intent by using Azure Text Analytics and Azure Cognitive Services Language Understanding in either Spanish or English. That may seem straightforward, but developers learned that subtleties of language can complicate the process. For example, the primary word for “goalkeeper” is different in the Spanish dialects used in Spain, Argentina, and Colombia. So the mapping of questions to intents needed to accommodate a many-to-one relationship for these variations.

A similar issue arose with players whose names have complicated spellings that don’t clearly correspond to the pronunciation - for example, "Griesman" instead of "Griezmann" - resulting in a variety of transcriptions. The solution here was to use aliases to guide the system to the correct player. Nicknames were another sticking point. Developers used Azure Monitor Application Insights to investigate user queries that weren’t mapping to any existing player and found that many people were asking about a player using his nickname rather than his official name. Once again, aliases came to the rescue, as sketched below.
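As a rough illustration of that many-to-one mapping (with made-up data, not LaLiga's actual alias lists), a normalization layer in Python can collapse dialect variants and nicknames onto canonical intents and player names before any lookup happens:

# Hypothetical sketch of alias normalization; the vocabulary below is illustrative only.
from typing import Optional

GOALKEEPER_SYNONYMS = {"portero", "arquero", "guardameta", "goalkeeper"}

PLAYER_ALIASES = {
    "griesman": "Antoine Griezmann",   # common misspelling / transcription
    "griezmann": "Antoine Griezmann",
    "la pulga": "Lionel Messi",        # nickname
    "messi": "Lionel Messi",
}


def canonical_intent(term: str) -> str:
    """Collapse dialect variants of a concept onto a single intent keyword."""
    return "goalkeeper" if term.lower() in GOALKEEPER_SYNONYMS else term.lower()


def resolve_player(mention: str) -> Optional[str]:
    """Map a spelling variant or nickname to the official player name, if known."""
    return PLAYER_ALIASES.get(mention.lower().strip())


print(canonical_intent("arquero"))   # -> goalkeeper
print(resolve_player("Griesman"))    # -> Antoine Griezmann
print(resolve_player("La Pulga"))    # -> Lionel Messi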

Guaranteeing a great experience across multiple channels

One goal of the development team was to support a consistent, high-quality user experience across different mobile platforms and devices, each of which has its own display parameters and may also have different connection protocols. In response to every user query, the LaLiga virtual assistant returns three elements: an image, some text, and a voice reply. The image can be a picture of a player or a “hero card” showing match results or player statistics. For channels with a visual display, the image and text are customized with XAML to make them easily legible for the specific display resolution.
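A rough sketch of that channel-neutral reply shape, with hypothetical field names and a toy adapter that trims the reply for voice-only surfaces, might look like this in Python:

# Hypothetical sketch: every answer carries an image, display text, and a spoken reply,
# and a small adapter keeps only what a given channel can render.
from dataclasses import dataclass


@dataclass
class AssistantReply:
    image_url: str   # player photo or "hero card" rendering
    text: str        # text for channels with a display
    speech: str      # reply read aloud by voice channels


def adapt_for_channel(reply: AssistantReply, channel: str) -> dict:
    """Return only the fields a given channel can render."""
    if channel in {"alexa-voice", "google-assistant-voice"}:   # voice-only surfaces (illustrative names)
        return {"speech": reply.speech}
    return {"image": reply.image_url, "text": reply.text, "speech": reply.speech}


reply = AssistantReply(
    image_url="https://example.com/players/griezmann.png",   # placeholder
    text="Antoine Griezmann has scored 15 league goals this season.",
    speech="Griezmann has scored fifteen league goals so far this season.",
)
print(adapt_for_channel(reply, "skype"))
print(adapt_for_channel(reply, "alexa-voice"))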

Not all channels are created equal when it comes to user popularity, either. LaLiga expects that some channels will be used much more frequently than others, and this requires adjustments to properly manage scalability resources. Developers created an App Service app for each channel and optimized it for anticipated usage.

Developers also needed to customize the connectors that the assistant uses for different channels depending on the channels’ capabilities and requirements. For example, the Alexa interface is based on Microsoft .NET Framework, which made it straightforward to develop a connection with Microsoft tools, but Google Assistant uses Node.js, requiring more complex development. Developers found it tricky to map messages from the LaLiga virtual assistant to types that Google Assistant understands. Adding a custom connector hosted with App Service resolved the issue. App Service also helps manage the scalability requirements for the channel. Microsoft is using the lessons learned from the LaLiga virtual assistant to help all developers streamline the creation of connectors with Azure-based bots.

A chart showing integration with Google Assistant and Alexa

Figure 2. An overview of integration with Google Assistant and Alexa

Planning for millions of enthusiastic users

LaLiga anticipates that the assistant will be hugely popular and that most users will ask multiple questions, generating a vast number of hits on the system each day and leading to high consumption of computing resources. Developers adopted multiple strategies to mitigate this high demand.

Incoming queries get divided into two categories—live and non-live. A live query could be one about a match in progress, where the data could be constantly changing, whereas a non-live query might relate to a completed game or player’s basic statistics. Whenever a non-live query arrives, the result is cached, so the answer is readily available if someone else asks the same question. The LaLiga virtual assistant uses a highly optimized Azure SQL database as its main data storage, rather than a non-structured data lake, to expedite results.
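A toy sketch of that live / non-live split (the classification rule and answer function are placeholders, not LaLiga's implementation) shows how caching by normalized question text avoids repeated database round trips:

# Hypothetical sketch: cache answers to non-live questions keyed by normalized text.
import time

CACHE_TTL_SECONDS = 24 * 3600   # completed matches and basic stats change rarely
_cache = {}                     # question -> (timestamp, answer)


def is_live_query(question: str) -> bool:
    """Crude placeholder rule: treat questions about matches in progress as live."""
    lowered = question.lower()
    return "live" in lowered or "right now" in lowered


def answer(question: str, compute_answer) -> str:
    key = question.strip().lower()
    if not is_live_query(question):
        cached = _cache.get(key)
        if cached and time.time() - cached[0] < CACHE_TTL_SECONDS:
            return cached[1]                    # cache hit: skip the database entirely
    result = compute_answer(question)           # query SQL or the live data feed
    if not is_live_query(question):
        _cache[key] = (time.time(), result)
    return result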

Because scalability was a big concern, the team decided early on to dedicate a developer to scalability testing. The developer created an automated system to simulate queries to the assistant, eventually testing millions of hits a day and setting the stage for a smooth launch. Bombarding the system with so many queries revealed another hitch: all those hits were genuine queries, but some web services might see that huge volume and think the system is being hit by a distributed denial of service (DDoS) attack. So it’s essential to ensure that all components are configured to account for the popularity of the assistant.
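In the same spirit as that automated test system (but not its actual code), a simple concurrent load simulation against a bot endpoint can be sketched in Python with asyncio and aiohttp; the endpoint URL and payload shape below are placeholders:

# Hypothetical load-simulation sketch; endpoint and payload shape are placeholders.
import asyncio

import aiohttp

BOT_ENDPOINT = "https://example-bot.azurewebsites.net/api/messages"   # placeholder
QUESTIONS = ["Who is the top scorer?", "When does Barcelona play next?"] * 500


async def ask(session: aiohttp.ClientSession, question: str) -> int:
    async with session.post(BOT_ENDPOINT, json={"text": question}) as response:
        return response.status


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        statuses = await asyncio.gather(*(ask(session, q) for q in QUESTIONS))
    ok = sum(1 for status in statuses if status == 200)
    print(f"{ok}/{len(statuses)} requests succeeded")


asyncio.run(main())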

Learn from LaLiga and build your own great bot

While some of these use cases may seem straightforward, the LaLiga virtual assistant development experience showed that sometimes a small tweak to development processes or application configuration can yield substantial rewards in terms of system performance and development time. We hope that the lessons learned during the LaLiga project will help you build your own massively popular digital assistant!

Read the case study for more information about LaLiga’s digital transformation and its initiatives to boost fan engagement.

Get started building your own branded virtual assistant.

Start more simply and build your first Q&A bot with QnA Maker.

Helping small businesses protect their greatest asset—their data

It’s National Small Business Week in the U.S.—what’s your workplace advantage?

Azure.Source – Volume 81


We’re really looking forward to Microsoft Build 2019, our premier event for developers happening next week, May 6-8 at the Washington State Convention Center in Seattle. It’s a chance for developers to gain access to the latest updates and developments across Microsoft’s products and solutions, understand our strategy and product roadmaps, and learn about new technology and open source software in innovative ways. Even the weather in Seattle looks like it's going to cooperate.

The Azure team is of course a big part of Build. We’ll be giving loads of presentations, and also posting a wide range of blog posts providing more details about what we share throughout the event. This Channel 9 video gives you a look ahead with the Azure IoT team:

As we're gearing up for Microsoft Build 2019, the IoT Show goes into Azure IoT's building on the Microsoft Campus to meet some of the speakers who are preparing awesome IoT content for you.

In the meantime, there’s plenty of other stuff going on around Azure right now. Here are some of the highlights:

News and updates:

Intelligent edge innovation across data, IoT, and mixed reality

We're at an incredibly exciting technology inflection point. The virtually limitless computing power of the cloud, combined with increasingly connected and perceptive devices at the edge of the network, creates possibilities we could only have dreamed of just a few years ago – possibilities made up of millions of connected devices, infinite data, and the ability to create truly immersive multi-sense, multi-device experiences. This post looks at some of the newest advances.

Digitizing trust: Azure Blockchain Service simplifies blockchain development

In a rapidly globalizing digital world, business processes touch multiple organizations, and great sums are spent managing workflows that cross trust boundaries. As digital transformation expands beyond the walls of one company and into processes shared with suppliers, partners, and customers, the importance of trust grows with it. Microsoft’s goal is to help companies thrive in this new era of secure multi-party computation by delivering open, scalable platforms and services that any company, from game publishers and grain processors to payments ISVs and global shippers, can use to digitally transform the processes they share with others.

Making AI real for every developer and every organization

AI is fueling the next wave of transformative innovations that will change the world. With Azure AI, our goal is to empower organizations to apply AI across the spectrum of their business to engage customers, empower employees, optimize operations, and transform products. Read this blog to learn about our three guiding investment principles.

Technical content

Azure Stack IaaS – part seven

This blog post covers the automation options in your Cloud IaaS toolkit. We’ve come a long way – in the virtualization days, before cloud and self-service, it took a while to get all the approvals, credentials, virtual LANs (VLANs), logical unit numbers (LUNs), etc. It took so long that the actual creation part was easy. When cloud came along with self-service, not only was it easier to create a virtual machine (VM) without relying on others, but it changed our thinking about whether VMs were precious or disposable. We’ll discuss!

Migrating big data workloads to Azure HDInsight

Migrating big data workloads to the cloud remains a key priority for our customers, and Azure HDInsight is committed to making that journey simple and cost-effective. HDInsight partners with Unravel, whose mission is to reduce the complexity of delivering reliable application performance when migrating data from on-premises or a different cloud platform onto HDInsight. Unravel’s Application Performance Management (APM) platform brings a host of services that provide unified visibility and operational intelligence to plan and optimize the migration process onto HDInsight.

Deploy a FHIR sandbox in Azure

In connection with HIMSS 2019, we announced the Azure API for FHIR, which provides our customers with an enterprise grade, managed FHIR® API in Azure. Since then, we've been busy improving the service with new configuration options and features. Some of the features we have been working on include authentication configuration and the SMART on FHIR Azure Active Directory Proxy, which enables the so-called SMART on FHIR EHR launch with the Azure API for FHIR. We've developed a sandbox environment that illustrates how the service and the configuration options are used. In this post, we focus on how to deploy the sandbox in Azure. Later blog posts will dive into some of the technical details of the various configuration options.

5 internal capabilities to help you increase IoT success

This article is the third in a four-part series designed to help companies maximize their ROI on IoT. In the first post, we discussed how IoT can transform businesses. In the second post, we shared insights on how to create a successful strategy that yields desired ROI. In this third post, we discuss how companies can move forward by identifying and filling capability gaps. Let’s dive into some ideas about how to solve some of the challenges that could slow your IoT progress.

Monitoring enhancements for VMware and physical workloads protected with Azure Site Recovery

Azure Site Recovery has enhanced the health monitoring of your workloads by introducing various health signals on the replication component, Process Server. The Process Server (PS) in a hybrid disaster recovery (DR) scenario is a vital component of data replication. It handles replication caching, data compression, and data transfer. Once the workloads are protected, issues can be triggered due to multiple factors including high data change rate (churn) at source, network connectivity, available bandwidth, under provisioning the Process Server, or protecting a large number of workloads with a single Process Server. These may lead to a bad state of the PS and have a cascading effect on replication of VMs. Troubleshooting these issues is now made easier with additional health signals from the Process Server.

Building recommender systems with Azure Machine Learning service

Recommendation systems are used in a variety of industries, from retail to news and media. If you’ve ever used a streaming service or ecommerce site that has surfaced recommendations for you based on what you’ve previously watched or purchased, you’ve interacted with a recommendation system. With the availability of large amounts of data, many businesses are turning to recommendation systems as a critical revenue driver. However, finding the right recommender algorithms can be very time consuming for data scientists. This is why Microsoft has provided a GitHub repository with Python best practice examples to facilitate the building and evaluation of recommendation systems using Azure Machine Learning services.

Six Principles to Build Healthy Data-Driven Organizations

Organizations are increasingly forming teams around the function of Data Science. Data Science is a field that combines mathematics, programming, and visualization techniques and applies scientific methods to specific business domains or problems, like predicting future customer behavior, planning air traffic routes, or recognizing speech patterns. But what does it really mean to be a data-driven organization? InfoQ takes a look.

Azure shows

MSDN Channel 9

DevOps for ASP.NET Developers, Pt. 7

In part 7 of our series, Abel and Jeremy show us two ways to scaffold out an Azure DevOps pipeline. We'll see how to use Azure DevOps projects via the Azure portal, which gives us a UI to configure everything. They'll also show us how to use the Yo Team generator, which lets us work from the command line.

Detect Shake (Xamarin.Essentials API of the Week)

Xamarin.Essentials provides developers with cross-platform APIs for their mobile applications. On this week's Xamarin.Essentials API of the Week, we take a look at the Detect Shake API to help you detect when a user shakes a device.

YouTube

Mastering Azure using Cloud Shell, PowerShell and Bash!

Azure can be managed in many different ways. Learn your command line options like Azure PowerShell, Azure CLI, and Cloud Shell to be more efficient in managing your Azure infrastructure. Become a hero on the shell to manage the cloud!

The Azure Podcast

Episode 276 – Cloud simplified

Ryan Berry, an Azure Cloud Solutions Architect at Microsoft, talks about his YouTube channel, where he distills complex topics into bite-sized chunks so you can quickly apply these features to similar requirements you may have for moving something into Azure.

Azure DevOps Podcast

Rob Richardson on Containers in Azure

In this episode, Rob explains the critical steps when creating a container, what developers should consider when looking to run and support Containers through Azure, and much, much more.
