
Because it’s Friday: Stars in Motion


In the center of our galaxy, in the direction of the constellation Sagittarius, lies the massive black hole around which the entire galaxy revolves. The European Space Agency has observed that region for 26 years, and you can actually see stars orbiting the black hole! (via @_Astro_Nerd_)

Kinda puts things into perspective, doesn't it? Anyway, that wraps things up for this week. We'll be back again after the weekend — have a good one!


The Extremely Promising State of Diabetes Technology in 2018


This blog post is an update to these two Diabetes Technology blog posts:

You might also enjoy this video of the talk I gave at WebStock 2018 on Solving Diabetes with an Open Source Artificial Pancreas*.

First, let me tell you that insulin is too expensive in the US.

Between 2002 and 2013, the price of insulin jumped, with the typical cost for patients increasing from about $40 a vial to $130.

For some of the newer insulins like the ones I use, I pay as much as $296 a bottle. I have a Health Savings Plan so this is often out of pocket until I hit the limit for the year.

People in America are rationing insulin. This is a demonstrable fact. I've personally mailed extra insulin to folks in need. I've met young people who lost their insurance at age 26 and have had to skip shots to save vials of insulin.

This is a problem, but on the technology side there's some extremely promising work happening, and we have really hit our stride in the last ten years.

I wrote the first Glucose Management system for the PalmPilot in 1998, called GlucoPilot, and provided on-the-go, in-depth analysis for the first time. The first thing that struck me was that the PalmPilot and the blood sugar meter were the same size. Why did I need two devices with batteries, screens, buttons and a CPU? Why so many devices?

I've been told every year that a Diabetes Breakthrough is coming "in five years." It's been 25 years.

In 2001 I went on a trip across the country with my wife, an insulin pump, and 8 PDAs (personal digital assistants, the "iPhones" of the time) and tried to manage my diabetes using all the latest wireless technology...this was the latest stuff 17 years ago. I had just moved from injections to an insulin pump. Even now in 2018, insulin pumps are mostly proprietary and super expensive. In fact, many folks in the States who use insulin pumps rely on out-of-warranty pumps purchased on Craigslist.

Fast forward to 2018 and I've been using an Open Source Artificial Pancreas for two years.

The results speak for themselves. While I do have bad sugars sometimes, and I do struggle, if you look at my blood work, my HA1c (the long-term measurement of "how I'm doing") shows non-diabetic levels. To be clear - I'm fully and completely Type 1 diabetic, I produce zero insulin of my own. I take between 40 and 50 Units of insulin every day, and have for the last 25 years...but I will likely die of old age.

Open Source Artificial Pancreas === Diabetes results pic.twitter.com/ZSsApTLRXq

— Scott Hanselman (@shanselman) September 10, 2018

This is significant. Why? Because historically diabetics die of diabetes. While we wait (or more accurately, #WeAreNotWaiting) for a biological/medical solution to Type 1 diabetes, the DIY (Do It Yourself) community is just doing it ourselves.

Building on open hardware, open software, and reverse-engineered protocols for proprietary hardware, the online diabetes community literally has its choice of open source pancreases in 2018! Who would have imagined it? You can choose your algorithms, your phone, your pump, your continuous glucose meter.

Today, in 2018, you can literally change the code and recompile a personal branch of your own pancreas.

Watch my 2010 YouTube video "I am Diabetic" as I walk you through the medical hardware (pumps, needles, tubes, wires) in managing diabetes day to day. Then watch my 2018 talk on Solving Diabetes with an Open Source Artificial Pancreas*.

I believe that every diabetic should be offered a pump, a continuous glucose meter, and trained on some kind of artificial pancreas. A cloud based reporting system has also been a joy. My wife and family can see my sugar in real time when I'm away. My wife has even called me overseas to wake me up when I was in a bad sugar situation.

Artificial Pancreas generations

As the closed-hardware and closed-software medical companies work towards their own artificial pancreases, the open source community feels those companies would better serve us by opening up their protocols, using standard Bluetooth ISO profiles, and following security best practices.

Looking at the table above, the open source community is squarely in #4 and moving quickly into #5. But why did we have to do this ourselves? We got tired of waiting.

All in all, through open software and hardware, I can tell you that my life is SO MUCH BETTER than it was when I was first diagnosed. I figure we'll have this all figured out in about five years, right? ;)

THANK YOU!

MORE DIABETES READING

* Yes there are some analogies, stretched metaphors, and oversimplifications in this talk. This talk is an introduction to the space for the normally-sugared. If you are a diabetes expert you might watch and say...eh...ya, I mean, it kind of works like that. Please take the talk in the thoughtful spirit it was intended.


Sponsor: Get home early, eat supper on time and coach your kids in soccer. Moving workloads to Azure just got easy with Azure NetApp Files. Sign up to Preview Azure NetApp Files!



© 2018 Scott Hanselman. All rights reserved.
     

Introducing GitHub Pull Requests for Visual Studio Code

Announcing Azure Pipelines with unlimited CI/CD minutes for open source


With the introduction of Azure DevOps today, we’re offering developers a new CI/CD service called Azure Pipelines that enables you to continuously build, test, and deploy to any platform or cloud. It has cloud-hosted agents for Linux, macOS, and Windows, powerful workflows with native container support, and flexible deployments to Kubernetes, VMs, and serverless environments.

Microsoft is committed to fueling open source software development. Our next step in this journey is to provide the best CI/CD experience for open source projects. Starting today, Azure Pipelines provides unlimited CI/CD minutes and 10 parallel jobs to every open source project for free. All open source projects run on the same infrastructure that our paying customers use. That means you’ll have the same fast performance and high quality of service. Many of the top open source projects are already using Azure Pipelines for CI/CD, such as Atom, Cpython, Pipenv, Tox, Visual Studio Code, and TypeScript – and the list is growing every day.

In the following, you can see Atom running parallel jobs on Linux, macOS, and Windows for its CI.


Azure Pipelines app on GitHub Marketplace

Azure Pipelines has an app in the GitHub Marketplace so it’s easy to get started. After you install the app in your GitHub account, you can start running CI/CD for all your repositories.
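Once the app is installed, the pipeline itself is described by a YAML file checked into the repository. The snippet below is a hypothetical minimal sketch (the build steps and agent image are placeholder assumptions for a Node.js project, not taken from this post) illustrating the general shape of an azure-pipelines.yml file:

```yaml
# azure-pipelines.yml, placed at the root of the repository.
# Run CI on every push to master.
trigger:
- master

# Use a Microsoft-hosted Linux agent.
pool:
  vmImage: 'ubuntu-16.04'

steps:
# Each script step runs in the agent's shell.
- script: npm install
  displayName: 'Install dependencies'
- script: npm test
  displayName: 'Run tests'
```

After the file is committed, Azure Pipelines picks it up and reports build results on commits and pull requests.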


Pull Request and CI Checks

When the GitHub app is set up, you'll see CI/CD checks on each commit to your default branch and every pull request.


Our integration with the GitHub Checks API makes it easy to see build results in your pull request. If there’s a failure, the call stack is shown as well as the impacted files.


More than just open source

Azure Pipelines is also great for private repositories. It is the CI/CD solution for companies like Columbia, Shell, Accenture, and many others. It's also used by Microsoft's biggest projects like Azure, Office 365, and Bing. Our free offer for private projects includes a cloud-hosted job with 1,800 minutes of CI/CD a month, or you can run unlimited minutes of CI/CD on your own hardware, whether hosted in the cloud or on-premises. You can purchase parallel jobs for private projects from Azure DevOps or the GitHub Marketplace.

In addition to CI, Azure Pipelines has flexible deployments to any platform and cloud, including Azure, Amazon Web Services, and Google Cloud Platform, as well as any of your on-premises servers running Linux, macOS, or Windows. There are built-in tasks for Kubernetes, serverless, and VM deployments. There's also a rich ecosystem of extensions for the most popular languages and tools. The Azure Pipelines agent and tasks are open source, and we're always reviewing feedback and accepting pull requests on GitHub.

Join our upcoming live streams to learn more about Azure Pipelines and other Azure DevOps services.

  • Keynote: Watch our live Azure DevOps keynote on September 11, 2018 from 8:00 - 9:30 AM Pacific Time.

  • Live training: Join our live Mixer workshop with interactive Q&A on September 17, 2018 from 8:30 AM - 2:30 PM Pacific Time.

You can save-the-date and watch both live streams on our events page. There you’ll also find additional on-demand videos and other resources to help get you started.

I’m excited for you to try Azure Pipelines and tell us what you think. You can share your thoughts directly to the product team using @AzureDevOps, Developer Community, or comment on this post.

 

Jeremy Epling

@jeremy_epling

Introducing Azure DevOps


Today we are announcing Azure DevOps. Working with our customers and developers around the world, it’s clear DevOps has become increasingly critical to a team’s success. Azure DevOps captures over 15 years of investment and learnings in providing tools to support software development teams. In the last month, over 80,000 internal Microsoft users and thousands of our customers, in teams both small and large, used these services to ship products to you.

The services we are announcing today span the breadth of the development lifecycle to help developers ship software faster and with higher quality. They represent the most complete offering in the public cloud. Azure DevOps includes:

Azure Pipelines

CI/CD that works with any language, platform, and cloud. Connect to GitHub or any Git repository and deploy continuously. Learn More >

Azure Boards

Powerful work tracking with Kanban boards, backlogs, team dashboards, and custom reporting. Learn more >

Azure Artifacts

Maven, npm, and NuGet package feeds from public and private sources. Learn more >

Azure Repos

Unlimited cloud-hosted private Git repos for your project. Collaborative pull requests, advanced file management, and more. Learn more >

Azure Test Plans

All-in-one planned and exploratory testing solution. Learn more >

Each Azure DevOps service is open and extensible. They work great for any type of application regardless of the framework, platform, or cloud. You can use them together for a full DevOps solution or with other services. If you want to use Azure Pipelines to build and test a Node service from a repo in GitHub and deploy it to a container in AWS, go for it. Azure DevOps supports both public and private cloud configurations. Run them in our cloud or in your own data center. No need to purchase different licenses. Learn more about Azure DevOps pricing.

Here's an example of Azure Pipelines used independently to build a GitHub repo:


Alternatively, here’s an example of a developer using all Azure DevOps services together from the vantage point of Azure Boards.


Open Source projects receive free CI/CD with Azure Pipelines

As an extension of our commitment to provide open and flexible tools for all developers, Azure Pipelines offers free CI/CD with unlimited minutes and 10 parallel jobs for every open source project. With cloud hosted Linux, macOS and Windows pools, Azure Pipelines is great for all types of projects.

Many of the top open source projects are already using Azure Pipelines for CI/CD, such as Atom, Cpython, Pipenv, Tox, Visual Studio Code, and TypeScript – and the list is growing every day.

We want everyone to have extremely high quality of service. Accordingly, we run open source projects on the same infrastructure that our paying customers use.

Azure Pipelines is also now available in the GitHub Marketplace making it easy to get setup for your GitHub repos, open source or otherwise. 

Here’s a walkthrough of Azure Pipelines:

Learn more >

The evolution of Visual Studio Team Services (VSTS) 

Azure DevOps represents the evolution of Visual Studio Team Services (VSTS). VSTS users will be upgraded into Azure DevOps projects automatically. For existing users, there is no loss of functionality, simply more choice and control. The end-to-end traceability and integration that has been the hallmark of VSTS is all there. Azure DevOps services work great together. Today is the start of a transformation, and over the next few months existing users will begin to see changes show up. What does this mean?

  • URLs will change from abc.visualstudio.com to dev.azure.com/abc. We will support redirects from visualstudio.com URLs so there will not be broken links.
  • As part of this change, the services have an updated user experience. We continue to iterate on the experience based on feedback from the preview. Today we’re enabling it by default for new customers. In the coming months we will enable it by default for existing users.
  • Users of the on-premises Team Foundation Server (TFS) will continue to receive updates based on features live in Azure DevOps. Starting with the next version of TFS, the product will be called Azure DevOps Server and will continue to be enhanced through our normal cadence of updates.

Learn more

To learn more about Azure DevOps, please join us:

  • Keynote: Watch our live Azure DevOps keynote on September 11, 2018 from 8:00 - 9:30 AM Pacific Time.

  • Live training: Join our live Mixer workshop with interactive Q&A on September 17, 2018 from 8:30 AM - 2:30 PM Pacific Time.

You can save-the-date and watch both live streams on our events page. There you’ll also find additional on-demand videos and other resources to help get you started.

We couldn’t be more excited to offer Azure DevOps to you and your teams. We can’t wait to see what amazing things you create with it.

What’s new in Azure DevOps Launch Update

We shared some exciting news this morning on the Azure blog: We’re bringing Azure DevOps to developers and enterprises around the world who need flexible and efficient tools for their development process. With the evolution of Visual Studio Team Services (VSTS) into Azure DevOps, current VSTS customers will shortly receive a Launch Update that seamlessly... Read More

Learn how Key Vault is used to secure the Healthcare AI Blueprint


System security is a top priority for any healthcare organization. There are many types of security including physical, network, application, email and so on. This article covers the system security provided by Azure Key Vault. Specifically, we examine the Key Vault implementation used in the Azure Healthcare blueprint. The intent is to demonstrate how a Key Vault works by seeing it used with the blueprint.


Securing sensitive data in the real world

In a healthcare organization there are potentially dozens (or hundreds) of users that need access to sensitive data from diverse sources. Doctors, technicians, receptionists — some need access to just x-rays, some to payment schedules, and doctors need patient records. The matrix of users and data stores can be large. Managing so many permissions could be a nightmare. For dashboards or other user interfaces, permission needs to be granted to service accounts. For example, in machine learning a data scientist may need to query data from many data repositories to find correlations, and will need appropriate rights to those data stores.

In the blueprint, a Key Vault stores data like passwords and secrets that system users need to access things like databases and Machine Learning Studio (MLS).

What Key Vault provides

Most complex systems require a secure and reliable way to store various types of data. This is especially true in regulated environments like healthcare, where patient and other confidential information must be protected.

A Key Vault provides separation of the application from any cryptography needed by that application. For example, a custom application deployed to Azure can access any keys, secrets, or certificates it has permission to access from a URL at runtime.

Used to provide reliable technical security, Key Vault stores keys, certificates, and what are referred to as “secrets.” A secret may be any string that is sensitive, such as a database connection string or a password.

Keys stored in key vaults are safeguarded by hardware security modules (HSMs), special trusted network computers that perform a variety of cryptographic operations like key storage and management. Key Vaults take advantage of HSMs and are FIPS 140-2 Level 2 validated.

Using Key Vault

There may be more than one Key Vault in an Azure tenant. It is a good practice to dedicate one Key Vault to a single application or system and another to a different system. This ensures Key Vaults respect the single responsibility principle by separating application- or system-specific stored items. Single-purpose Key Vaults are easier to manage than a single vault storing data from all systems and applications.

Anyone with an Azure subscription can create and use Key Vaults. The person who creates the Key Vault is its owner. In the Healthcare AI Blueprint, the user Alex_siteadmin has full access to the information in the Key Vault created using her credentials during the initial installation.

Users may access data in the vault after the vault owner provides access permissions. The owner provides developers with URIs to call from their applications.

In the blueprint system, Alex_siteadmin may access the sensitive data (certificates, keys, secrets) in the vault. To do this via the portal, simply click the Key Vault resource and look under the “settings” section.


Click the type of setting you wish to see: keys, secrets or certificates. After clicking Secrets, one finds Sqldb-ConnectionString in the list shown below.

Clicking Sqldb-ConnectionString reveals the URL to the connection string secret. Although Alex_siteadmin can see the URL to access the connection string, she cannot see the connection string itself.

Using Key Vault from an application

Key Vaults provide URLs for resource access in applications running under authorized accounts. PowerShell and Azure CLI commands are available to retrieve keys and secrets from vaults, but accessing the key from a running application is typically the primary intent for storing items in the vault.

Secrets, keys and certificates are all available via a URL, but only if the current user has permissions to the item in the vault. Access policies must be set for users and service accounts to access vault items.

Managing Key Vaults

As mentioned above, it is a recommended practice to have separate instances of Key Vaults for different applications. This makes managing a Key Vault much simpler than storing secrets from multiple applications in a single vault.

It is also important to manage access policies for the various items in the vault. Accounts should be configured to have access solely to those items they need — and nothing else. Permissions are set via access policies which provide fine-grained control of permissions to items in the vault.

Deleting and restoring Key Vaults

“Soft delete” is a Key Vault feature that may be enabled on a vault. When it is enabled and a Key Vault is deleted, the vault remains recoverable for 90 days. It disappears from the Azure portal and looks as if it has been completely deleted, like any other resource or service in Azure. This isn’t the case, however. It is held by Azure for 90 days and can be restored for any reason. Because of this precaution, a new Key Vault with the same name cannot be added to the Azure subscription until the “soft deleted” vault is truly deleted.

Restoring a Key Vault

To restore a Key Vault, owners may turn to PowerShell or the Azure CLI. In PowerShell, execute the following command.

Undo-AzureRmKeyVaultRemoval -VaultName <vault name> -ResourceGroupName <resource group> -Location <location>

Deleting a soft-deleted Key Vault

PowerShell and the Azure CLI also provide commands for permanently deleting a soft-deleted vault. This can be used to delete the vault before the 90-day period has elapsed. This is the PowerShell command:

Remove-AzureRmKeyVault -VaultName <vault name> -InRemovedState -Location <location>

Wrapping up

The Azure Healthcare AI Blueprint makes extensive use of Key Vault. It helps to ensure HIPAA compliance and uphold HITRUST certification. Create a separate Key Vault to experiment with in your Azure environment. Do not change any items in the app’s Key Vault — otherwise, the application may stop working.

Key Vault focuses specifically on security. It is the central store for keys, certificates, and secrets like database connection strings, passwords, and other sensitive information. When looking to secure your data and security keys, start your journey with Key Vault.

Recommended next steps:

  1. Do the quickstart: Set and retrieve a secret from Azure Key Vault using PowerShell. In this quickstart, you use PowerShell to create a key vault. You then store a secret in the newly created vault.
  2. Read the free solution guide Implementing the Azure blueprint for healthcare. Leverage the knowledge from this guide to deploy the Azure Healthcare AI Blueprint and explore how the blueprint is secured using key vault.

Azure.Source – Volume 48


Now in preview

Avere vFXT for Microsoft Azure now in public preview - The Avere vFXT for Azure is a high-performance cache for HPC workloads in the cloud that provides low-latency access to Azure compute from Azure Blob and file-based storage locations. The Avere vFXT for Azure has no charge associated with licensing; however, costs associated with consumption do apply at normal rates. Avere Systems joined Microsoft earlier this year.

Powerful Debugging Tools for Spark for Azure HDInsight - Microsoft runs one of the largest big data clusters in the world – internally called “Cosmos” – which runs millions of jobs across hundreds of thousands of servers over multiple exabytes of data. Enabling developers to run and manage jobs at this scale was a huge challenge. We built powerful tools that graphically show the entire job graph, which helped developers greatly, and we want to bring that power to all HDInsight Spark developers. The default Spark history server user experience is now enhanced in HDInsight with rich information on your Spark jobs and powerful interactive visualization of job graphs and data flows. The new features greatly assist Spark developers in job data management, data sampling, job monitoring, and job diagnosis.

Screenshot of an example Spark job graph displaying Spark job execution details with data input and output across stages

Also in preview

Azure Friday

Azure Friday | Enhanced productivity using Azure Data Factory visual tools - Gaurav Malhotra joins Scott Hanselman to discuss the Azure Data Factory visual tools, which enable you to iteratively create, configure, test, deploy, and monitor data integration pipelines. We took into account your feedback to enable functional, performance, and security improvements to the visual tools.

Now generally available

The Azure Podcast

The Azure Podcast | Episode 245 - Azure Certifications - Microsoft Consultants Doug Strother and John Miller, both veterans of certifications, share some tips on getting your Azure certification.

News and updates

Exciting new capabilities on Azure HDInsight - Ashish Thapliyal, Principal Program Manager, Azure HDInsight, provides a roll-up of the various updates for Azure HDInsight from the past few months, including Apache Phoenix and Zeppelin integration, Oozie support in the HDInsight enterprise security package, Azure Data Lake Storage Gen2 integration, Virtual Network Service Endpoints support, support for Spark 2.3, and more.

Additional news and updates

The IoT Show

Screenshot from Monitoring And Diagnostics of an IoT Solution with Azure IoT Hub video

The IoT Show | Monitoring And Diagnostics of an IoT Solution with Azure IoT Hub - When your IoT solution is misbehaving, what do you do? John Lian, PM on the Azure IoT team, shows us some ways to configure proper alerts and get the right logs to troubleshoot common problems.

Screenshot from IoT In Action - Introducing Azure Sphere video

The IoT Show | IoT In Action - Introducing Azure Sphere - Check out this sneak peek into the upcoming Special Edition of the IoT in Action Webinar Series – An Introduction to Microsoft Azure Sphere! Azure Sphere is a solution for creating highly-secured, connected microcontroller unit (MCU) powered devices, providing you with the confidence and the power to reimagine your business and create the future.

Technical content and training

Transparent data encryption or always encrypted? - Transparent Data Encryption (TDE) and Always Encrypted are two different encryption technologies offered by SQL Server and Azure SQL Database. Generally, encryption protects data from unauthorized access in different scenarios. They are complementary features, and this blog post will show a side-by-side comparison to help decide which technology to choose and how to combine them to provide a layered security approach.

Connected arms: A closer look at the winning idea from Imagine Cup 2018 - Joseph Sirosh, Corporate Vice President, Artificial Intelligence & Research, takes a closer look at the champion project of Imagine Cup 2018, smartARM. smartARM bridges the gap between low-cost, low-function and high-cost, higher-function below-elbow prostheses through its use of low-cost materials and AI. By combining low-cost materials with the cloud and AI, Samin Khan and Hamayal Choudhry present smartARM as an affordable yet personalized alternative to state-of-the-art advanced prosthetic arms.

Additional technical content

Azure tips & tricks

Screenshot from Learn about the underlying Software in Azure Cloud Shell video

Learn about the underlying Software in Azure Cloud Shell - Learn about the software found inside an Azure Cloud Shell instance. Get a deeper look into what happens when you fire up Azure Cloud Shell and what is happening in the underlying operating system.

Screenshot from How to use tags to quickly organize Azure Resources video

How to use tags to quickly organize Azure Resources - Learn how to take advantage of tags to organize your Azure resources. Watch how to use tags to add additional metadata, allowing you to categorize and search for your resources in the Azure portal.

Events

Join us for two days of live streaming activity that will help you learn all the details of what was announced for the Azure DevOps set of services for developers, the Azure Pipelines CI/CD platform, and new GitHub extensions from Microsoft:

Azure DevOps Keynote - Keynote: Watch our live Azure DevOps keynote on September 11, 2018 from 8:00 - 9:30 AM Pacific Time (GMT -7).

Azure DevOps Workshop - Join our live Mixer workshop with interactive Q&A on September 17, 2018 from 8:30 AM - 2:30 PM Pacific Time (GMT -7).

Tuesdays with Corey

Screenshot from Azure Event Hubs with Kafka coolness video

Tuesdays with Corey | Azure Event Hubs with Kafka coolness - Corey Sanders, Corporate VP - Microsoft Azure Compute team sat down with Dan Rosanova, Principal PM responsible for Azure messaging services. Dan shares some new functionality around Event Hubs with Kafka compatibility.

Industries

Delivering innovation in retail with the flexible and productive Microsoft AI platform - Learn how the application of AI is transforming the retail and consumer goods industry resulting in positive business improvements aimed at solving a range of service to production-type problems. Whether you want to use pre-built AI APIs or develop custom models with your data, Microsoft's AI platform offers flexibility using the infrastructure and tools available in the platform. Determine the best path forward by understanding each approach.

Save money on actuarial compute by retiring your on-premises HPC grids - Every insurance company sees the increasing demand from growth in the number of policies processed, and new regulations that require changes to the actuarial and accounting systems. FRS-17 requires changes to workflows, reporting and control throughout the actuarial and accounting process. Read this post to learn why now is the time to move to a cloud-based solution on Azure, and download the Actuarial risk compute and modeling overview to understand your options for moving your compute workload to Azure.

Diagram showing 27 logos representing various industry compliance programs

Finish your insurance actuarial modeling in hours, not days - Actuarial departments everywhere work to make sure that their financial and other models produce results which can be used to evaluate their business for regulatory and internal needs. Today, it is common for quarterly reporting to require thousands of hours of compute time. Most actuarial packages on the market support in-house grids with the ability to either extend the grid to Azure, or to use Azure as the primary environment. Learn how actuarial teams review their models and processes to maximize the scale benefits of the cloud.

Anti-money laundering – Microsoft Azure helping banks reduce false positives - Anti-Money Laundering (AML) Transaction Monitoring Systems (TMS) identify suspicious transactions that may involve illicit proceeds or legitimate proceeds used for illegal purposes. This regulatory blunt-force approach has resulted in massive inefficiency and high operating costs for banks. The predominant TMS technologies are antiquated batch rule-based systems that have not fundamentally changed since the late ‘90s and have not kept pace with increasing regulatory expectations and continually evolving money laundering techniques. Learn how the application of ML and AI can quickly flag changes in patterns of activity, whether caused by new product offerings or money laundering.

Describe, diagnose, and predict with IoT Analytics - The whole point of collecting IoT data is to extract actionable insights. Insights that will trigger some sort of action that will result in some business value such as optimized factory operations, improved product quality, better understanding of customer demand, new sources of revenue, and improved customer experience. This post discusses the extraction of value out of IoT data by focusing on the analytics part of the story, the four types of analytics and how they map to IoT, the role of machine learning in IoT analytics, and an overview of a solution architecture that highlights the components of an IoT solution with analytics.

Solution diagram showing components of an IoT Analytics Solution

Current use cases for machine learning in retail and consumer goods - Retail and consumer goods companies are seeing the applicability of machine learning (ML) to drive improvements in customer service and operational efficiency. For example, the Azure cloud is helping retail and consumer brands improve the shopping experience by ensuring shelves are stocked and product is always available when, where and how the consumer wants to shop. Check out 8 ML use cases to improve service and provide benefits of optimization, automation and scale. In addition, this post has links to a lot of other resources you should check out, such as an ML algorithm cheat sheet.

Where is the market opportunity for developers of IoT solutions? - There is a clear opportunity for SIs with practices around IoT and data analytics to develop IoT solutions customized to their customers. The IoT space is a complex, fragmented, and crowded ecosystem. As such, it is hard for software developers to identify where exactly the opportunity is. This post discusses the biggest market opportunities for developers of IoT solutions. There is also an opportunity for ISVs in the development of IoT platforms and pre-built IoT apps to accelerate the development of IoT solutions.

Reduce costs by optimizing actuarial risk compute on Azure - Actuarial risk modeling is a compute-intensive operation. Employing thousands of server cores for uneven workloads, such as monthly and quarterly valuation and production runs to meet regulatory requirements, these compute-intensive jobs are well suited to take advantage of the near-bottomless resources of the cloud. Learn how Azure enables you to access more compute power when needed, without having to manage it. And read the actuarial risk compute and modeling overview to understand your options for moving your compute workload to Azure.

A Cloud Guru's Azure This Week

Screenshot from A Cloud Guru's Azure This Week - 7 September 2018 video

A Cloud Guru's Azure This Week - 7 September 2018 - In this episode of Azure This Week, Dean takes a look at OMS and its new home in the Azure Portal, new features of Visual Studio Live Share, and a brand new Azure course on A Cloud Guru.


Using C++17 Parallel Algorithms for Better Performance


C++17 added support for parallel algorithms to the standard library, to help programs take advantage of parallel execution for improved performance. MSVC first added experimental support for some algorithms in 15.5, and the experimental tag was removed in 15.7.

The interface described in the standard for the parallel algorithms doesn’t say exactly how a given workload is to be parallelized. In particular, the interface is intended to express parallelism in a general form that works for heterogeneous machines, allowing SIMD parallelism like that exposed by SSE, AVX, or NEON, vector “lanes” like that exposed in GPU programming models, and traditional threaded parallelism.

Our parallel algorithms implementation currently relies entirely on library support, not on special support from the compiler. This means our implementation will work with any tool currently consuming our standard library, not just MSVC’s compiler. In particular, we test that it works with Clang/LLVM and the version of EDG that powers Intellisense.

How to: Use Parallel Algorithms

To use the parallel algorithms library, you can follow these steps:

  1. Find an algorithm call you wish to optimize with parallelism in your program. Good candidates are algorithms which do more than O(n) work like sort, and show up as taking reasonable amounts of time when profiling your application.
  2. Verify that code you supply to the algorithm is safe to parallelize.
  3. Choose a parallel execution policy. (Execution policies are described below.)
  4. If you aren’t already, #include <execution> to make the parallel execution policies available.
  5. Add one of the execution policies as the first parameter to the algorithm call to parallelize.
  6. Benchmark the result to ensure the parallel version is an improvement. Parallelizing is not always faster, particularly for non-random-access iterators, or when the input size is small, or when the additional parallelism creates contention on external resources like a disk.

For the sake of example, here’s a program we want to make faster. It times how long it takes to sort a million doubles.

// compile with:
//  debug: cl /EHsc /W4 /WX /std:c++latest /Fedebug /MDd .\program.cpp
//  release: cl /EHsc /W4 /WX /std:c++latest /Ferelease /MD /O2 .\program.cpp
#include <stddef.h>
#include <stdio.h>
#include <algorithm>
#include <chrono>
#include <random>
#include <ratio>
#include <vector>

using std::chrono::duration;
using std::chrono::duration_cast;
using std::chrono::high_resolution_clock;
using std::milli;
using std::random_device;
using std::sort;
using std::vector;

const size_t testSize = 1'000'000;
const int iterationCount = 5;

void print_results(const char *const tag, const vector<double>& sorted,
                   high_resolution_clock::time_point startTime,
                   high_resolution_clock::time_point endTime) {
  printf("%s: Lowest: %g Highest: %g Time: %fms\n", tag, sorted.front(),
         sorted.back(),
         duration_cast<duration<double, milli>>(endTime - startTime).count());
}

int main() {
  random_device rd;

  // generate some random doubles:
  printf("Testing with %zu doubles...\n", testSize);
  vector<double> doubles(testSize);
  for (auto& d : doubles) {
    d = static_cast<double>(rd());
  }

  // time how long it takes to sort them:
  for (int i = 0; i < iterationCount; ++i)
  {
    vector<double> sorted(doubles);
    const auto startTime = high_resolution_clock::now();
    sort(sorted.begin(), sorted.end());
    const auto endTime = high_resolution_clock::now();
    print_results("Serial", sorted, startTime, endTime);
  }
}

Parallel algorithms depend on available hardware parallelism, so ensure you test on hardware whose performance you care about. You don’t need a lot of cores to show wins, and many parallel algorithms are divide and conquer problems that won’t show perfect scaling with thread count anyway, but more is still better. For the purposes of this example, we tested on an Intel 7980XE system with 18 cores and 36 threads. In that test, debug and release builds of this program produced the following output:

.\debug.exe
Testing with 1000000 doubles...
Serial: Lowest: 1349 Highest: 4.29497e+09 Time: 310.176500ms
Serial: Lowest: 1349 Highest: 4.29497e+09 Time: 304.714800ms
Serial: Lowest: 1349 Highest: 4.29497e+09 Time: 310.345800ms
Serial: Lowest: 1349 Highest: 4.29497e+09 Time: 303.302200ms
Serial: Lowest: 1349 Highest: 4.29497e+09 Time: 290.694300ms

C:\Users\bion\Desktop>.\release.exe
Testing with 1000000 doubles...
Serial: Lowest: 2173 Highest: 4.29497e+09 Time: 74.590400ms
Serial: Lowest: 2173 Highest: 4.29497e+09 Time: 75.703500ms
Serial: Lowest: 2173 Highest: 4.29497e+09 Time: 87.839700ms
Serial: Lowest: 2173 Highest: 4.29497e+09 Time: 73.822300ms
Serial: Lowest: 2173 Highest: 4.29497e+09 Time: 73.757400ms

Next, we need to ensure our sort call is safe to parallelize. Algorithms are safe to parallelize if the “element access functions” — that is, iterator operations, predicates, and anything else you ask the algorithm to do on your behalf follow the normal “any number of readers or at most one writer” rules for data races. Moreover, they must not throw exceptions (or throw exceptions rarely enough that terminating the program if they do throw is OK).

Next, choose an execution policy. Currently, the standard includes the parallel policy, denoted by std::execution::par, and the parallel unsequenced policy, denoted by std::execution::par_unseq. In addition to the requirements exposed by the parallel policy, the parallel unsequenced policy requires that your element access functions tolerate weaker than concurrent forward progress guarantees. That means that they don’t take locks or otherwise perform operations that require threads to concurrently execute to make progress. For example, if a parallel algorithm runs on a GPU and tries to take a spinlock, the thread spinning on the spinlock may prevent other threads on the GPU from ever executing, meaning the spinlock may never be unlocked by the thread holding it, deadlocking the program. You can read more about the nitty gritty requirements in the [algorithms.parallel.defns] and [algorithms.parallel.exec] sections of the C++ standard. If in doubt, use the parallel policy. In this example, we are using the built-in double less-than operator which doesn’t take any locks, and an iterator type provided by the standard library, so we can use the parallel unsequenced policy.

Note that the Visual C++ implementation implements the parallel and parallel unsequenced policies the same way, so you should not expect better performance for using par_unseq on our implementation, but implementations may exist that can use that additional freedom someday.

In the doubles sort example above, we can now add

#include <execution>

to the top of our program. Since we’re using the parallel unsequenced policy, we add std::execution::par_unseq to the algorithm call site. (If we were using the parallel policy, we would use std::execution::par instead.) With this change the for loop in main becomes the following:

for (int i = 0; i < iterationCount; ++i)
{
  vector<double> sorted(doubles);
  const auto startTime = high_resolution_clock::now();
  // same sort call as above, but with par_unseq:
  sort(std::execution::par_unseq, sorted.begin(), sorted.end());
  const auto endTime = high_resolution_clock::now();
  // in our output, note that these are the parallel results:
  print_results("Parallel", sorted, startTime, endTime);
}

Last, we benchmark:

.\debug.exe
Testing with 1000000 doubles...
Parallel: Lowest: 6642 Highest: 4.29496e+09 Time: 54.815300ms
Parallel: Lowest: 6642 Highest: 4.29496e+09 Time: 49.613700ms
Parallel: Lowest: 6642 Highest: 4.29496e+09 Time: 49.504200ms
Parallel: Lowest: 6642 Highest: 4.29496e+09 Time: 49.194200ms
Parallel: Lowest: 6642 Highest: 4.29496e+09 Time: 49.162200ms

.\release.exe
Testing with 1000000 doubles...
Parallel: Lowest: 18889 Highest: 4.29496e+09 Time: 20.971100ms
Parallel: Lowest: 18889 Highest: 4.29496e+09 Time: 17.510700ms
Parallel: Lowest: 18889 Highest: 4.29496e+09 Time: 17.823800ms
Parallel: Lowest: 18889 Highest: 4.29496e+09 Time: 20.230400ms
Parallel: Lowest: 18889 Highest: 4.29496e+09 Time: 19.461900ms

The result is that the program is faster for this input. How you benchmark your program will depend on your own success criteria. Parallelization does have some overhead and will be slower than the serial version if N is small enough, depending on memory and cache effects, and other factors specific to your particular workload. In this example, if I set N to 1000, the parallel and serial versions run at approximately the same speed, and if I change N to 100, the serial version is 10 times faster. Parallelization can deliver huge wins but choosing where to apply it is important.

Current Limitations of the MSVC Implementation of Parallel Algorithms

We built the parallel reverse, and it was 1.6x slower than the serial version on our test hardware, even for large values of N. We also tested with another parallel algorithms implementation, HPX, and got similar results. That doesn’t mean it was wrong for the standards committee to add those to the STL; it just means the hardware our implementation targets didn’t see improvements. As a result we provide the signatures for, but do not actually parallelize, algorithms which merely permute, copy, or move elements in sequential order. If we get feedback with an example where parallelism would be faster, we will look into parallelizing these. The affected algorithms are:

  • copy
  • copy_n
  • fill
  • fill_n
  • move
  • reverse
  • reverse_copy
  • rotate
  • rotate_copy
  • swap_ranges

Some algorithms are unimplemented at this time and will be completed in a future release. The algorithms we parallelize in Visual Studio 2017 15.8 are:

  • adjacent_difference
  • adjacent_find
  • all_of
  • any_of
  • count
  • count_if
  • equal
  • exclusive_scan
  • find
  • find_end
  • find_first_of
  • find_if
  • for_each
  • for_each_n
  • inclusive_scan
  • mismatch
  • none_of
  • reduce
  • remove
  • remove_if
  • search
  • search_n
  • sort
  • stable_sort
  • transform
  • transform_exclusive_scan
  • transform_inclusive_scan
  • transform_reduce

Design Goals for MSVC’s Parallel Algorithms Implementation

While the standard specifies the interface of the parallel algorithms library, it doesn’t say at all how algorithms should be parallelized, or even on what hardware they should be parallelized. Some implementations of C++ may parallelize by using GPUs or other heterogeneous compute hardware if available on the target. copy doesn’t make sense for our implementation to parallelize, but it does make sense on an implementation that targets a GPU or similar accelerator. We value the following aspects in our implementation:

Composition with platform locks

Microsoft previously shipped a parallelism framework, ConcRT, which powered parts of the standard library. ConcRT allows disparate workloads to transparently use the hardware available, and lets threads complete each other’s work, which can increase overall throughput. Basically, whenever a thread would normally go to sleep running a ConcRT workload, it suspends the chore it’s currently executing and runs other ready-to-run chores instead. This non-blocking behavior reduces context switches and can produce higher overall throughput than the Windows threadpool our parallel algorithms implementation uses. However, it also means that ConcRT workloads do not compose with operating system synchronization primitives like SRWLOCK, NT events, semaphores, COM single threaded apartments, window procedures, etc. We believe that is an unacceptable trade-off for the “by default” implementation in the standard library.

The standard’s parallel unsequenced policy allows a user to declare that they support the kinds of limitations lightweight user-mode scheduling frameworks like ConcRT have, so we may look at providing ConcRT-like behavior in the future. At the moment however, we only have plans to make use of the parallel policy. If you can meet the requirements, you should use the parallel unsequenced policy anyway, as that may lead to improved performance on other implementations, or in the future.

Usable performance in debug builds

We care about debugging performance. Solutions that require the optimizer to be turned on to be practical aren’t suitable for use in the standard library. If I add a Concurrency::parallel_sort call to the previous example program, ConcRT’s parallel sort is a bit faster in release but almost 100 times slower in debug:

for (int i = 0; i < iterationCount; ++i)
{
  vector<double> sorted(doubles);
  const auto startTime = high_resolution_clock::now();
  Concurrency::parallel_sort(sorted.begin(), sorted.end());
  const auto endTime = high_resolution_clock::now();
  print_results("ConcRT", sorted, startTime, endTime);
}

C:\Users\bion\Desktop>.\debug.exe
Testing with 1000000 doubles...
ConcRT: Lowest: 5564 Highest: 4.29497e+09 Time: 23910.081300ms
ConcRT: Lowest: 5564 Highest: 4.29497e+09 Time: 24096.297700ms
ConcRT: Lowest: 5564 Highest: 4.29497e+09 Time: 23868.098500ms
ConcRT: Lowest: 5564 Highest: 4.29497e+09 Time: 24159.756200ms
ConcRT: Lowest: 5564 Highest: 4.29497e+09 Time: 24950.541500ms

C:\Users\bion\Desktop>.\release.exe
Testing with 1000000 doubles...
ConcRT: Lowest: 1394 Highest: 4.29496e+09 Time: 19.019000ms
ConcRT: Lowest: 1394 Highest: 4.29496e+09 Time: 16.348000ms
ConcRT: Lowest: 1394 Highest: 4.29496e+09 Time: 15.699400ms
ConcRT: Lowest: 1394 Highest: 4.29496e+09 Time: 15.907100ms
ConcRT: Lowest: 1394 Highest: 4.29496e+09 Time: 15.859500ms

Composition with other programs on the system and parallelism frameworks

Scheduling in our implementation is handled by the Windows system thread pool. The thread pool takes advantage of information not available to the standard library, such as what other threads on the system are doing, what kernel resources threads are waiting for, and similar. It chooses when to create more threads, and when to terminate them. It’s also shared with other system components, including those not using C++.

For more information about the kinds of optimizations the thread pool does on your (and our) behalf, check out Pedro Teixeira’s talk on the thread pool, as well as official documentation for the CreateThreadpoolWork, SubmitThreadpoolWork, WaitForThreadpoolWorkCallbacks, and CloseThreadpoolWork functions.

Above all, parallelism is an optimization

If we can’t come up with a practical benchmark where the parallel algorithm wins for reasonable values of N, it will not be parallelized. We view being twice as fast at N=1’000’000’000 and 3 orders of magnitude slower when N=100 as an unacceptable tradeoff. If you want “parallelism no matter what the cost” there are plenty of other implementations which work with MSVC, including HPX and Threading Building Blocks.

Similarly, the C++ standard permits parallel algorithms to allocate memory, and throw std::bad_alloc when they can’t acquire memory. In our implementation we fall back to a serial version of the algorithm if additional resources can’t be acquired.

Benchmark parallel algorithms and speed up your own applications

Algorithms that take more than O(n) time (like sort) and are called with larger N than 2,000 are good places to consider applying parallelism. We want to make sure this feature behaves as you expect; please give it a try. If you have any feedback or suggestions for us, let us know. We can be reached via the comments below, via email (visualcpp@microsoft.com) and you can provide feedback via Help > Report A Problem in the product, or via Developer Community. You can also find us on Twitter (@VisualC) and Facebook (msftvisualcpp).

Application Insights improvements for Java and Node.js


Did you know that Application Insights supports Java and Node.js? That's because at Microsoft our mission is to empower every person and every organization on the planet to achieve more. For those of us on the Azure Application Insights team, every person means every developer, DevOps practitioner and site reliability engineer - regardless of the tech stack that they use.

That's why we've been working for over a year now to enable Java and Node.js teams to have a first-class monitoring experience in both their Azure and on-premises environments. So today I'm proud to share with you some of what our team has already accomplished, and I'm excited about the features and improvements that we will be continuing to release over the next several months. But first, let's talk about Java.

Application Insights for Java

I Code Java

The second version of our Application Insights for Java SDK was released to Maven/Gradle and GitHub earlier this year, and the team has continued to crank out improvements since then, most recently with version 2.1.2. In addition to a myriad of bug fixes, the team has also added support for fixed rate sampling, enhanced support for Log4J, and cross-component telemetry correlation. We also auto detect the usage of the most popular app frameworks, storage clients and communication libraries so that developers are not required to instrument their applications in any special way.

Spring by Pivotal logo

Java developers love the Spring framework and so do we. Because of this, we've also spent time improving our overall support of Spring in general and published a Spring Boot Starter compatible with the Spring Initializr application bootstrapping tool to help Spring developers configure Application Insights quickly and easily. As our love continues to grow, expect to see us continue to improve the monitoring experience for Spring teams in the future.

Finally, be sure to check out our quick start on Application Insights for Java to learn more.

Application Insights for Node.js

Node.js logo

Now, on to Node.js. The stable version of our Application Insights for Node.js SDK has been available on npm and GitHub for nearly a year now, and we continue to make improvements to it. Most recently we hardened its security to prevent developers from accidentally using compromised versions of the TLS protocol, added support for HTTP(S) proxies, and, like Java, offer a great collection of auto-collectors to reduce the need for manual application instrumentation. As always, our quick start on Application Insights for Node.js is a great place to find out more.

Looking to the future

As mentioned, not only am I proud of what we've accomplished recently, but I'm also excited about what the future has in store. We already dropped some hints when we announced that we've joined the OpenCensus project, which is an open source project with a goal to achieve "a single distribution of libraries for metrics and distributed tracing with minimal overhead." Look for us to make more contributions there, and to expand our support to additional stacks and technologies.

Speaking of contributions, all of Application Insights SDKs are open source, including Java and Node.js. The team loves to hear feedback via issues and pull requests, so feel free to come develop with us!

Lastly, if you have any questions or comments, please leave them in the comments below.

AI helps troubleshoot an intermittent SQL Database performance issue in one day


In this blog post, you will learn how the Azure SQL Database intelligent performance feature Intelligent Insights successfully helped a customer troubleshoot a hard-to-find intermittent database performance issue, which had persisted for 6 months, in just a single day. You will find out how Intelligent Insights helps an ISV operate 60,000 databases by identifying related performance issues across their database fleet. You will also learn how Intelligent Insights helped an enterprise seamlessly identify a hard-to-troubleshoot performance degradation issue on a large-scale 35TB database fleet.

Azure SQL Database, the most intelligent cloud database, is empowering small and medium-sized businesses and large enterprises to focus on writing awesome applications while entrusting Azure to autonomously take care of running, scaling, and maintaining peak performance, with a minimum of human interaction or advanced technical skills required.

Intelligent Insights is a disruptive new intelligent performance technology leveraging the power of artificial intelligence (AI) to continuously monitor and troubleshoot Azure SQL Database performance issues with pinpoint accuracy and at a scale simply not possible before. The solution’s performance troubleshooting models were fine-tuned and advanced based on learnings from a massive workload pool of several million Azure SQL Databases.

In the period since its public preview debut in September 2017, we have witnessed some remarkable customer success stories with Intelligent Insights that I would like to share with you.

Troubleshooting hard-to-find intermittent SQL Database performance issues

NewOrbit is an ISV based in the United Kingdom building and running software on Azure for startups and enterprises, including a system for background checking of employees and tenants. Thousands of customers check more than 400,000 people each year, and the system does so by integrating information from various services. NewOrbit runs entirely on Azure and uses about 200 SQL Databases. NewOrbit has a flat organizational structure consisting of developers and customer account managers only. They do not employ a DevOps team or DBAs, as they rely on SQL Database built-in intelligence to automatically tune and troubleshoot database performance for them. As NewOrbit CTO Frans Lytzen puts it, “I have Azure for that”.

NewOrbit experienced an intermittent performance issue, lasting for about 6 months, that resulted in an increased number of timeouts for existing systems running in production. Their first reaction was to upgrade to a higher pricing tier on Azure with more capacity; however, this did not help, which seemed very strange to them. Since they could see more than enough spare DTUs on their Azure subscription, NewOrbit understood right away that dialing up the capacity would not make the issue go away.

NewOrbit decided to try Intelligent Insights. As soon as they fired up the solution, it showed that there appeared to be memory pressure on their databases. This was not immediately obvious, and NewOrbit understood that the memory pressure was most likely not the cause but an effect of another underlying problem.

Within a day of being turned on, Intelligent Insights pointed out several queries suspected as the root cause of the memory pressure. NewOrbit promptly fixed and deployed new queries to production, witnessing an immediate decrease in memory pressure the following day.

“On an earlier occasion, we had a performance issue lasting for about 6 months. Before Intelligent Insights we have not had a way of figuring out where do we even start troubleshooting. Intelligent Insights gave us a list of things to do. What Intelligent Insight does is that it enables us to pinpoint where the problem is and to get a fix deployed within 24hrs.”

Frans Lytzen, CTO, NewOrbit.

As NewOrbit grows as a service, their databases and queries are getting bigger and more complex. They have now implemented a continuous improvement program relying on Intelligent Insights to regularly review and optimize database queries for the best performance of their applications.

Identifying performance issues in the sea of 60,000 SQL Databases

SnelStart is a Dutch ISV developing and running a SaaS service for small and large companies, providing online financial services such as bookkeeping, invoicing, and accounting. SnelStart relies on Azure to deliver their service to about 64,000 customers. The underlying infrastructure employs 60,000+ SQL Databases, mainly in elastic pools. Maintaining a massive number of databases at scale and quickly reacting to customer issues, especially in cases of service degradation or an outage, is imperative for SnelStart.

Maintaining such a large number of databases typically requires a large DBA and DevOps team. When a customer calls complaining that the service is slow or unavailable, the SnelStart team needs to quickly identify the issue and act on resolving it. The company had a number of cases where it would find out about performance or availability issues from its customers first. Hoping to improve their response time, SnelStart decided to use Intelligent Insights as a performance-monitoring tool.

SnelStart uses the solution daily to help identify elastic pools hitting their CPU limits. The built-in intelligence helps the company identify the top-consuming databases in overutilized elastic pools, providing suggestions of other elastic pools with sufficient capacity where the hot databases could be moved. As part of their DevOps strategy, SnelStart has created separate elastic pools on each of their logical servers to which they move high-consuming databases. This allows the company to research slow-running queries and other database problems with the help of Azure SQL Analytics until they are repaired. Once the performance of hot databases has improved, SnelStart typically returns them to their original elastic pools.

With the help of Intelligent Insights, SnelStart was also, in one instance, able to quickly identify application queries causing database locking, well before their developers could manually troubleshoot the root cause of the issue. A hotfix for the issue was deployed within minutes, before customers were impacted at scale. Better yet, no customer phone calls were received regarding the service performance.

This was an entirely new type of capability for the company, as service monitoring shifted from reactive to proactive, relying on SQL Database built-in intelligent performance.

“Intelligent Insights proactively finds a database performance problem in a more efficient way and much faster than humans. With it we can proactively help customers until we have a fix for the problem.”

Bauke Stil, Application Manager, SnelStart.

SnelStart was impressed with the efficiency of detecting database performance issues automatically, and at such scale, in a sea of 60,000 SQL Databases. As all of their customer databases have the same structure, the company is also using the solution to identify and resolve common performance issues across their entire database fleet. Once an issue is identified on one database, a hotfix is deployed to all databases, immediately benefiting all SnelStart customers.

Finding a locking issue on a large-scale 35TB SQL Database fleet

Microsoft TFS provides an online service supporting our developer community and customers. It is a complex system handling trillions of lines of code across the customer base. The service provides sophisticated reporting, builds, labs, tests, ticketing, release automation management, and project management, amongst others. Providing a 24/7/365 service globally with top performance is imperative for TFS. When you click to check in your code, or perhaps create a TFS ticket, you expect a prompt response from the service. TFS’s massive infrastructure runs on about 900 SQL Databases with a combined 35TB of data stored.

TFS previously used its own monitoring and alerting solution capable of observing slow queries and sustained high-CPU periods. On one occasion, the existing tools identified a performance issue but provided only a shallow analysis: there was heavy locking on a database, and nothing else. Further troubleshooting indicated the locking was table-scoped and related to lock partitioning. This meant that the affected slow query visible in the stats was typically not the root cause of the locking, but only a surface effect of some other blocking query. The workload in such cases keeps piling up, so upgrading to a larger resource pool could only be a short-lived solution. The only way out was to find the blocking query and fix it.

Troubleshooting cases such as this one is typically very difficult, as it requires considerable DBA skills and is very time-consuming. TFS turned to Intelligent Insights to help automate performance troubleshooting. In this particular case, the system identified an application-related increase in the workload pile-up and provided a list of the affected and blocking queries.

Analysis of the blocking query identified it as a maintenance query scheduled periodically to remove unused attachments. Further analysis of the code determined that the size of the deletion batch was too large, causing the issue. A new query with a reduced batch size was deployed promptly, immediately resolving the heavy locking and gradually relieving the workload pressure.

Maintaining TFS’s demanding infrastructure requires some of the best Microsoft engineers on the job. The bar has surely been met by one of the best SQL performance troubleshooters at Microsoft, Remi Lemarchand, Principal Software Engineer at TFS:

“Intelligent Insights does a great job of finding database performance problems. I find it always right on the spot!”

Remi Lemarchand, Principal Software Engineer, Microsoft TFS.

Remi concludes that he prefers using Intelligent Insights first when troubleshooting database performance issues, before moving on to other tools, due to the depth it provides and the considerable reduction in manual DBA troubleshooting time. Since the fix to the blocking query was applied, the database has been purring along nicely!

Summary

The marriage of AI and Azure SQL Database has resulted in disruptive new intelligent performance capabilities not possible before. Small and medium-sized businesses can now do more with Azure at a large scale with a smaller crew and at a considerably lower cost compared to running infrastructure on their own. Large companies and enterprises can have fewer headaches and relieve pressure on their DBAs and DevOps teams, as SQL Database intelligent performance can help them identify operational issues on tens of thousands of databases in a single day, compared to months in some cases.

To help you start using SQL Database Intelligent Insights for troubleshooting performance issues, see Setup Intelligent Insights with Log Analytics.

For related SQL Database intelligent performance products, see Monitor Azure SQL Databases with Azure SQL Analytics, and Automatic tuning in Azure SQL Database.

Please let us know how you envision SQL Database intelligent performance helping you.

GPUs vs CPUs for Deployment of Deep Learning Models


This blog post was co-authored by Mathew Salvaris, Senior Data Scientist, Azure CAT and Daniel Grecoe, Senior Software Engineer, Azure CAT

Choosing the right type of hardware for deep learning tasks is a widely discussed topic. An obvious conclusion is that the decision should depend on the task at hand and on factors such as throughput requirements and cost. It is widely accepted that GPUs should be used for deep learning training due to their significant speed advantage over CPUs. However, because of their higher cost, for tasks like inference, which are not as resource heavy as training, it is usually believed that CPUs are sufficient and more attractive due to their cost savings. Yet when inference speed is a bottleneck, using GPUs provides considerable gains both financially and in time. In a previous tutorial and blog, Deploying Deep Learning Models on Kubernetes with GPUs, we provide step-by-step instructions to go from loading a pre-trained Convolutional Neural Network model to creating a containerized web application hosted on a Kubernetes cluster with GPUs using Azure Container Service (AKS).

Expanding on this previous work, as a follow up analysis, here we provide a detailed comparison of the deployments of various deep learning models to highlight the striking differences in the throughput performance of GPU versus CPU deployments to provide evidence that, at least in the scenarios tested, GPUs provide better throughput and stability at a lower cost.

In our tests, we use two frameworks, TensorFlow (1.8) and Keras (2.1.6) with a TensorFlow (1.6) backend, for five different models whose network sizes range from small to large as follows:

  • MobileNetV2 (3.4M parameters)
  • NasNetMobile (4.2M parameters)
  • ResNet50 (23.5M parameters)
  • ResNet152 (58.1M parameters)
  • NasNetLarge (84.7M parameters)

We selected these models because we wanted to test a wide range of networks, from small parameter-efficient models such as MobileNet to large networks such as NasNetLarge.

For each of these models, a Docker image with an API for scoring images has been prepared and deployed on four different AKS cluster configurations:

  • 1 node GPU cluster with 1 pod
  • 2 node GPU cluster with 2 pods
  • 3 node GPU cluster with 3 pods
  • 5 node CPU cluster with 35 pods

The GPU clusters were created using Azure NC6 series virtual machines with K80 GPUs, while the CPU cluster was created using D4 v2 virtual machines with 8 cores. The CPU cluster was strategically configured to approximately match the cost of the largest GPU cluster, so that a fair throughput-per-dollar comparison could be made between the 3 node GPU cluster and the 5 node CPU cluster, which was close in cost but slightly more expensive at the time of these tests. For more recent pricing, please use the Azure Virtual Machine pricing calculator.

All clusters were set up in the East US region, and the clients were tested using a test harness in East US and in Western Europe. The purpose was to determine whether testing from different regions had any effect on throughput results. As it turns out, the testing region had some, but very little, influence; therefore the results listed in this analysis include only data from the test client in the East US region, covering a total of 40 different cluster configurations.

The tests were conducted by running an application on an Azure Windows virtual machine in the same region as the deployed scoring service. Using a range of 20-50 concurrent threads, 1,000 images were scored, and the result recorded was the average throughput over the entire set. Actual sustained throughput is expected to be higher in an operationalized service due to the cyclical nature of the tests. The results reported below use the averages from the 50 thread set in the test cycle; the application used to test these configurations can be found on GitHub.
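The core of such a test harness can be sketched in a few lines of Python (the real harness is the one linked on GitHub; the scorer below is a stand-in for an HTTP call to the deployed AKS scoring endpoint):

```python
import concurrent.futures
import time

def run_load_test(score_fn, images, n_threads=50):
    """Score every image with n_threads concurrent workers and
    return (average throughput in images/sec, list of results)."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=n_threads) as pool:
        results = list(pool.map(score_fn, images))
    elapsed = time.perf_counter() - start
    return len(images) / elapsed, results

def fake_score(image):
    """Stand-in scorer: simulates network plus inference latency."""
    time.sleep(0.01)
    return {"image": image, "label": "tabby"}

throughput, results = run_load_test(
    fake_score, [f"img{i}.jpg" for i in range(200)], n_threads=20)
```

Swapping `fake_score` for a real POST to the scoring API reproduces the shape of the test: fixed image set, fixed thread count, throughput averaged over the whole run.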

Scaling of GPU Clusters

The general throughput trend for GPU clusters differs from that of CPU clusters: adding more GPU nodes to the cluster increases throughput roughly linearly. The following graph illustrates the near-linear growth in throughput as more GPUs are added to the clusters for each framework and model tested.

image

Due to management overhead in the cluster, although there is a significant increase in throughput, the increase is not fully proportional to the number of GPUs added and is less than 100 percent per GPU added.

GPU vs CPU results

As stated before, the purpose of the tests is to understand whether deep learning deployments perform significantly better on GPUs, which would translate to reduced financial costs of hosting the model. In the figure below, the GPU clusters are compared to a 5 node CPU cluster with 35 pods for all models and frameworks. Note that the 3 node GPU cluster translated to roughly the same dollar cost per month as the 5 node CPU cluster at the time of these tests.

image

The results suggest that throughput from the GPU clusters is always better than CPU throughput for all models and frameworks, indicating that GPUs are the economical choice for inference of deep learning models. In all cases, the 35 pod CPU cluster was outperformed by at least 186 percent by the single node GPU cluster, and by 415 percent by the similarly priced 3 node GPU cluster. These results are more pronounced for smaller networks such as MobileNetV2, for which the single node GPU cluster performs 392 percent better, and the 3 node GPU cluster 804 percent better, than the CPU cluster with the TensorFlow framework.
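One common reading of such "percent better" figures is relative improvement over the CPU baseline; a quick sketch of that arithmetic (the sample numbers below are illustrative, not measurements from these tests):

```python
def pct_better(gpu_throughput, cpu_throughput):
    """Percent by which GPU throughput exceeds CPU throughput."""
    return 100.0 * (gpu_throughput - cpu_throughput) / cpu_throughput

# Illustrative: a GPU cluster at 30 img/s vs a CPU cluster at 10 img/s
improvement = pct_better(30.0, 10.0)
```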

It is important to note that for standard machine learning models, where the number of parameters is not as high as in deep learning models, CPUs should still be considered more effective and cost efficient. Also, there exist methods to optimize CPU performance, such as MKL-DNN and NNPACK; however, similar methods also exist for GPUs, such as TensorRT. We also found that performance on the GPU clusters was far more consistent than on CPU.

We hypothesize that this is because there is no contention for resources between the model and the web service, as there is in the CPU-only deployment. It can be concluded that for deep learning inference tasks using models with a high number of parameters, GPU-based deployments benefit from the lack of resource contention and provide significantly higher throughput than a CPU cluster of similar cost.

We hope you find this comparison beneficial for your next deployment decision; let us know if you have any questions or comments.

Five options to help actuarial and account processes meet IFRS-17 deadlines


Actuarial compute solutions are what keep insurance companies in business. The legally required reporting can only be done at scale by employing thousands of server cores for workloads like monthly and quarterly valuation and production reporting. With these compute-intensive workloads, actuaries need the elasticity of the cloud. Azure offers a variety of compute options, from large machines with hundreds of cores to thousands of smaller standard machines with fewer cores. This scalability means you can use as much compute power as needed to finish your production runs with enough time to spare to correct an error and rerun before a deadline.

image

Microsoft supports a range of actuarial compute solutions running on Azure, including popular life actuarial partner platforms from Milliman, Willis Towers Watson, FIS, and Moody’s. All of these companies are experienced in both policy reserving and projection modeling. Many life insurance companies today use at least one of these platforms, and some of the largest insurers use parts of all of them. Partner solutions built on Azure provide different options for model creation and modification. The solutions are flexible and support customization of both the model and the cloud environment.

The new regulation, IFRS-17, adds new levels of reporting and controls to the actuarial and accounting processes of every insurance company it covers. These changes require many companies to re-evaluate and upgrade their actuarial applications and platforms. By working with an Azure partner, an insurance company gets an industry-leading cloud platform teamed with top solution providers.

The following is a set of solutions and resources to help you work towards IFRS-17 rollout, as well as moving your actuarial compute to the cloud:

  • Integrate is a solution from Milliman. Milliman anticipates large changes in calculation needs throughout all phases (product development, valuation, cash flow testing, and other projection work) because of IFRS-17, for companies that fall under this new international accounting standard for insurance.
  • RiskAgility FM, from Willis Towers Watson (WTW).  WTW believes that most insurance companies will build onto their existing actuarial modeling system to solve for key values needed to meet the IFRS requirements. RiskAgility from WTW offers the flexibility to wrap around valuation and modeling solutions from other software providers, while adding all needed controls and governance. It also manages the additional data storage and calculation loops required under IFRS-17.
  • Prophet Managed Cloud Service from FIS. The Prophet IFRS-17 suite of solutions are built on the existing Prophet functionality and allows users to meet IFRS requirements while using existing Prophet models and data feeds.  The new components add the needed end-to-end control and reporting functionality with all additional looping and data feeds.
  • RiskIntegrity is a solution from Moody’s Analytics. RiskIntegrity IFRS 17 offers insurers an automated, secure, and flexible solution for managing a complex, data-intensive process and performing key calculations, helping to bridge actuarial and financial reporting functions. It integrates with the Moody’s Analytics AXIS™ actuarial system and is fully compatible with other actuarial systems.
  • Want to learn more about options for moving your actuarial compute and modeling to the cloud, as well as meeting IFRS-17 requirements? Access the actual risk modelling use case, solution guide, and more resources.

I post regularly about new developments on social media. If you would like to follow me, you can find me on Linkedin and Twitter.

Reduce false positives, become more efficient by automating anti-money laundering detection


This blog post was created in partnership with André Burrell who is the Banking & Capital Markets Strategy Leader on the Worldwide Industry team at Microsoft.

image

In our last blog post Anti-money laundering – Microsoft Azure helping banks reduce false positives, we alluded to Microsoft’s high-level approach to a solution—which automates the end-to-end handling of anti-money laundering (AML) detection and management.

  • AML ≠ Anti-fraud. Anti-fraud is immediate identification and halting of transactions.
  • AML pursues the identification of suspected money laundering or other crimes.
  • Failure to have an “adequate Transaction Monitoring System” can result in substantial fines.

Due to the growing number of fines issued, there is now an increased drive to hold compliance officers, senior executives, and board members personally liable for failing to have an adequate AML program and transaction monitoring system (TMS). Any alert generated and not closed by the TMS must be reviewed by a human. Current technologies cannot assess a transaction in context, and without human intervention it is difficult, almost impossible, to adapt to the rapidly evolving patterns used by money launderers or terrorists. We have many partners that address banks’ fraud challenges. Among that elite group, the Behavioral Biometrics solution from BioCatch and the Onfido Identity Verification Solution help automate detection of fraud without adding friction.

Yes or no? The false positive problem

The rules use fixed risk scores for the customer, product, and geography involved. Based on the risk score, different dollar thresholds are applied within the same rule. Aggravating these limitations, data silos (across banks) require significant data manipulation to format the data for use in the TMS, so limited contextual information enters the TMS. The result: TMS systems generate huge numbers of false positive alerts, and each alert must be reviewed by a human investigator within strict timeframes. Most banks experience a false positive rate of about 95-99 percent. This means that only 1 percent to 5 percent of all alerts result in the actual filing of a Suspicious Activity Report (SAR). Yikes!

Banks have generally grown through mergers and acquisitions. As they grow, promises are made to consolidate the different systems; however, in reality, those promises are rarely kept. As banks continue to grow, the problems with siloed systems are exacerbated.

Regulators require each alert to be reviewed within 60 days of generation, and a final determination must be made within 90 days. Within 30 days of determining that a transaction is suspicious, a SAR must be filed. Failure to regularly meet these time frames can be grounds for regulatory action. This time pressure causes additional problems, as the bank must make arbitrary decisions to meet the deadlines.

You’re missing the context

Even in cases where the bank makes the deadline and the rules are working correctly, there is another problem: investigators have inadequate information because they are detecting suspicious activity without additional context.

One type of missing context is the customer’s location. For example, a customer attends a meeting in Germany while the transaction originates in London. Another is social data: social media information can reveal that a customer owns a small business where his or her presence can be confirmed.

The filing of a SAR does not mean the transaction is illegal. However, to avoid huge fines and costs from losses, banks start over-filing SARs. The result: legitimate customers are incorrectly flagged as having engaged in suspicious activity and are subject to closer monitoring. More recently, such customers have become subject to de-risking, i.e. having an account closed due to a perception of high risk. Siloed systems require investigators to access multiple systems to gather information on the customer and their transaction history to determine whether a transaction is suspicious. Additionally, the investigator must create a written record of why the alert was closed.

Banks over-file out of an overabundance of caution and thus weaken the value of SARs to law enforcement—who are also overwhelmed with false positive SARs. These technological and data limitations are well known in the industry. Money launderers and terrorists are also known to exploit these weaknesses.

Figure 1 shows a simple illustration of the problem: 10 cash transactions are logged, each for $9,500, and only one of the ten is illegal. A rule requires the reporting of all cash transactions over $10,000, and a companion rule is intended to identify customers that break up their cash transactions (“structuring”) to avoid that reporting. In Figure 1, the structuring rule flags all cash transactions by a customer between $9,500 and $9,999 over some period.

image

Figure 1

When viewed in full context, only one of the ten transactions is truly suspicious. Existing rule-based systems, lacking context, generate alerts on all ten transactions, so a human must review all ten. In this example, the false positive rate is 90 percent. The size of the problem becomes apparent when you consider that the TMS in a small to medium-sized bank would have 20+ rules, with at least three variations per rule to address different risk-based thresholds.
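The fixed-threshold rule of Figure 1 takes only a few lines to express, which is exactly why it generates so many false positives (an illustrative Python sketch, not a real TMS rule engine):

```python
def structuring_alerts(transactions, low=9_500, high=9_999):
    """Naive context-free rule: flag every cash transaction
    whose amount falls in the [low, high] band."""
    return [t for t in transactions if low <= t["amount"] <= high]

# Ten $9,500 cash transactions; in reality only one is suspicious,
# but the rule alerts on all ten -- a 90 percent false positive rate.
txns = [{"id": i, "amount": 9_500} for i in range(10)]
alerts = structuring_alerts(txns)
```

Because the rule sees only the amount, every legitimate transaction in the band triggers the same alert as the illegal one; adding context (location, customer history, social data) is what would separate them.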

The more complex bank products and services become, the more rules banks will have. Can the rules in a TMS be fine-tuned? Yes, from a technical perspective it is possible to refine the rules, and regulators expect banks to validate the effectiveness of their rules periodically. However, with limited contextual information, banks are reduced to conducting simple above-the-line and below-the-line testing. Each time a bank tunes its rules to reduce the number of false positives, it runs the risk of missing truly suspicious activity and being subject to regulatory action. Result: the blunt-force approach is the status quo. Banks are forced to process the high number of false positives, and the usefulness of SARs to law enforcement is reduced. This makes TMS technology ripe for disruption through the application of machine learning and artificial intelligence (ML/AI).

Reducing false positives demands a 360 degree view for more context

Banks will eventually embrace the cloud across the majority of their technology, but right now they need a true 360 degree view of customers across all the bank’s platforms and systems, based on the bank’s full data sets. When the 360 degree view exists, true customer insights and context become more readily available; risk and compliance can be made more robust while driving down costs; and deep analytics on a bank’s dark data, customer base, and products becomes available. It also lays a strong foundation upon which to build a Know Your Customer (KYC) solution, one that utilizes Microsoft Dynamics (a customer relationship management system) and Microsoft’s mixed reality product, HoloLens, to improve bank employees’ effectiveness and speed to resolution.

Currently, AML investigators typically access three to five or more systems to gather the information needed to investigate an alert. Imagine, through HoloLens, AML investigators being able to see visualizations of networks of transactions and drill down into the data by reaching out and selecting a node; apply predefined Power BI dashboard templates to further interrogate the selected data; keep multiple windows open into other systems; or enable two or more investigators to simultaneously view an investigation and related data to reach a collaborative decision. Such a system would help banks overcome their resistance to keeping customer information in the cloud while simultaneously meeting regulators’ demand for a complete 360 degree view of customers across all of a bank’s platforms, thus enabling better customer insights, compliance, and risk management.

Recommended next steps

To cut through the overwhelming amount of information and provide a path for your company, we just released the Detecting Online and Mobile Fraud with AI use case. Here you can learn more about new attack vectors, the types of fraud emerging every day, and how detection systems need to respond faster than ever by leveraging Azure machine learning and AI solutions.

We would love for you to engage with the author and guest contributor on this topic. Feel free to reach out via the comments below.

Retail brands: gain a competitive advantage with modern data management



Retailers today are responsible for a significant amount of data. As a customer, I expect that the data I provide to a retailer is handled properly following the Payment Card Industry Data Security Standard (PCI DSS), General Data Protection Regulation (GDPR) and other compliance guidelines.

I also expect that the data I give to a retailer is being leveraged in a way that improves my shopping experience. For example, I want better recommendations given my purchase history, my “likes,” and my browsing. I expect the retailer to know my email and payment information for a single click checkout. These are small and trivial frictions that can be eliminated, as long as the data is handled properly with the right permissions. And the capture and analysis of big data (general information aggregated from all customers) is an opportunity for the retailer to fine-tune the overall customer experience.

However, much of the data collected goes unused. This occurs because the infrastructure within an organization is unable to make the data accessible or searchable, so it can’t be used to improve decision-making across the retail value chain. The unused data comes from many sources: mobile devices, digital and physical store shopping, and IoT. Vast amounts of data points are ripe for collection, but the investment and modernization required to effectively manage and leverage the data hasn’t caught up. Today’s barriers to gaining value from data are:

  • Data silos
  • Incongruent data types
  • Performance constraints
  • Rising costs

A healthy data environment is one that enables value to be generated by moving the organization from “what happened?” to predictive and action-oriented insights: “what will happen?” This then leads to the question “what should I do?”, in which predictions can be automated, moving from insight to action and generating new value.

Modern practices allow data to be more efficiently and effectively leveraged in a flexible, extensible and secure way. Deriving value from your data also means that your organization has achieved:

  • Data integration
  • Support for diverse data models
  • Unlimited scale
  • Fully-managed infrastructure
  • Lower total cost of ownership

Chart: Ideal characteristics of a data ecosystem (today’s barriers → tomorrow’s outcomes)

  • Data silos → Data integration
  • Incongruent data types → Support for diverse data models
  • Performance constraints → Unlimited scale
  • Rising costs → Fully-managed infrastructure and lower total cost of ownership (TCO)

In conjunction with leveraging new technology capabilities such as the cloud and AI, organizations must put effort into effective data management. In a way, it should be a prerequisite for modernization and transformation efforts.

A data management model consists of these stages: ingest, prepare, store, analyze, and action. Data can be categorized into three types of data sources: purchased, public, and proprietary data. In legacy environments, data is often siloed and constrained by a variety of factors such as channel, system, or utility. A major goal should be to remove these traditional constraints so that the right data can be leveraged across an organization at scale, securely. Nurturing a healthy data environment is a competitive advantage, especially as an organization’s data science capabilities grow.
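The ingest → prepare → store → analyze → action flow can be expressed as a minimal staged pipeline (an illustrative Python sketch; the stage names follow the model above, while the handlers are hypothetical stand-ins):

```python
STAGES = ["ingest", "prepare", "store", "analyze", "action"]

def run_pipeline(record, handlers):
    """Pass a record through each data management stage in order."""
    for stage in STAGES:
        record = handlers[stage](record)
    return record

# Toy handlers: each stage tags the record with its name, so we can
# verify the record traversed every stage in order.
handlers = {s: (lambda rec, s=s: {**rec, "stages": rec.get("stages", []) + [s]})
            for s in STAGES}
out = run_pipeline({"sku": "A-100"}, handlers)
```

Real stage handlers would do the actual work (pull from a source, validate and reshape, persist, model, trigger a decision), but the ordering constraint is the essence of the model.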

Great data management provides a significant strategic advantage and enables brand differentiation when serving customers. In retail, it doesn’t matter if you are data rich or data poor: if you can’t action and operationalize insights to power and automate decision making, your data management strategy needs to be reexamined.

Recommended next steps


.NET Core September 2018 Update


Today, we are releasing the .NET Core September 2018 Update. This update includes .NET Core 2.1.4 and .NET Core SDK 2.1.402 and contains important reliability fixes.

Security

CVE-2018-8409: .NET Core Denial Of Service Vulnerability
A denial of service vulnerability exists in .NET Core 2.1 when System.IO.Pipelines improperly handles requests. An attacker who successfully exploited this vulnerability could cause a denial of service against an application that is leveraging System.IO.Pipelines. The vulnerability can be exploited remotely, without authentication. A remote unauthenticated attacker could exploit this vulnerability by providing specially crafted requests to the application.

CVE-2018-8409: ASP.NET Core Denial Of Service Vulnerability
A denial of service vulnerability exists in ASP.NET Core 2.1 that improperly handles web requests. An attacker who successfully exploited this vulnerability could cause a denial of service against an ASP.NET Core web application. The vulnerability can be exploited remotely, without authentication. A remote unauthenticated attacker could exploit this vulnerability by providing specially crafted web requests to the ASP.NET Core application.

Getting the Update

The latest .NET Core updates are available on the .NET Core download page. This update is included in Visual Studio 15.8.4, which is also releasing today.

Additional details, such as how to update vulnerable applications, can be found in the ASP.NET Core and .NET Core repo announcements.

See the .NET Core 2.1.4 release notes for details on the release including a detailed commit list.

Docker Images

.NET Docker images have been updated for today’s release. The following repos have been updated.

microsoft/dotnet
microsoft/dotnet-samples

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Note: You must re-pull base images in order to get updates. The Docker client does not pull updates automatically.

Azure App Services deployment

Deployment of .NET Core 2.1.4 to Azure App Services has begun and the West Central US region will be live this morning. Remaining regions will be updated over the next few days and deployment progress can be tracked in this Azure App Service announcement.

Previous .NET Core Updates

The last few .NET Core updates follow:

August 2018 Update
July 2018 Update
June 2018 Update
May 2018 Update

Windows 10 SDK Preview Build 17754 available now!


Today, we released a new Windows 10 Preview Build of the SDK to be used in conjunction with Windows 10 Insider Preview (Build 17754 or greater). The Preview SDK Build 17754 contains bug fixes and under development changes to the API surface area.

The Preview SDK can be downloaded from developer section on Windows Insider.

For feedback and updates to the known issues, please see the developer forum. For new developer feature requests, head over to our Windows Platform UserVoice.

Things to note:

  • This build works in conjunction with previously released SDKs and Visual Studio 2017. You can install this SDK and still continue to submit your apps that target Windows 10 build 1803 or earlier to the Store.
  • The Windows SDK will now formally only be supported by Visual Studio 2017 and greater. You can download Visual Studio 2017 here.
  • This build of the Windows SDK will install on Windows 10 Insider Preview builds and supported Windows operating systems.
  • In order to assist with script access to the SDK, the ISO will also be able to be accessed through the following URL: https://go.microsoft.com/fwlink/?prd=11966&pver=1.0&plcid=0x409&clcid=0x409&ar=Flight&sar=Sdsurl&o1=17754 once the static URL is published.

C++/WinRT Update for build 17709 and beyond:

This update introduces many improvements and fixes for C++/WinRT. Notably, it introduces the ability to build C++/WinRT without any dependency on the Windows SDK. This isn’t particularly interesting to the OS developer, but even in the OS repo it provides benefits because C++/WinRT no longer includes any Windows headers itself, so a developer will typically pull in fewer or no dependencies inadvertently. This also means a dramatic reduction in the number of macros that a C++/WinRT developer must guard against. Removing the dependency on the Windows headers makes C++/WinRT more portable and standards compliant and furthers our efforts to make it a cross-compiler and cross-platform library. It also means that the C++/WinRT headers will never be mangled by macros. If you previously relied on C++/WinRT to include various Windows headers, you will now have to include them yourself. It has always been good practice to explicitly include any headers you depend on and not rely on another library to include them for you.

Highlights

Support get_strong and get_weak to create delegates: This update allows a developer to use either get_strong or get_weak instead of a raw this pointer when creating a delegate pointing to a member function.

Add async cancellation callback: The most frequently requested feature for C++/WinRT’s coroutine support has been the addition of a cancellation callback.

Simplify the use of APIs expecting IBuffer parameters: Although most APIs prefer collections or arrays, enough APIs rely on IBuffer that it should be easier to use such APIs from C++. This update provides direct access to the data behind an IBuffer implementation using the same data naming convention used by the C++ standard library containers. This also avoids colliding with metadata names that conventionally begin with an uppercase letter.

Conformance: Improved support for Clang and Visual C++’s stricter conformance modes.

Improved code gen: Various improvements to reduce code size, improve inlining, and optimize factory caching.

Remove unnecessary recursion: When the command line refers to a folder rather than a specific winmd, cppwinrt will no longer search recursively for winmd files. It causes performance problems in the OS build and can lead to usage errors that are hard to diagnose when developers inadvertently cause cppwinrt to consume more winmds than expected. The cppwinrt compiler also now handles duplicates more intelligently, making it more resilient to user error and poorly-formed winmd files.

Declare both WINRT_CanUnloadNow and WINRT_GetActivationFactory in base.h: Callers don’t need to declare them directly. Their signatures have also changed, amounting to a breaking change. The declarations alleviate most of the pain of this change. The change is necessitated by the fact that C++/WinRT no longer depends on the Windows headers and this change removes the dependency on the types from the Windows headers.

Harden smart pointers: The event revokers didn’t revoke when move-assigned a new value. This led me to take a closer look at the smart pointer classes, and I noticed that they were not reliably handling self-assignment, which is rooted in the com_ptr class template that most of the others rely on. I fixed com_ptr and updated the event revokers to handle move semantics correctly and ensure that they revoke upon assignment. The handle class template has also been hardened by the removal of the implicit constructor that made it easy to write incorrect code. This change also turned bugs in the OS into compiler errors, fixed in this PR.

Breaking Changes

Support for non-WinRT interfaces is disabled by default. To enable, simply #include <unknwn.h> before any C++/WinRT headers.

winrt::get_abi(winrt::hstring) now returns void* instead of HSTRING. Code requiring the HSTRING ABI can simply use a static_cast.

winrt::put_abi(winrt::hstring) returns void** instead of HSTRING*. Code requiring the HSTRING ABI can simply use a reinterpret_cast.

HRESULT is now projected as winrt::hresult. Code requiring an HRESULT can simply static_cast if you need to do type checking or support type traits, but it is otherwise convertible as long as <unknwn.h> is included first.

GUID is now projected as winrt::guid. Code implementing APIs with GUID parameters must use winrt::guid instead, but it is otherwise convertible as long as <unknwn.h> is included first.

The signatures of WINRT_CanUnloadNow and WINRT_GetActivationFactory have changed. Code must not declare these functions at all and should instead include winrt/base.h for their declarations.

The winrt::handle constructor is now explicit. Code assigning a raw handle value must call the attach method instead.

winrt::clock::from_FILETIME has been deprecated. Code should use winrt::clock::from_file_time instead.

What’s New

MSIX Support

It’s finally here! You can now package your applications as MSIX! These applications can be installed and run on any device with build 17682 or later.

To package your application with MSIX, use the MakeAppx tool. To install the application, just click on the MSIX file. To understand more about MSIX, watch this introductory video: link

Feedback and comments are welcome on our MSIX community: http://aka.ms/MSIXCommunity

MSIX is not yet supported by the App Certification Kit or the Microsoft Store.

MC.EXE

We’ve made some important changes to the C/C++ ETW code generation of mc.exe (Message Compiler):

The “-mof” parameter is deprecated. This parameter instructs MC.exe to generate ETW code that is compatible with Windows XP and earlier. Support for the “-mof” parameter will be removed in a future version of mc.exe.

As long as the “-mof” parameter is not used, the generated C/C++ header is now compatible with both kernel-mode and user-mode, regardless of whether “-km” or “-um” was specified on the command line. The header will use the _ETW_KM_ macro to automatically determine whether it is being compiled for kernel-mode or user-mode and will call the appropriate ETW APIs for each mode.

  • The only remaining difference between “-km” and “-um” is that the EventWrite[EventName] macros generated with “-km” have an Activity ID parameter while the EventWrite[EventName] macros generated with “-um” do not have an Activity ID parameter.

The EventWrite[EventName] macros now default to calling EventWriteTransfer (user mode) or EtwWriteTransfer (kernel mode). Previously, the EventWrite[EventName] macros defaulted to calling EventWrite (user mode) or EtwWrite (kernel mode).

  • The generated header now supports several customization macros. For example, you can set the MCGEN_EVENTWRITETRANSFER macro if you need the generated macros to call something other than EventWriteTransfer.
  • The manifest supports new attributes.
    • Event “name”: non-localized event name.
    • Event “attributes”: additional key-value metadata for an event such as filename, line number, component name, function name.
    • Event “tags”: 28-bit value with user-defined semantics (per-event).
    • Field “tags”: 28-bit value with user-defined semantics (per-field – can be applied to “data” or “struct” elements).
  • You can now define “provider traits” in the manifest (e.g. provider group). If provider traits are used in the manifest, the EventRegister[ProviderName] macro will automatically register them.
  • MC will now report an error if a localized message file is missing a string. (Previously MC would silently generate a corrupt message resource.)
  • MC can now generate Unicode (utf-8 or utf-16) output with the “-cp utf-8” or “-cp utf-16” parameters.

Known Issues

The SDK headers are generated with types in the “ABI” namespace. This is done to avoid conflicts with C++/CX and C++/WinRT clients that need to consume types directly at the ABI layer[1]. By default, types emitted by MIDL are *not* put in the ABI namespace; however, this has the potential to introduce conflicts when code consumes both Windows WinRT MIDL generated headers and non-Windows WinRT MIDL generated headers (especially challenging when the non-Windows header references Windows types).

To ensure that developers have a consistent view of the WinRT API surface, validation has been added to the generated headers to ensure that the ABI prefix is consistent between the Windows headers and user generated headers. If you encounter an error like:

5>c:\program files (x86)\windows kits\10\include\10.0.17687.0\winrt\windows.foundation.h(83): error C2220: warning treated as error – no ‘object’ file generated

5>c:\program files (x86)\windows kits\10\include\10.0.17687.0\winrt\windows.foundation.h(83): warning C4005: ‘CHECK_NS_PREFIX_STATE’: macro redefinition

5>g:\<PATH TO YOUR HEADER HERE>(41): note: see previous definition of ‘CHECK_NS_PREFIX_STATE’

It means that some of your MIDL generated headers are inconsistent with the system generated headers.

There are two ways to fix this:

  • Preferred: Compile your IDL file with the /ns_prefix MIDL command line switch. This will cause all your types to be moved to the ABI namespace, consistent with the Windows headers. This may require changes in your code, however.
  • Alternate: Add #define DISABLE_NS_PREFIX_CHECKS before including the Windows headers. This will suppress the validation.

API Updates, Additions and Removals

When targeting new APIs, consider writing your app to be adaptive so that it runs correctly on the widest range of Windows 10 devices. Please see Dynamically detecting features with API contracts for more information.

The lists below cover the APIs that have been added to the platform since the release of build 17134, as well as the APIs that have been removed.

In addition, please note the following removal since build 17744:


namespace Windows.ApplicationModel.DataTransfer {
  public sealed class DataPackagePropertySetView : IIterable<IKeyValuePair<string, object>>, IMapView<string, object> {
    string SourceDisplayName { get; }
  }
}

Additions:

 
namespace Windows.AI.MachineLearning {
  public interface ILearningModelFeatureDescriptor
  public interface ILearningModelFeatureValue
  public interface ILearningModelOperatorProvider
  public sealed class ImageFeatureDescriptor : ILearningModelFeatureDescriptor
  public sealed class ImageFeatureValue : ILearningModelFeatureValue
  public interface ITensor : ILearningModelFeatureValue
  public sealed class LearningModel : IClosable
  public sealed class LearningModelBinding : IIterable<IKeyValuePair<string, object>>, IMapView<string, object>
  public sealed class LearningModelDevice
  public enum LearningModelDeviceKind
  public sealed class LearningModelEvaluationResult
  public enum LearningModelFeatureKind
  public sealed class LearningModelSession : IClosable
  public struct MachineLearningContract
  public sealed class MapFeatureDescriptor : ILearningModelFeatureDescriptor
  public sealed class SequenceFeatureDescriptor : ILearningModelFeatureDescriptor
  public sealed class TensorBoolean : ILearningModelFeatureValue, ITensor
  public sealed class TensorDouble : ILearningModelFeatureValue, ITensor
  public sealed class TensorFeatureDescriptor : ILearningModelFeatureDescriptor
  public sealed class TensorFloat : ILearningModelFeatureValue, ITensor
  public sealed class TensorFloat16Bit : ILearningModelFeatureValue, ITensor
  public sealed class TensorInt16Bit : ILearningModelFeatureValue, ITensor
  public sealed class TensorInt32Bit : ILearningModelFeatureValue, ITensor
  public sealed class TensorInt64Bit : ILearningModelFeatureValue, ITensor
  public sealed class TensorInt8Bit : ILearningModelFeatureValue, ITensor
  public enum TensorKind
  public sealed class TensorString : ILearningModelFeatureValue, ITensor
  public sealed class TensorUInt16Bit : ILearningModelFeatureValue, ITensor
  public sealed class TensorUInt32Bit : ILearningModelFeatureValue, ITensor
  public sealed class TensorUInt64Bit : ILearningModelFeatureValue, ITensor
  public sealed class TensorUInt8Bit : ILearningModelFeatureValue, ITensor
}
namespace Windows.ApplicationModel {
  public sealed class AppInstallerInfo
  public sealed class LimitedAccessFeatureRequestResult
  public static class LimitedAccessFeatures
  public enum LimitedAccessFeatureStatus
  public sealed class Package {
    IAsyncOperation<PackageUpdateAvailabilityResult> CheckUpdateAvailabilityAsync();
    AppInstallerInfo GetAppInstallerInfo();
  }
  public enum PackageUpdateAvailability
  public sealed class PackageUpdateAvailabilityResult
}
namespace Windows.ApplicationModel.Calls {
  public sealed class VoipCallCoordinator {
    IAsyncOperation<VoipPhoneCallResourceReservationStatus> ReserveCallResourcesAsync();
  }
}
namespace Windows.ApplicationModel.Chat {
  public static class ChatCapabilitiesManager {
    public static IAsyncOperation<ChatCapabilities> GetCachedCapabilitiesAsync(string address, string transportId);
    public static IAsyncOperation<ChatCapabilities> GetCapabilitiesFromNetworkAsync(string address, string transportId);
  }
  public static class RcsManager {
    public static event EventHandler<object> TransportListChanged;
  }
}
namespace Windows.ApplicationModel.DataTransfer {
  public static class Clipboard {
    public static event EventHandler<ClipboardHistoryChangedEventArgs> HistoryChanged;
    public static event EventHandler<object> HistoryEnabledChanged;
    public static event EventHandler<object> RoamingEnabledChanged;
    public static bool ClearHistory();
    public static bool DeleteItemFromHistory(ClipboardHistoryItem item);
    public static IAsyncOperation<ClipboardHistoryItemsResult> GetHistoryItemsAsync();
    public static bool IsHistoryEnabled();
    public static bool IsRoamingEnabled();
    public static bool SetContentWithOptions(DataPackage content, ClipboardContentOptions options);
    public static SetHistoryItemAsContentStatus SetHistoryItemAsContent(ClipboardHistoryItem item);
  }
  public sealed class ClipboardContentOptions
  public sealed class ClipboardHistoryChangedEventArgs
  public sealed class ClipboardHistoryItem
  public sealed class ClipboardHistoryItemsResult
  public enum ClipboardHistoryItemsResultStatus
  public sealed class DataPackagePropertySetView : IIterable<IKeyValuePair<string, object>>, IMapView<string, object> {
    bool IsFromRoamingClipboard { get; }
  }
  public enum SetHistoryItemAsContentStatus
}
namespace Windows.ApplicationModel.Store.Preview {
  public enum DeliveryOptimizationDownloadMode
  public enum DeliveryOptimizationDownloadModeSource
  public sealed class DeliveryOptimizationSettings
  public static class StoreConfiguration {
    public static bool IsPinToDesktopSupported();
    public static bool IsPinToStartSupported();
    public static bool IsPinToTaskbarSupported();
    public static void PinToDesktop(string appPackageFamilyName);
    public static void PinToDesktopForUser(User user, string appPackageFamilyName);
  }
}
namespace Windows.ApplicationModel.Store.Preview.InstallControl {
  public enum AppInstallationToastNotificationMode
  public sealed class AppInstallItem {
    AppInstallationToastNotificationMode CompletedInstallToastNotificationMode { get; set; }
    AppInstallationToastNotificationMode InstallInProgressToastNotificationMode { get; set; }
    bool PinToDesktopAfterInstall { get; set; }
    bool PinToStartAfterInstall { get; set; }
    bool PinToTaskbarAfterInstall { get; set; }
  }
  public sealed class AppInstallManager {
    bool CanInstallForAllUsers { get; }
  }
  public sealed class AppInstallOptions {
    string CampaignId { get; set; }
    AppInstallationToastNotificationMode CompletedInstallToastNotificationMode { get; set; }
    string ExtendedCampaignId { get; set; }
    bool InstallForAllUsers { get; set; }
    AppInstallationToastNotificationMode InstallInProgressToastNotificationMode { get; set; }
    bool PinToDesktopAfterInstall { get; set; }
    bool PinToStartAfterInstall { get; set; }
    bool PinToTaskbarAfterInstall { get; set; }
    bool StageButDoNotInstall { get; set; }
  }
  public sealed class AppUpdateOptions {
    bool AutomaticallyDownloadAndInstallUpdateIfFound { get; set; }
  }
}
namespace Windows.ApplicationModel.UserActivities {
  public sealed class UserActivity {
    bool IsRoamable { get; set; }
  }
}
namespace Windows.Data.Text {
  public sealed class TextPredictionGenerator {
    CoreTextInputScope InputScope { get; set; }
    IAsyncOperation<IVectorView<string>> GetCandidatesAsync(string input, uint maxCandidates, TextPredictionOptions predictionOptions, IIterable<string> previousStrings);
    IAsyncOperation<IVectorView<string>> GetNextWordCandidatesAsync(uint maxCandidates, IIterable<string> previousStrings);
  }
  public enum TextPredictionOptions : uint
}
namespace Windows.Devices.Display.Core {
  public sealed class DisplayAdapter
  public enum DisplayBitsPerChannel : uint
  public sealed class DisplayDevice
  public enum DisplayDeviceCapability
  public sealed class DisplayFence
  public sealed class DisplayManager : IClosable
  public sealed class DisplayManagerChangedEventArgs
  public sealed class DisplayManagerDisabledEventArgs
  public sealed class DisplayManagerEnabledEventArgs
  public enum DisplayManagerOptions : uint
  public sealed class DisplayManagerPathsFailedOrInvalidatedEventArgs
  public enum DisplayManagerResult
  public sealed class DisplayManagerResultWithState
  public sealed class DisplayModeInfo
  public enum DisplayModeQueryOptions : uint
  public sealed class DisplayPath
  public enum DisplayPathScaling
  public enum DisplayPathStatus
  public struct DisplayPresentationRate
  public sealed class DisplayPrimaryDescription
  public enum DisplayRotation
  public sealed class DisplayScanout
  public sealed class DisplaySource
  public sealed class DisplayState
  public enum DisplayStateApplyOptions : uint
  public enum DisplayStateFunctionalizeOptions : uint
  public sealed class DisplayStateOperationResult
  public enum DisplayStateOperationStatus
  public sealed class DisplaySurface
  public sealed class DisplayTarget
  public enum DisplayTargetPersistence
  public sealed class DisplayTask
  public sealed class DisplayTaskPool
  public enum DisplayTaskSignalKind
  public sealed class DisplayView
  public sealed class DisplayWireFormat
  public enum DisplayWireFormatColorSpace
  public enum DisplayWireFormatEotf
  public enum DisplayWireFormatHdrMetadata
  public enum DisplayWireFormatPixelEncoding
}
namespace Windows.Devices.Enumeration {
  public enum DeviceInformationKind {
    DevicePanel = 8,
  }
  public sealed class DeviceInformationPairing {
    public static bool TryRegisterForAllInboundPairingRequestsWithProtectionLevel(DevicePairingKinds pairingKindsSupported, DevicePairingProtectionLevel minProtectionLevel);
  }
}
namespace Windows.Devices.Enumeration.Pnp {
  public enum PnpObjectType {
    DevicePanel = 8,
  }
}
namespace Windows.Devices.Lights {
  public sealed class LampArray
  public enum LampArrayKind
  public sealed class LampInfo
  public enum LampPurposes : uint
}
namespace Windows.Devices.Lights.Effects {
  public interface ILampArrayEffect
  public sealed class LampArrayBitmapEffect : ILampArrayEffect
  public sealed class LampArrayBitmapRequestedEventArgs
  public sealed class LampArrayBlinkEffect : ILampArrayEffect
  public sealed class LampArrayColorRampEffect : ILampArrayEffect
  public sealed class LampArrayCustomEffect : ILampArrayEffect
  public enum LampArrayEffectCompletionBehavior
  public sealed class LampArrayEffectPlaylist : IIterable<ILampArrayEffect>, IVectorView<ILampArrayEffect>
  public enum LampArrayEffectStartMode
  public enum LampArrayRepetitionMode
  public sealed class LampArraySolidEffect : ILampArrayEffect
  public sealed class LampArrayUpdateRequestedEventArgs
}
namespace Windows.Devices.PointOfService {
  public sealed class BarcodeScannerCapabilities {
    bool IsVideoPreviewSupported { get; }
  }
  public sealed class ClaimedBarcodeScanner : IClosable {
    event TypedEventHandler<ClaimedBarcodeScanner, ClaimedBarcodeScannerClosedEventArgs> Closed;
  }
  public sealed class ClaimedBarcodeScannerClosedEventArgs
  public sealed class ClaimedCashDrawer : IClosable {
    event TypedEventHandler<ClaimedCashDrawer, ClaimedCashDrawerClosedEventArgs> Closed;
  }
  public sealed class ClaimedCashDrawerClosedEventArgs
  public sealed class ClaimedLineDisplay : IClosable {
    event TypedEventHandler<ClaimedLineDisplay, ClaimedLineDisplayClosedEventArgs> Closed;
  }
  public sealed class ClaimedLineDisplayClosedEventArgs
  public sealed class ClaimedMagneticStripeReader : IClosable {
    event TypedEventHandler<ClaimedMagneticStripeReader, ClaimedMagneticStripeReaderClosedEventArgs> Closed;
  }
  public sealed class ClaimedMagneticStripeReaderClosedEventArgs
  public sealed class ClaimedPosPrinter : IClosable {
    event TypedEventHandler<ClaimedPosPrinter, ClaimedPosPrinterClosedEventArgs> Closed;
  }
  public sealed class ClaimedPosPrinterClosedEventArgs
}
namespace Windows.Devices.PointOfService.Provider {
  public sealed class BarcodeScannerDisableScannerRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerEnableScannerRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerFrameReader : IClosable
  public sealed class BarcodeScannerFrameReaderFrameArrivedEventArgs
  public sealed class BarcodeScannerGetSymbologyAttributesRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerHideVideoPreviewRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerProviderConnection : IClosable {
    IAsyncOperation<BarcodeScannerFrameReader> CreateFrameReaderAsync();
    IAsyncOperation<BarcodeScannerFrameReader> CreateFrameReaderAsync(BitmapPixelFormat preferredFormat);
    IAsyncOperation<BarcodeScannerFrameReader> CreateFrameReaderAsync(BitmapPixelFormat preferredFormat, BitmapSize preferredSize);
  }
  public sealed class BarcodeScannerSetActiveSymbologiesRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerSetSymbologyAttributesRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerStartSoftwareTriggerRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerStopSoftwareTriggerRequest {
    IAsyncAction ReportFailedAsync(int reason);
    IAsyncAction ReportFailedAsync(int reason, string failedReasonDescription);
  }
  public sealed class BarcodeScannerVideoFrame : IClosable
}
namespace Windows.Devices.Sensors {
  public sealed class HingeAngleReading
  public sealed class HingeAngleSensor
  public sealed class HingeAngleSensorReadingChangedEventArgs
  public sealed class SimpleOrientationSensor {
    public static IAsyncOperation<SimpleOrientationSensor> FromIdAsync(string deviceId);
    public static string GetDeviceSelector();
  }
}
namespace Windows.Devices.SmartCards {
  public static class KnownSmartCardAppletIds
  public sealed class SmartCardAppletIdGroup {
    string Description { get; set; }
    IRandomAccessStreamReference Logo { get; set; }
    ValueSet Properties { get; }
    bool SecureUserAuthenticationRequired { get; set; }
  }
  public sealed class SmartCardAppletIdGroupRegistration {
    string SmartCardReaderId { get; }
    IAsyncAction SetPropertiesAsync(ValueSet props);
  }
}
namespace Windows.Devices.WiFi {
  public enum WiFiPhyKind {
    HE = 10,
  }
}
namespace Windows.Foundation {
  public static class GuidHelper
}
namespace Windows.Globalization {
  public static class CurrencyIdentifiers {
    public static string MRU { get; }
    public static string SSP { get; }
    public static string STN { get; }
    public static string VES { get; }
  }
}
namespace Windows.Graphics.Capture {
  public sealed class Direct3D11CaptureFramePool : IClosable {
    public static Direct3D11CaptureFramePool CreateFreeThreaded(IDirect3DDevice device, DirectXPixelFormat pixelFormat, int numberOfBuffers, SizeInt32 size);
  }
  public sealed class GraphicsCaptureItem {
    public static GraphicsCaptureItem CreateFromVisual(Visual visual);
  }
}
namespace Windows.Graphics.Display.Core {
  public enum HdmiDisplayHdrOption {
    DolbyVisionLowLatency = 3,
  }
  public sealed class HdmiDisplayMode {
    bool IsDolbyVisionLowLatencySupported { get; }
  }
}
namespace Windows.Graphics.Holographic {
  public sealed class HolographicCamera {
    bool IsHardwareContentProtectionEnabled { get; set; }
    bool IsHardwareContentProtectionSupported { get; }
  }
  public sealed class HolographicQuadLayerUpdateParameters {
    bool CanAcquireWithHardwareProtection { get; }
    IDirect3DSurface AcquireBufferToUpdateContentWithHardwareProtection();
  }
}
namespace Windows.Graphics.Imaging {
  public sealed class BitmapDecoder : IBitmapFrame, IBitmapFrameWithSoftwareBitmap {
    public static Guid HeifDecoderId { get; }
    public static Guid WebpDecoderId { get; }
  }
  public sealed class BitmapEncoder {
    public static Guid HeifEncoderId { get; }
  }
}
namespace Windows.Management.Deployment {
  public enum DeploymentOptions : uint {
    ForceUpdateFromAnyVersion = (uint)262144,
  }
  public sealed class PackageManager {
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> DeprovisionPackageForAllUsersAsync(string packageFamilyName);
  }
  public enum RemovalOptions : uint {
    RemoveForAllUsers = (uint)524288,
  }
}
namespace Windows.Media.Audio {
  public sealed class CreateAudioDeviceInputNodeResult {
    HResult ExtendedError { get; }
  }
  public sealed class CreateAudioDeviceOutputNodeResult {
    HResult ExtendedError { get; }
  }
  public sealed class CreateAudioFileInputNodeResult {
    HResult ExtendedError { get; }
  }
  public sealed class CreateAudioFileOutputNodeResult {
    HResult ExtendedError { get; }
  }
  public sealed class CreateAudioGraphResult {
    HResult ExtendedError { get; }
  }
  public sealed class CreateMediaSourceAudioInputNodeResult {
    HResult ExtendedError { get; }
  }
  public enum MixedRealitySpatialAudioFormatPolicy
  public sealed class SetDefaultSpatialAudioFormatResult
  public enum SetDefaultSpatialAudioFormatStatus
  public sealed class SpatialAudioDeviceConfiguration
  public sealed class SpatialAudioFormatConfiguration
  public static class SpatialAudioFormatSubtype
}
namespace Windows.Media.Control {
  public sealed class CurrentSessionChangedEventArgs
  public sealed class GlobalSystemMediaTransportControlsSession
  public sealed class GlobalSystemMediaTransportControlsSessionManager
  public sealed class GlobalSystemMediaTransportControlsSessionMediaProperties
  public sealed class GlobalSystemMediaTransportControlsSessionPlaybackControls
  public sealed class GlobalSystemMediaTransportControlsSessionPlaybackInfo
  public enum GlobalSystemMediaTransportControlsSessionPlaybackStatus
  public sealed class GlobalSystemMediaTransportControlsSessionTimelineProperties
  public sealed class MediaPropertiesChangedEventArgs
  public sealed class PlaybackInfoChangedEventArgs
  public sealed class SessionsChangedEventArgs
  public sealed class TimelinePropertiesChangedEventArgs
}
namespace Windows.Media.Core {
  public sealed class MediaStreamSample {
    IDirect3DSurface Direct3D11Surface { get; }
    public static MediaStreamSample CreateFromDirect3D11Surface(IDirect3DSurface surface, TimeSpan timestamp);
  }
}
namespace Windows.Media.Devices.Core {
  public sealed class CameraIntrinsics {
    public CameraIntrinsics(Vector2 focalLength, Vector2 principalPoint, Vector3 radialDistortion, Vector2 tangentialDistortion, uint imageWidth, uint imageHeight);
  }
}
namespace Windows.Media.Import {
  public enum PhotoImportContentTypeFilter {
    ImagesAndVideosFromCameraRoll = 3,
  }
  public sealed class PhotoImportItem {
    string Path { get; }
  }
}
namespace Windows.Media.MediaProperties {
  public sealed class ImageEncodingProperties : IMediaEncodingProperties {
    public static ImageEncodingProperties CreateHeif();
  }
  public static class MediaEncodingSubtypes {
    public static string Heif { get; }
  }
}
namespace Windows.Media.Protection.PlayReady {
  public static class PlayReadyStatics {
    public static IReference<DateTime> HardwareDRMDisabledAtTime { get; }
    public static IReference<DateTime> HardwareDRMDisabledUntilTime { get; }
    public static void ResetHardwareDRMDisabled();
  }
}
namespace Windows.Media.Streaming.Adaptive {
  public enum AdaptiveMediaSourceResourceType {
    MediaSegmentIndex = 5,
  }
}
namespace Windows.Networking.BackgroundTransfer {
  public enum BackgroundTransferPriority {
    Low = 2,
  }
}
namespace Windows.Networking.Connectivity {
  public sealed class ConnectionProfile {
    bool CanDelete { get; }
    IAsyncOperation<ConnectionProfileDeleteStatus> TryDeleteAsync();
  }
  public enum ConnectionProfileDeleteStatus
}
namespace Windows.Networking.NetworkOperators {
  public enum ESimOperationStatus {
    CardGeneralFailure = 13,
    ConfirmationCodeMissing = 14,
    EidMismatch = 18,
    InvalidMatchingId = 15,
    NoCorrespondingRequest = 23,
    NoEligibleProfileForThisDevice = 16,
    OperationAborted = 17,
    OperationProhibitedByProfileClass = 21,
    ProfileNotAvailableForNewBinding = 19,
    ProfileNotPresent = 22,
    ProfileNotReleasedByOperator = 20,
  }
}
namespace Windows.Perception {
  public sealed class PerceptionTimestamp {
    TimeSpan SystemRelativeTargetTime { get; }
  }
  public static class PerceptionTimestampHelper {
    public static PerceptionTimestamp FromSystemRelativeTargetTime(TimeSpan targetTime);
  }
}
namespace Windows.Perception.Spatial {
  public sealed class SpatialAnchorExporter
  public enum SpatialAnchorExportPurpose
  public sealed class SpatialAnchorExportSufficiency
  public sealed class SpatialLocation {
    Vector3 AbsoluteAngularAccelerationAxisAngle { get; }
    Vector3 AbsoluteAngularVelocityAxisAngle { get; }
  }
}
namespace Windows.Perception.Spatial.Preview {
  public static class SpatialGraphInteropPreview
}
namespace Windows.Services.Cortana {
  public sealed class CortanaActionableInsights
  public sealed class CortanaActionableInsightsOptions
}
namespace Windows.Services.Store {
  public sealed class StoreAppLicense {
    bool IsDiscLicense { get; }
  }
  public sealed class StoreContext {
    IAsyncOperation<StoreRateAndReviewResult> RequestRateAndReviewAppAsync();
    IAsyncOperation<IVectorView<StoreQueueItem>> SetInstallOrderForAssociatedStoreQueueItemsAsync(IIterable<StoreQueueItem> items);
  }
  public sealed class StoreQueueItem {
    IAsyncAction CancelInstallAsync();
    IAsyncAction PauseInstallAsync();
    IAsyncAction ResumeInstallAsync();
  }
  public sealed class StoreRateAndReviewResult
  public enum StoreRateAndReviewStatus
}
namespace Windows.Storage.Provider {
  public enum StorageProviderHydrationPolicyModifier : uint {
    AutoDehydrationAllowed = (uint)4,
  }
  public sealed class StorageProviderSyncRootInfo {
    Guid ProviderId { get; set; }
  }
}
namespace Windows.System {
  public sealed class AppUriHandlerHost
  public sealed class AppUriHandlerRegistration
  public sealed class AppUriHandlerRegistrationManager
  public static class Launcher {
    public static IAsyncOperation<bool> LaunchFolderPathAsync(string path);
    public static IAsyncOperation<bool> LaunchFolderPathAsync(string path, FolderLauncherOptions options);
    public static IAsyncOperation<bool> LaunchFolderPathForUserAsync(User user, string path);
    public static IAsyncOperation<bool> LaunchFolderPathForUserAsync(User user, string path, FolderLauncherOptions options);
  }
}
namespace Windows.System.Preview {
  public enum HingeState
  public sealed class TwoPanelHingedDevicePosturePreview
  public sealed class TwoPanelHingedDevicePosturePreviewReading
  public sealed class TwoPanelHingedDevicePosturePreviewReadingChangedEventArgs
}
namespace Windows.System.Profile {
  public enum SystemOutOfBoxExperienceState
  public static class SystemSetupInfo
  public static class WindowsIntegrityPolicy
}
namespace Windows.System.Profile.SystemManufacturers {
  public sealed class SystemSupportDeviceInfo
  public static class SystemSupportInfo {
    public static SystemSupportDeviceInfo LocalDeviceInfo { get; }
  }
}
namespace Windows.System.RemoteSystems {
  public sealed class RemoteSystem {
    IVectorView<RemoteSystemApp> Apps { get; }
  }
  public sealed class RemoteSystemApp
  public sealed class RemoteSystemAppRegistration
  public sealed class RemoteSystemConnectionInfo
  public sealed class RemoteSystemConnectionRequest {
    RemoteSystemApp RemoteSystemApp { get; }
    public static RemoteSystemConnectionRequest CreateForApp(RemoteSystemApp remoteSystemApp);
  }
  public sealed class RemoteSystemWebAccountFilter : IRemoteSystemFilter
}
namespace Windows.System.Update {
  public enum SystemUpdateAttentionRequiredReason
  public sealed class SystemUpdateItem
  public enum SystemUpdateItemState
  public sealed class SystemUpdateLastErrorInfo
  public static class SystemUpdateManager
  public enum SystemUpdateManagerState
  public enum SystemUpdateStartInstallAction
}
namespace Windows.System.UserProfile {
  public sealed class AssignedAccessSettings
}
namespace Windows.UI.Accessibility {
  public sealed class ScreenReaderPositionChangedEventArgs
  public sealed class ScreenReaderService
}
namespace Windows.UI.Composition {
  public enum AnimationPropertyAccessMode
  public sealed class AnimationPropertyInfo : CompositionObject
  public sealed class BooleanKeyFrameAnimation : KeyFrameAnimation
  public class CompositionAnimation : CompositionObject, ICompositionAnimationBase {
    void SetExpressionReferenceParameter(string parameterName, IAnimationObject source);
  }
  public enum CompositionBatchTypes : uint {
    AllAnimations = (uint)5,
    InfiniteAnimation = (uint)4,
  }
  public sealed class CompositionGeometricClip : CompositionClip
  public class CompositionGradientBrush : CompositionBrush {
    CompositionMappingMode MappingMode { get; set; }
  }
  public enum CompositionMappingMode
  public class CompositionObject : IAnimationObject, IClosable {
    void PopulatePropertyInfo(string propertyName, AnimationPropertyInfo propertyInfo);
    public static void StartAnimationGroupWithIAnimationObject(IAnimationObject target, ICompositionAnimationBase animation);
    public static void StartAnimationWithIAnimationObject(IAnimationObject target, string propertyName, CompositionAnimation animation);
  }
  public sealed class Compositor : IClosable {
    BooleanKeyFrameAnimation CreateBooleanKeyFrameAnimation();
    CompositionGeometricClip CreateGeometricClip();
    CompositionGeometricClip CreateGeometricClip(CompositionGeometry geometry);
    RedirectVisual CreateRedirectVisual();
    RedirectVisual CreateRedirectVisual(Visual source);
  }
  public interface IAnimationObject
  public sealed class RedirectVisual : ContainerVisual
}
namespace Windows.UI.Composition.Interactions {
  public sealed class InteractionSourceConfiguration : CompositionObject
  public enum InteractionSourceRedirectionMode
  public sealed class InteractionTracker : CompositionObject {
    bool IsInertiaFromImpulse { get; }
    int TryUpdatePosition(Vector3 value, InteractionTrackerClampingOption option);
    int TryUpdatePositionBy(Vector3 amount, InteractionTrackerClampingOption option);
  }
  public enum InteractionTrackerClampingOption
  public sealed class InteractionTrackerInertiaStateEnteredArgs {
    bool IsInertiaFromImpulse { get; }
  }
  public class VisualInteractionSource : CompositionObject, ICompositionInteractionSource {
    InteractionSourceConfiguration PointerWheelConfig { get; }
  }
}
namespace Windows.UI.Input.Inking {
  public enum HandwritingLineHeight
  public sealed class PenAndInkSettings
  public enum PenHandedness
}
namespace Windows.UI.Input.Inking.Preview {
  public sealed class PalmRejectionDelayZonePreview : IClosable
}
namespace Windows.UI.Notifications {
  public sealed class ScheduledToastNotificationShowingEventArgs
  public sealed class ToastNotifier {
    event TypedEventHandler<ToastNotifier, ScheduledToastNotificationShowingEventArgs> ScheduledToastNotificationShowing;
  }
}
namespace Windows.UI.Shell {
  public enum SecurityAppKind
  public sealed class SecurityAppManager
  public struct SecurityAppManagerContract
  public enum SecurityAppState
  public enum SecurityAppSubstatus
  public sealed class TaskbarManager {
    IAsyncOperation<bool> IsSecondaryTilePinnedAsync(string tileId);
    IAsyncOperation<bool> RequestPinSecondaryTileAsync(SecondaryTile secondaryTile);
    IAsyncOperation<bool> TryUnpinSecondaryTileAsync(string tileId);
  }
}
namespace Windows.UI.StartScreen {
  public sealed class StartScreenManager {
    IAsyncOperation<bool> ContainsSecondaryTileAsync(string tileId);
    IAsyncOperation<bool> TryRemoveSecondaryTileAsync(string tileId);
  }
}
namespace Windows.UI.Text {
  public sealed class RichEditTextDocument : ITextDocument {
    void ClearUndoRedoHistory();
  }
}
namespace Windows.UI.Text.Core {
  public sealed class CoreTextLayoutRequest {
    CoreTextLayoutBounds LayoutBoundsVisualPixels { get; }
  }
}
namespace Windows.UI.ViewManagement {
  public enum ApplicationViewWindowingMode {
    CompactOverlay = 3,
    Maximized = 4,
  }
}
namespace Windows.UI.ViewManagement.Core {
  public sealed class CoreInputView {
    bool TryHide();
    bool TryShow();
    bool TryShow(CoreInputViewKind type);
  }
  public enum CoreInputViewKind
}
namespace Windows.UI.WebUI {
  public sealed class BackgroundActivatedEventArgs : IBackgroundActivatedEventArgs
  public delegate void BackgroundActivatedEventHandler(object sender, IBackgroundActivatedEventArgs eventArgs);
  public sealed class NewWebUIViewCreatedEventArgs
  public static class WebUIApplication {
    public static event BackgroundActivatedEventHandler BackgroundActivated;
    public static event EventHandler<NewWebUIViewCreatedEventArgs> NewWebUIViewCreated;
  }
  public sealed class WebUIView : IWebViewControl, IWebViewControl2
}
namespace Windows.UI.Xaml {
  public class BrushTransition
  public class ColorPaletteResources : ResourceDictionary
  public class DataTemplate : FrameworkTemplate, IElementFactory {
    UIElement GetElement(ElementFactoryGetArgs args);
    void RecycleElement(ElementFactoryRecycleArgs args);
  }
  public sealed class DebugSettings {
    bool FailFastOnErrors { get; set; }
  }
  public sealed class EffectiveViewportChangedEventArgs
  public class ElementFactoryGetArgs
  public class ElementFactoryRecycleArgs
  public class FrameworkElement : UIElement {
    bool IsLoaded { get; }
    event TypedEventHandler<FrameworkElement, EffectiveViewportChangedEventArgs> EffectiveViewportChanged;
    void InvalidateViewport();
  }
  public interface IElementFactory
  public class ScalarTransition
  public class UIElement : DependencyObject, IAnimationObject {
    bool CanBeScrollAnchor { get; set; }
    public static DependencyProperty CanBeScrollAnchorProperty { get; }
    Vector3 CenterPoint { get; set; }
    ScalarTransition OpacityTransition { get; set; }
    float Rotation { get; set; }
    Vector3 RotationAxis { get; set; }
    ScalarTransition RotationTransition { get; set; }
    Vector3 Scale { get; set; }
    Vector3Transition ScaleTransition { get; set; }
    Matrix4x4 TransformMatrix { get; set; }
    Vector3 Translation { get; set; }
    Vector3Transition TranslationTransition { get; set; }
    void PopulatePropertyInfo(string propertyName, AnimationPropertyInfo propertyInfo);
    virtual void PopulatePropertyInfoOverride(string propertyName, AnimationPropertyInfo animationPropertyInfo);
    void StartAnimation(ICompositionAnimationBase animation);
    void StopAnimation(ICompositionAnimationBase animation);
  }
  public class Vector3Transition
  public enum Vector3TransitionComponents : uint
}
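As a rough illustration (not from the original post), the new UIElement animation surface listed above could be exercised like this in a UWP app targeting this SDK; `myButton` is a hypothetical element:

```csharp
// Hedged sketch (UWP, SDK 17754+): "myButton" is a hypothetical UIElement.
// Implicit animation: setting Translation animates through the transition.
myButton.TranslationTransition = new Vector3Transition { Duration = TimeSpan.FromMilliseconds(300) };
myButton.Translation = new System.Numerics.Vector3(0, 40, 0);

// Explicit animation: StartAnimation now works directly on the element,
// without wiring up ElementCompositionPreview visuals first.
var compositor = Window.Current.Compositor;
var scaleUp = compositor.CreateVector3KeyFrameAnimation();
scaleUp.Target = "Scale";
scaleUp.InsertKeyFrame(1.0f, new System.Numerics.Vector3(1.2f, 1.2f, 1f));
myButton.StartAnimation(scaleUp);
```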
namespace Windows.UI.Xaml.Automation {
  public sealed class AutomationElementIdentifiers {
    public static AutomationProperty IsDialogProperty { get; }
  }
  public sealed class AutomationProperties {
    public static DependencyProperty IsDialogProperty { get; }
    public static bool GetIsDialog(DependencyObject element);
    public static void SetIsDialog(DependencyObject element, bool value);
  }
}
namespace Windows.UI.Xaml.Automation.Peers {
  public class AppBarButtonAutomationPeer : ButtonAutomationPeer, IExpandCollapseProvider {
    ExpandCollapseState ExpandCollapseState { get; }
    void Collapse();
    void Expand();
  }
  public class AutomationPeer : DependencyObject {
    bool IsDialog();
    virtual bool IsDialogCore();
  }
  public class MenuBarAutomationPeer : FrameworkElementAutomationPeer
  public class MenuBarItemAutomationPeer : FrameworkElementAutomationPeer, IExpandCollapseProvider, IInvokeProvider
}
namespace Windows.UI.Xaml.Controls {
  public sealed class AnchorRequestedEventArgs
  public class AppBarElementContainer : ContentControl, ICommandBarElement, ICommandBarElement2
  public sealed class AutoSuggestBox : ItemsControl {
    object Description { get; set; }
    public static DependencyProperty DescriptionProperty { get; }
  }
  public enum BackgroundSizing
  public sealed class Border : FrameworkElement {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
    BrushTransition BackgroundTransition { get; set; }
  }
  public class CalendarDatePicker : Control {
    object Description { get; set; }
    public static DependencyProperty DescriptionProperty { get; }
  }
  public class ComboBox : Selector {
    object Description { get; set; }
    public static DependencyProperty DescriptionProperty { get; }
    bool IsEditable { get; set; }
    public static DependencyProperty IsEditableProperty { get; }
    string Text { get; set; }
    Style TextBoxStyle { get; set; }
    public static DependencyProperty TextBoxStyleProperty { get; }
    public static DependencyProperty TextProperty { get; }
    event TypedEventHandler<ComboBox, ComboBoxTextSubmittedEventArgs> TextSubmitted;
  }
  public sealed class ComboBoxTextSubmittedEventArgs
  public class CommandBarFlyout : FlyoutBase
  public class ContentPresenter : FrameworkElement {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
    BrushTransition BackgroundTransition { get; set; }
  }
  public class Control : FrameworkElement {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
    CornerRadius CornerRadius { get; set; }
    public static DependencyProperty CornerRadiusProperty { get; }
  }
  public class DataTemplateSelector : IElementFactory {
    UIElement GetElement(ElementFactoryGetArgs args);
    void RecycleElement(ElementFactoryRecycleArgs args);
  }
  public class DatePicker : Control {
    IReference<DateTime> SelectedDate { get; set; }
    public static DependencyProperty SelectedDateProperty { get; }
    event TypedEventHandler<DatePicker, DatePickerSelectedValueChangedEventArgs> SelectedDateChanged;
  }
  public sealed class DatePickerSelectedValueChangedEventArgs
  public class DropDownButton : Button
  public class DropDownButtonAutomationPeer : ButtonAutomationPeer, IExpandCollapseProvider
  public class Frame : ContentControl, INavigate {
    bool IsNavigationStackEnabled { get; set; }
    public static DependencyProperty IsNavigationStackEnabledProperty { get; }
    bool NavigateToType(TypeName sourcePageType, object parameter, FrameNavigationOptions navigationOptions);
  }
  public class Grid : Panel {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public class IconSourceElement : IconElement
  public interface IScrollAnchorProvider
  public class MenuBar : Control
  public class MenuBarItem : Control
  public class MenuBarItemFlyout : MenuFlyout
  public class NavigationView : ContentControl {
    UIElement ContentOverlay { get; set; }
    public static DependencyProperty ContentOverlayProperty { get; }
    bool IsPaneVisible { get; set; }
    public static DependencyProperty IsPaneVisibleProperty { get; }
    NavigationViewOverflowLabelMode OverflowLabelMode { get; set; }
    public static DependencyProperty OverflowLabelModeProperty { get; }
    UIElement PaneCustomContent { get; set; }
    public static DependencyProperty PaneCustomContentProperty { get; }
    NavigationViewPaneDisplayMode PaneDisplayMode { get; set; }
    public static DependencyProperty PaneDisplayModeProperty { get; }
    UIElement PaneHeader { get; set; }
    public static DependencyProperty PaneHeaderProperty { get; }
    NavigationViewSelectionFollowsFocus SelectionFollowsFocus { get; set; }
    public static DependencyProperty SelectionFollowsFocusProperty { get; }
    NavigationViewShoulderNavigationEnabled ShoulderNavigationEnabled { get; set; }
    public static DependencyProperty ShoulderNavigationEnabledProperty { get; }
    NavigationViewTemplateSettings TemplateSettings { get; }
    public static DependencyProperty TemplateSettingsProperty { get; }
  }
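A hedged sketch of the new NavigationView pane options above; `nav` is a hypothetical NavigationView instance, and the enum members shown (`Top`, `Enabled`) are assumptions about the new enums in this listing:

```csharp
// Hedged sketch (SDK 17754+): "nav" is a hypothetical NavigationView.
nav.PaneDisplayMode = NavigationViewPaneDisplayMode.Top;   // assumed member
nav.IsPaneVisible = true;
nav.SelectionFollowsFocus = NavigationViewSelectionFollowsFocus.Enabled; // assumed member
```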
  public class NavigationViewItem : NavigationViewItemBase {
    bool SelectsOnInvoked { get; set; }
    public static DependencyProperty SelectsOnInvokedProperty { get; }
  }
  public sealed class NavigationViewItemInvokedEventArgs {
    NavigationViewItemBase InvokedItemContainer { get; }
    NavigationTransitionInfo RecommendedNavigationTransitionInfo { get; }
  }
  public enum NavigationViewOverflowLabelMode
  public enum NavigationViewPaneDisplayMode
  public sealed class NavigationViewSelectionChangedEventArgs {
    NavigationTransitionInfo RecommendedNavigationTransitionInfo { get; }
    NavigationViewItemBase SelectedItemContainer { get; }
  }
  public enum NavigationViewSelectionFollowsFocus
  public enum NavigationViewShoulderNavigationEnabled
  public class NavigationViewTemplateSettings : DependencyObject
  public class Panel : FrameworkElement {
    BrushTransition BackgroundTransition { get; set; }
  }
  public sealed class PasswordBox : Control {
    bool CanPasteClipboardContent { get; }
    public static DependencyProperty CanPasteClipboardContentProperty { get; }
    object Description { get; set; }
    public static DependencyProperty DescriptionProperty { get; }
    FlyoutBase SelectionFlyout { get; set; }
    public static DependencyProperty SelectionFlyoutProperty { get; }
    void PasteFromClipboard();
  }
  public class RelativePanel : Panel {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public class RichEditBox : Control {
    object Description { get; set; }
    public static DependencyProperty DescriptionProperty { get; }
    FlyoutBase ProofingMenuFlyout { get; }
    public static DependencyProperty ProofingMenuFlyoutProperty { get; }
    FlyoutBase SelectionFlyout { get; set; }
    public static DependencyProperty SelectionFlyoutProperty { get; }
    RichEditTextDocument TextDocument { get; }
    event TypedEventHandler<RichEditBox, RichEditBoxSelectionChangingEventArgs> SelectionChanging;
  }
  public sealed class RichEditBoxSelectionChangingEventArgs
  public sealed class RichTextBlock : FrameworkElement {
    FlyoutBase SelectionFlyout { get; set; }
    public static DependencyProperty SelectionFlyoutProperty { get; }
    void CopySelectionToClipboard();
  }
  public sealed class ScrollContentPresenter : ContentPresenter {
    bool CanContentRenderOutsideBounds { get; set; }
    public static DependencyProperty CanContentRenderOutsideBoundsProperty { get; }
    bool SizesContentToTemplatedParent { get; set; }
    public static DependencyProperty SizesContentToTemplatedParentProperty { get; }
  }
  public sealed class ScrollViewer : ContentControl, IScrollAnchorProvider {
    bool CanContentRenderOutsideBounds { get; set; }
    public static DependencyProperty CanContentRenderOutsideBoundsProperty { get; }
    UIElement CurrentAnchor { get; }
    double HorizontalAnchorRatio { get; set; }
    public static DependencyProperty HorizontalAnchorRatioProperty { get; }
    bool ReduceViewportForCoreInputViewOcclusions { get; set; }
    public static DependencyProperty ReduceViewportForCoreInputViewOcclusionsProperty { get; }
    double VerticalAnchorRatio { get; set; }
    public static DependencyProperty VerticalAnchorRatioProperty { get; }
    event TypedEventHandler<ScrollViewer, AnchorRequestedEventArgs> AnchorRequested;
    public static bool GetCanContentRenderOutsideBounds(DependencyObject element);
    void RegisterAnchorCandidate(UIElement element);
    public static void SetCanContentRenderOutsideBounds(DependencyObject element, bool canContentRenderOutsideBounds);
    void UnregisterAnchorCandidate(UIElement element);
  }
  public class SplitButton : ContentControl
  public class SplitButtonAutomationPeer : FrameworkElementAutomationPeer, IExpandCollapseProvider, IInvokeProvider
  public sealed class SplitButtonClickEventArgs
  public class StackPanel : Panel, IInsertionPanel, IScrollSnapPointsInfo {
    BackgroundSizing BackgroundSizing { get; set; }
    public static DependencyProperty BackgroundSizingProperty { get; }
  }
  public sealed class TextBlock : FrameworkElement {
    FlyoutBase SelectionFlyout { get; set; }
    public static DependencyProperty SelectionFlyoutProperty { get; }
    void CopySelectionToClipboard();
  }
  public class TextBox : Control {
    bool CanPasteClipboardContent { get; }
    public static DependencyProperty CanPasteClipboardContentProperty { get; }
    bool CanRedo { get; }
    public static DependencyProperty CanRedoProperty { get; }
    bool CanUndo { get; }
    public static DependencyProperty CanUndoProperty { get; }
    object Description { get; set; }
    public static DependencyProperty DescriptionProperty { get; }
    FlyoutBase ProofingMenuFlyout { get; }
    public static DependencyProperty ProofingMenuFlyoutProperty { get; }
    FlyoutBase SelectionFlyout { get; set; }
    public static DependencyProperty SelectionFlyoutProperty { get; }
    event TypedEventHandler<TextBox, TextBoxSelectionChangingEventArgs> SelectionChanging;
    void ClearUndoRedoHistory();
    void CopySelectionToClipboard();
    void CutSelectionToClipboard();
    void PasteFromClipboard();
    void Redo();
    void Undo();
  }
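For illustration, the new TextBox clipboard and undo members above might be used like this; `editor` is a hypothetical TextBox instance:

```csharp
// Hedged sketch (SDK 17754+): "editor" is a hypothetical TextBox.
if (editor.CanUndo)
    editor.Undo();
if (editor.CanPasteClipboardContent)
    editor.PasteFromClipboard();
editor.ClearUndoRedoHistory(); // e.g. after programmatically loading new content
```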
  public sealed class TextBoxSelectionChangingEventArgs
  public class TextCommandBarFlyout : CommandBarFlyout
  public class TimePicker : Control {
    IReference<TimeSpan> SelectedTime { get; set; }
    public static DependencyProperty SelectedTimeProperty { get; }
    event TypedEventHandler<TimePicker, TimePickerSelectedValueChangedEventArgs> SelectedTimeChanged;
  }
  public sealed class TimePickerSelectedValueChangedEventArgs
  public class ToggleSplitButton : SplitButton
  public class ToggleSplitButtonAutomationPeer : FrameworkElementAutomationPeer, IExpandCollapseProvider, IToggleProvider
  public sealed class ToggleSplitButtonIsCheckedChangedEventArgs
  public class ToolTip : ContentControl {
    IReference<Rect> PlacementRect { get; set; }
    public static DependencyProperty PlacementRectProperty { get; }
  }
  public class TreeView : Control {
    bool CanDragItems { get; set; }
    public static DependencyProperty CanDragItemsProperty { get; }
    bool CanReorderItems { get; set; }
    public static DependencyProperty CanReorderItemsProperty { get; }
    Style ItemContainerStyle { get; set; }
    public static DependencyProperty ItemContainerStyleProperty { get; }
    StyleSelector ItemContainerStyleSelector { get; set; }
    public static DependencyProperty ItemContainerStyleSelectorProperty { get; }
    TransitionCollection ItemContainerTransitions { get; set; }
    public static DependencyProperty ItemContainerTransitionsProperty { get; }
    object ItemsSource { get; set; }
    public static DependencyProperty ItemsSourceProperty { get; }
    DataTemplate ItemTemplate { get; set; }
    public static DependencyProperty ItemTemplateProperty { get; }
    DataTemplateSelector ItemTemplateSelector { get; set; }
    public static DependencyProperty ItemTemplateSelectorProperty { get; }
    event TypedEventHandler<TreeView, TreeViewDragItemsCompletedEventArgs> DragItemsCompleted;
    event TypedEventHandler<TreeView, TreeViewDragItemsStartingEventArgs> DragItemsStarting;
    DependencyObject ContainerFromItem(object item);
    DependencyObject ContainerFromNode(TreeViewNode node);
    object ItemFromContainer(DependencyObject container);
    TreeViewNode NodeFromContainer(DependencyObject container);
  }
  public sealed class TreeViewCollapsedEventArgs {
    object Item { get; }
  }
  public sealed class TreeViewDragItemsCompletedEventArgs
  public sealed class TreeViewDragItemsStartingEventArgs
  public sealed class TreeViewExpandingEventArgs {
    object Item { get; }
  }
  public class TreeViewItem : ListViewItem {
    bool HasUnrealizedChildren { get; set; }
    public static DependencyProperty HasUnrealizedChildrenProperty { get; }
    object ItemsSource { get; set; }
    public static DependencyProperty ItemsSourceProperty { get; }
  }
  public sealed class WebView : FrameworkElement {
    event TypedEventHandler<WebView, WebViewWebResourceRequestedEventArgs> WebResourceRequested;
  }
  public sealed class WebViewWebResourceRequestedEventArgs
}
namespace Windows.UI.Xaml.Controls.Maps {
  public enum MapTileAnimationState
  public sealed class MapTileBitmapRequestedEventArgs {
    int FrameIndex { get; }
  }
  public class MapTileSource : DependencyObject {
    MapTileAnimationState AnimationState { get; }
    public static DependencyProperty AnimationStateProperty { get; }
    bool AutoPlay { get; set; }
    public static DependencyProperty AutoPlayProperty { get; }
    int FrameCount { get; set; }
    public static DependencyProperty FrameCountProperty { get; }
    TimeSpan FrameDuration { get; set; }
    public static DependencyProperty FrameDurationProperty { get; }
    void Pause();
    void Play();
    void Stop();
  }
  public sealed class MapTileUriRequestedEventArgs {
    int FrameIndex { get; }
  }
}
namespace Windows.UI.Xaml.Controls.Primitives {
  public class CommandBarFlyoutCommandBar : CommandBar
  public sealed class CommandBarFlyoutCommandBarTemplateSettings : DependencyObject
  public class FlyoutBase : DependencyObject {
    bool AreOpenCloseAnimationsEnabled { get; set; }
    public static DependencyProperty AreOpenCloseAnimationsEnabledProperty { get; }
    bool InputDevicePrefersPrimaryCommands { get; }
    public static DependencyProperty InputDevicePrefersPrimaryCommandsProperty { get; }
    bool IsOpen { get; }
    public static DependencyProperty IsOpenProperty { get; }
    FlyoutShowMode ShowMode { get; set; }
    public static DependencyProperty ShowModeProperty { get; }
    public static DependencyProperty TargetProperty { get; }
    void ShowAt(DependencyObject placementTarget, FlyoutShowOptions showOptions);
  }
  public enum FlyoutPlacementMode {
    Auto = 13,
    BottomEdgeAlignedLeft = 7,
    BottomEdgeAlignedRight = 8,
    LeftEdgeAlignedBottom = 10,
    LeftEdgeAlignedTop = 9,
    RightEdgeAlignedBottom = 12,
    RightEdgeAlignedTop = 11,
    TopEdgeAlignedLeft = 5,
    TopEdgeAlignedRight = 6,
  }
  public enum FlyoutShowMode
  public class FlyoutShowOptions
  public class NavigationViewItemPresenter : ContentControl
}
namespace Windows.UI.Xaml.Core.Direct {
  public interface IXamlDirectObject
  public sealed class XamlDirect
  public struct XamlDirectContract
  public enum XamlEventIndex
  public enum XamlPropertyIndex
  public enum XamlTypeIndex
}
namespace Windows.UI.Xaml.Hosting {
  public class DesktopWindowXamlSource : IClosable
  public sealed class DesktopWindowXamlSourceGotFocusEventArgs
  public sealed class DesktopWindowXamlSourceTakeFocusRequestedEventArgs
  public sealed class WindowsXamlManager : IClosable
  public enum XamlSourceFocusNavigationReason
  public sealed class XamlSourceFocusNavigationRequest
  public sealed class XamlSourceFocusNavigationResult
}
namespace Windows.UI.Xaml.Input {
  public sealed class CanExecuteRequestedEventArgs
  public sealed class ExecuteRequestedEventArgs
  public sealed class FocusManager {
    public static event EventHandler<GettingFocusEventArgs> GettingFocus;
    public static event EventHandler<FocusManagerGotFocusEventArgs> GotFocus;
    public static event EventHandler<LosingFocusEventArgs> LosingFocus;
    public static event EventHandler<FocusManagerLostFocusEventArgs> LostFocus;
  }
  public sealed class FocusManagerGotFocusEventArgs
  public sealed class FocusManagerLostFocusEventArgs
  public sealed class GettingFocusEventArgs : RoutedEventArgs {
    Guid CorrelationId { get; }
  }
  public sealed class LosingFocusEventArgs : RoutedEventArgs {
    Guid CorrelationId { get; }
  }
  public class StandardUICommand : XamlUICommand
  public enum StandardUICommandKind
  public class XamlUICommand : DependencyObject, ICommand
}
namespace Windows.UI.Xaml.Markup {
  public sealed class FullXamlMetadataProviderAttribute : Attribute
  public interface IXamlBindScopeDiagnostics
  public interface IXamlType2 : IXamlType
}
namespace Windows.UI.Xaml.Media {
  public class Brush : DependencyObject, IAnimationObject {
    void PopulatePropertyInfo(string propertyName, AnimationPropertyInfo propertyInfo);
    virtual void PopulatePropertyInfoOverride(string propertyName, AnimationPropertyInfo animationPropertyInfo);
  }
}
namespace Windows.UI.Xaml.Media.Animation {
  public class BasicConnectedAnimationConfiguration : ConnectedAnimationConfiguration
  public sealed class ConnectedAnimation {
    ConnectedAnimationConfiguration Configuration { get; set; }
  }
  public class ConnectedAnimationConfiguration
  public class DirectConnectedAnimationConfiguration : ConnectedAnimationConfiguration
  public class GravityConnectedAnimationConfiguration : ConnectedAnimationConfiguration
  public enum SlideNavigationTransitionEffect
  public sealed class SlideNavigationTransitionInfo : NavigationTransitionInfo {
    SlideNavigationTransitionEffect Effect { get; set; }
    public static DependencyProperty EffectProperty { get; }
  }
}
namespace Windows.UI.Xaml.Navigation {
  public class FrameNavigationOptions
}
namespace Windows.Web.UI {
  public interface IWebViewControl2
  public sealed class WebViewControlNewWindowRequestedEventArgs {
    IWebViewControl NewWindow { get; set; }
    Deferral GetDeferral();
  }
  public enum WebViewControlPermissionType {
    ImmersiveView = 6,
  }
}
namespace Windows.Web.UI.Interop {
  public sealed class WebViewControl : IWebViewControl, IWebViewControl2 {
    event TypedEventHandler<WebViewControl, object> GotFocus;
    event TypedEventHandler<WebViewControl, object> LostFocus;
    void AddInitializeScript(string script);
  }
}

Removals:


namespace Windows.Gaming.UI {
  public sealed class GameMonitor
  public enum GameMonitoringPermission
}

Additional Store Win32 APIs:

The following APIs have been added to the Application Certification Kit list.

  • CM_Get_Device_Interface_List_SizeA
  • CM_Get_Device_Interface_List_SizeW
  • CM_Get_Device_Interface_ListA
  • CM_Get_Device_Interface_ListW
  • CM_MapCrToWin32Err
  • CM_Register_Notification
  • CM_Unregister_Notification
  • CoDecrementMTAUsage
  • CoIncrementMTAUsage
  • ColorAdapterGetCurrentProfileCalibration
  • ColorAdapterGetDisplayCurrentStateID
  • ColorAdapterGetDisplayProfile
  • ColorAdapterGetDisplayTargetWhitePoint
  • ColorAdapterGetDisplayTransformData
  • ColorAdapterGetSystemModifyWhitePointCaps
  • ColorAdapterRegisterOEMColorService
  • ColorAdapterUnregisterOEMColorService
  • ColorProfileAddDisplayAssociation
  • ColorProfileGetDisplayDefault
  • ColorProfileGetDisplayList
  • ColorProfileGetDisplayUserScope
  • ColorProfileRemoveDisplayAssociation
  • ColorProfileSetDisplayDefaultAssociation
  • DeviceIoControl
  • EventEnabled
  • EventProviderEnabled
  • ExitProcess
  • GetProductInfo
  • GetSystemTimes
  • InstallColorProfileA
  • InstallColorProfileW
  • LocalFileTimeToLocalSystemTime
  • LocalSystemTimeToLocalFileTime
  • MLCreateOperatorRegistry
  • RoIsApiContractMajorVersionPresent
  • RoIsApiContractPresent
  • RtlVirtualUnwind
  • SetProcessValidCallTargetsForMappedView
  • sqlite3_value_nochange
  • sqlite3_vtab_collation
  • sqlite3_vtab_nochange
  • sqlite3_win32_set_directory
  • sqlite3_win32_set_directory16
  • sqlite3_win32_set_directory8
  • ubiditransform_close
  • ubiditransform_open
  • ubiditransform_transform
  • ubrk_getBinaryRules
  • ubrk_openBinaryRules
  • unum_formatDoubleForFields
  • uplrules_getKeywords
  • uspoof_check2
  • uspoof_check2UTF8
  • uspoof_closeCheckResult
  • uspoof_getCheckResultChecks
  • uspoof_getCheckResultNumerics
  • uspoof_getCheckResultRestrictionLevel
  • uspoof_openCheckResult

The post Windows 10 SDK Preview Build 17754 available now! appeared first on Windows Developer Blog.

.NET Framework September 2018 Security and Quality Rollup


Today, we are releasing the September 2018 Security and Quality Rollup.

Security

CVE-2018-8421 – Windows Remote Code Execution Vulnerability

This security update resolves a vulnerability in Microsoft .NET Framework that could allow remote code execution when .NET Framework processes untrusted input. An attacker who successfully exploits this vulnerability in software by using .NET Framework could take control of an affected system. The attacker could then install programs; view, change, or delete data; or create new accounts that have full user rights. Users whose accounts are configured to have fewer user rights on the system could be less affected than users who operate with administrative user rights.

To exploit the vulnerability, an attacker would first have to convince the user to open a malicious document or application.

This security update addresses the vulnerability by correcting how .NET Framework validates untrusted input.

CVE-2018-8421

Getting the Update

The Security and Quality Rollup is available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

The following table is for Windows 10 and Windows Server 2016+.

Product Version                                        Security and Quality Rollup KB
Windows 10 1803 (April 2018 Update)                    Catalog 4457128
    .NET Framework 3.5                                 4457128
    .NET Framework 4.7.2                               4457128
Windows 10 1709 (Fall Creators Update)                 Catalog 4457142
    .NET Framework 3.5                                 4457142
    .NET Framework 4.7.1, 4.7.2                        4457142
Windows 10 1703 (Creators Update)                      Catalog 4457138
    .NET Framework 3.5                                 4457138
    .NET Framework 4.7, 4.7.1, 4.7.2                   4457138
Windows 10 1607 (Anniversary Update),
Windows Server 2016                                    Catalog 4457131
    .NET Framework 3.5                                 4457131
    .NET Framework 4.6.2, 4.7, 4.7.1, 4.7.2            4457131
Windows 10 1507                                        Catalog 4457132
    .NET Framework 3.5                                 4457132
    .NET Framework 4.6, 4.6.1, 4.6.2                   4457132

The following table is for earlier Windows and Windows Server versions.

Product Version                                        Security and Quality Rollup KB   Security Only Update KB
Windows 8.1, Windows RT 8.1,
Windows Server 2012 R2                                 Catalog 4457920                  Catalog 4457916
    .NET Framework 3.5                                 4457045                          4457056
    .NET Framework 4.5.2                               4457036                          4457028
    .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2    4457034                      4457026
Windows Server 2012                                    Catalog 4457919                  Catalog 4457915
    .NET Framework 3.5                                 4457042                          4457053
    .NET Framework 4.5.2                               4457037                          4457029
    .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2    4457033                      4457025
Windows 7, Windows Server 2008 R2                      Catalog 4457918                  Catalog 4457914
    .NET Framework 3.5.1                               4457044                          4457055
    .NET Framework 4.5.2                               4457038                          4457030
    .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2    4457035                      4457027
Windows Server 2008                                    Catalog 4457921                  Catalog 4457917
    .NET Framework 2.0, 3.0                            4457043                          4457054
    .NET Framework 4.5.2                               4457038                          4457030
    .NET Framework 4.6                                 4457035                          4457027

Docker Images

We are updating the following .NET Framework Docker images for today’s release:

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

Video: R and Python in Azure HDInsight


Azure HDInsight was recently updated with version 9.3 of ML Services, which provides integration with R and Python. In particular, it makes it possible to run R and Python within HDInsight's managed Spark instance.

In the episode of Data Exposed embedded below, you can learn more about ML Services 9.3 in HDInsight and see some demos of the R and Python integration.

Channel 9: Introducing ML Services 9.3 in Azure HDInsight

Visual Studio 2017 version 15.9 Preview 2


Today, we are releasing the second preview of Visual Studio 2017 version 15.9, and it can be downloaded here. This latest preview contains new features and improvements to Universal Windows Platform development, C++ debugging, and export installation settings. Read more in the feature highlight summary below and check out the Visual Studio 2017 version 15.9 Preview 2 release notes for more details on all the new, exciting features contained in this Preview.

Universal Windows Platform Development

In Visual Studio 2017 version 15.9 Preview 2, we have a combination of new features and product enhancements for Universal Windows Platform developers. Read below to see what’s new!

Latest Windows 10 Insider Preview SDK: The most recent Windows 10 Insider Preview SDK (build 17754) is now included as an optional component for the Universal Windows Platform development workload. UWP developers can install this SDK to access the latest APIs available in the upcoming release of Windows 10.

Visual Studio Installer showing Windows 10 Preview SDK as optional

MSIX Package Creation: You can now create .MSIX packages in Visual Studio either through the Universal Windows Platform packaging tooling or through the Windows Application Packaging Project template. To generate a .MSIX package, set your Minimum Version to the latest Windows 10 Insider Preview SDK (build 17754), and then follow the standard package creation workflow in Visual Studio. To learn more about .MSIX packaging, check out this Build talk.
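For illustration, the Minimum Version requirement maps to the packaging project's minimum target platform. A hypothetical Windows Application Packaging Project (.wapproj) fragment might look like this; the property names follow the standard packaging project schema, and the exact values depend on your project:

```xml
<!-- Hypothetical .wapproj fragment; values are illustrative. -->
<PropertyGroup>
  <TargetPlatformVersion>10.0.17754.0</TargetPlatformVersion>
  <!-- .MSIX generation requires the minimum version to be the 17754 SDK -->
  <TargetPlatformMinVersion>10.0.17754.0</TargetPlatformMinVersion>
</PropertyGroup>
```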

F5 Performance Improvements: We have optimized our F5 (build and deploy) tooling for faster developer productivity with UWP development. These improvements will be most noticeable for developers who target a remote PC using Windows authentication, but all other deployment types should see improved performance as well.

UWP XAML Designer Reliability: Developers building UWP applications that target Fall Creators Update (build 16299) or higher should notice fewer XAML designer crashes. Specifically, when the designer tries to render controls that throw catchable exceptions, the rendered control will be replaced with a fallback control rather than crashing the design surface. These fallback controls have a yellow border (shown below) to indicate that the control has been replaced.

Fallback controls shown with yellow border

Step Back in the C++ Debugger

In Preview 2, we’ve added “Step Back” for C++ developers targeting Windows. With this feature, you can now return to a previous state while debugging without having to restart the entire process. It’s installed as part of the C++ workload but turned off by default. To enable it, go to Tools -> Options -> IntelliTrace and select the “IntelliTrace snapshots” option. This enables snapshots for both managed and native code.

Tools Options Dialog showing IntelliTrace snapshots option

Once “Step Back” is enabled, you will see snapshots appear in the Events tab of the Diagnostic Tools Window when you are stepping through C++ code.

Diagnostic Tools Window showing snapshots in the Events tab when Step Back is enabled while stepping through C++ code.

Clicking on any event will take you back to its respective snapshot. Or, you can simply use the Step Backward button on the debug command bar to go back in time. You can see “Step Back” in action alongside “Step Over” in the gif below.

Gif showing Step Over and Step Backward

Acquisition

We’ve made it easier to keep your installation settings consistent across multiple installations of Visual Studio. You can now use the Visual Studio Installer to export a .vsconfig file for a given instance of Visual Studio. This file will contain information about the workloads and components you have installed. You can then import this file to add your workload and component selections to a new or existing installation of Visual Studio.
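
For reference, a .vsconfig file is a small JSON document listing component IDs. A minimal sketch might look like the following (the exact set of component IDs depends on what you have installed; the two shown here are illustrative):

```json
{
  "version": "1.0",
  "components": [
    "Microsoft.VisualStudio.Component.CoreEditor",
    "Microsoft.VisualStudio.Workload.ManagedDesktop"
  ]
}
```

You can import a file like this through the Visual Studio Installer UI when adding or modifying an installation, which applies the listed workloads and components on top of the current selection.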

Dialog showing option to export a vsconfig file and also import it to add your workload configuration to a new or existing Visual Studio installation

Try out the Preview

If you’re not familiar with Visual Studio Previews, take a moment to read the Visual Studio 2017 Release Rhythm. Remember that Visual Studio 2017 Previews can be installed side-by-side with other versions of Visual Studio and other installs of Visual Studio 2017 without adversely affecting either your machine or your productivity. Previews provide an opportunity for you to receive fixes faster and try out upcoming functionality before they become mainstream. Similarly, the Previews enable the Visual Studio engineering team to validate usage, incorporate suggestions, and detect flaws earlier in the development process. We are highly responsive to feedback coming in through the Previews and look forward to hearing from you.

Please get the Visual Studio Preview today, exercise your favorite workloads, and tell us what you think. If you have an Azure subscription, you can provision a virtual machine with this preview installed. You can report issues to us via the Report a Problem tool in Visual Studio, or you can share a suggestion on UserVoice. You’ll be able to track your issues in the Visual Studio Developer Community, where you can ask questions and find answers. You can also engage with us and other Visual Studio developers through our Visual Studio conversation in the Gitter community (requires a GitHub account). Thank you for using the Visual Studio Previews.

Angel Zhou Program Manager, Visual Studio

Angel Zhou is a program manager on the Visual Studio release engineering team, which is responsible for making Visual Studio releases available to our customers around the world.
