
Announcing Azure Private Link


Customers love the scale of Azure, which gives them the ability to expand across the globe while remaining highly available. With the rapidly growing adoption of Azure, the need to access data and services privately and securely from customers’ networks has grown exponentially. To help with this, we’re announcing the preview of Azure Private Link.

Azure Private Link is a secure and scalable way for Azure customers to consume Azure Services like Azure Storage or SQL, Microsoft Partner Services or their own services privately from their Azure Virtual Network (VNet). The technology is based on a provider and consumer model where the provider and the consumer are both hosted in Azure. A connection is established using a consent-based call flow and once established, all data that flows between the service provider and service consumer is isolated from the internet and stays on the Microsoft network. There is no need for gateways, network address translation (NAT) devices, or public IP addresses to communicate with the service.

Azure Private Link brings Azure services inside the customer’s private VNet. The service resources can be accessed using the private IP address just like any other resource in the VNet. This significantly simplifies the network configuration by keeping access rules private.

Diagram showing the Private Link topology. Starting from services attached to Private Link, then linked and made available in the customer VNet through a Private Endpoint.

Today we would like to highlight a few key use cases made possible by Azure Private Link:

Private connectivity to Azure PaaS services

Multi-tenant shared services such as Azure Storage and Azure SQL Database are outside your VNet and have been reachable only via the public interface. Today, you can secure this connection using VNet service endpoints, which keep the traffic within the Microsoft backbone network and allow the PaaS resource to be locked down to just your VNet. However, the PaaS endpoint is still served over a public IP address and therefore not reachable from on-premises through Azure ExpressRoute private peering or VPN gateway. With today’s announcement of Azure Private Link, you can simply create a private endpoint in your VNet and map it to your PaaS resource (your Azure Storage account blob or SQL Database server). These resources are then accessible over a private IP address in your VNet, enabling connectivity from on-premises through Azure ExpressRoute private peering and/or VPN gateway, and keeping the network configuration simple by not opening it up to public IP addresses.
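Under the hood, a private endpoint is an Azure Resource Manager resource of type Microsoft.Network/privateEndpoints. As a rough sketch of the mapping described above, the following Python builds the kind of request body involved; the subscription ID, resource group, and resource names are placeholders, not real resources.

```python
# Sketch: ARM request body for a private endpoint that maps a VNet subnet
# to the "blob" sub-resource of a Storage account. All names and IDs below
# are placeholders for illustration only.

SUB = "00000000-0000-0000-0000-000000000000"  # placeholder subscription id

def private_endpoint_body(subnet_id: str, target_id: str, group_id: str) -> dict:
    """Body for PUT .../providers/Microsoft.Network/privateEndpoints/{name}."""
    return {
        "location": "westus2",
        "properties": {
            # The subnet in which the private IP address is allocated.
            "subnet": {"id": subnet_id},
            "privateLinkServiceConnections": [{
                "name": "storage-connection",
                "properties": {
                    # The PaaS resource being mapped into the VNet.
                    "privateLinkServiceId": target_id,
                    # The sub-resource to expose, e.g. "blob" for Storage.
                    "groupIds": [group_id],
                },
            }],
        },
    }

subnet = (f"/subscriptions/{SUB}/resourceGroups/rg/providers/"
          "Microsoft.Network/virtualNetworks/vnet/subnets/default")
storage = (f"/subscriptions/{SUB}/resourceGroups/rg/providers/"
           "Microsoft.Storage/storageAccounts/mystorageacct")
body = private_endpoint_body(subnet, storage, "blob")
print(body["properties"]["privateLinkServiceConnections"][0]["properties"]["groupIds"])
```

The same mapping can be created through the Azure portal or CLI; note that only a subnet reference and the target resource are needed, with no route table or NAT changes.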

Private connectivity to your own service

This new offering is not limited to Azure PaaS services; you can leverage it for your own service as well. Today, as a service provider in Azure, you have to make your service accessible over a public interface (IP address) in order for it to be accessible to other consumers running in Azure. You could use VNet peering and connect to the consumer’s VNet to make it private, but that is not scalable and will soon run into IP address conflicts. With today’s announcement, you can run your service completely privately in your own VNet behind an Azure Standard Load Balancer, enable it for Azure Private Link, and allow it to be accessed by consumers running in a different VNet, subscription, or Azure Active Directory (AD) tenant, all with a few clicks and an approval call flow. As a service consumer, all you have to do is create a private endpoint in your own VNet and consume the Azure Private Link service completely privately, without opening your access control lists (ACLs) to any public IP address space.

Before and after diagram, showing services traditionally exposed through public IPs vs exposed privately in the VNet  through Private Link

Private connectivity to SaaS service

Many Microsoft partners already offer software-as-a-service (SaaS) solutions to Azure customers today. These solutions are offered over public endpoints, and to consume them, Azure customers must open their private networks to the public internet. Customers have long asked to consume these SaaS solutions within their private networks, as if they were deployed right inside them. With Azure Private Link, we’re extending the private connectivity experience to Microsoft partners. This is a very powerful mechanism for Microsoft partners to reach Azure customers, and we’re confident that many future Azure Marketplace offerings will be made available through Azure Private Link.

Key highlights of Azure Private Link

  • Private on-premises access: Since PaaS resources are mapped to private IP addresses in the customer’s VNet, they can be accessed via Azure ExpressRoute private peering. This effectively means that the data will traverse a fully private path from on-premises to Azure. The configuration in the corporate firewalls and route tables can be simplified to allow access only to the private IP addresses.
  • Data exfiltration protection: Azure Private Link is unique in that it maps a specific PaaS resource to a private IP address, as opposed to mapping an entire service as other cloud providers do. This means that any malicious attempt to exfiltrate data to a different account using the same private endpoint will fail, providing built-in data exfiltration protection.
  • Simple to set up: Azure Private Link is simple to set up, with minimal networking configuration needed. Connectivity works on an approval call flow, and once a PaaS resource is mapped to a private endpoint, the connectivity works out of the box without any additional configuration of route tables or Azure Network Security Groups (NSGs).

Azure portal experience for Private Link.

  • Overlapping address space: Traditionally, customers use VNet peering as the mechanism to connect multiple VNets. VNet peering requires the VNets to have non-overlapping address spaces. In enterprise use cases, it is common to find networks with overlapping IP address space. Azure Private Link provides an alternative way to privately connect applications in different VNets that have overlapping IP address space.

Simple diagram illustrating services on VMs in one VNet being exposed to users in another VNet across private IP space with Azure Private Link.
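The non-overlap constraint that makes VNet peering impractical here is easy to check with Python's standard ipaddress module; a minimal sketch:

```python
import ipaddress

def overlaps(cidr_a: str, cidr_b: str) -> bool:
    """True if two address spaces overlap (which VNet peering disallows)."""
    return ipaddress.ip_network(cidr_a).overlaps(ipaddress.ip_network(cidr_b))

# Two VNets that both use 10.0.0.0/16 cannot be peered, but Private Link
# can still connect a service between them through a private endpoint.
print(overlaps("10.0.0.0/16", "10.0.0.0/16"))    # True  -> peering blocked
print(overlaps("10.0.0.0/16", "172.16.0.0/12"))  # False -> peering possible
```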

Roadmap

Today, we’re announcing the Azure Private Link preview in a limited set of regions, and we will be expanding to more regions in the near future. In the coming months, we will also be adding more Azure PaaS services to Azure Private Link, including Azure Cosmos DB, Azure MySQL, Azure PostgreSQL, Azure MariaDB, Azure App Service, and Azure Key Vault, as well as partner services.

We encourage you to try out the Azure Private Link preview and look forward to hearing and incorporating your feedback. Please refer to the documentation for additional details.


SAP on Azure Architecture – Designing for security


This blog post was contributed to by Chin Lai The, Technical Specialist, SAP on Azure.

This is the first in a four-part blog series on designing a great SAP on Azure Architecture, and will focus on designing for security.

Great SAP on Azure Architectures are built on the pillars of security, performance and scalability, availability and recoverability, and efficiency and operations.

Microsoft investments in Azure Security

Microsoft invests $1 billion annually in security research and development and employs 3,500 security professionals across the company. Advanced AI is leveraged to analyze 6.5 trillion global signals from the Microsoft cloud platforms and to detect and respond to threats. Enterprise-grade security and privacy are built into the Azure platform, including ongoing, rigorous validation through real-world tests such as Red Team exercises. These tests enable Microsoft to test breach detection and response and to accurately measure readiness for and impacts of real-world attacks, and they are just one of the many operational processes that provide best-in-class security for Azure.

Azure is the platform of trust, with 90 compliance certifications spanning nations, regions, and specific industries such as health, finance, government, and manufacturing. Moreover, Azure Security and Compliance Blueprints can be used to easily create, deploy, and update your compliant environments.

Security – a shared responsibility

It’s important to understand the shared responsibility model between you as a customer and Microsoft. The division of responsibility is dependent on the cloud model used - SaaS, PaaS, or IaaS. As a customer, you are always responsible for your data, endpoints, account/access management, irrespective of the chosen cloud deployment.

SAP on Azure is delivered using the IaaS cloud model, which means security protections are built into the service by Microsoft at the physical datacenter, physical network, and physical host levels. However, for all areas beyond the Azure hypervisor, i.e. the operating systems and applications, customers need to ensure their enterprise security controls are implemented.

A diagram looking at the responsibilities of the customer versus the service they are using.

Key security considerations for deploying SAP on Azure

Role-based access control & resource locking

Role-based access control (RBAC) is an authorization system which provides fine-grained access for the management of Azure resources. RBAC can be used to limit access and control permissions on Azure resources for the various teams within your IT operations.

For example, SAP basis team members can be permissioned to deploy virtual machines (VMs) into Azure virtual networks (VNets) but restricted from creating or configuring VNets. Conversely, members of the networking team can create and configure VNets but are prohibited from deploying or configuring VMs in VNets where SAP applications are running.
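The split of duties above can be pictured as role definitions scoped to resource types. The sketch below uses real Azure RBAC operation strings but a deliberately simplified toy evaluator; the team names are made up for illustration.

```python
# Toy model of the RBAC split described above. The operation strings are
# real Azure resource provider actions; the evaluator is a simplification
# of Azure's role-definition / role-assignment machinery.

ROLE_ACTIONS = {
    "sap-basis":   {"Microsoft.Compute/virtualMachines/write"},  # deploy VMs
    "network-ops": {"Microsoft.Network/virtualNetworks/write"},  # manage VNets
}

def is_authorized(team: str, action: str) -> bool:
    """Allow an action only if the team's role grants it."""
    return action in ROLE_ACTIONS.get(team, set())

print(is_authorized("sap-basis", "Microsoft.Compute/virtualMachines/write"))  # True
print(is_authorized("sap-basis", "Microsoft.Network/virtualNetworks/write"))  # False
```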

We recommend validating and testing the RBAC design early during the lifecycle of your SAP on Azure project.

Another important consideration is Azure resource locking, which can be used to prevent accidental deletion or modification of Azure resources such as VMs and disks. It is recommended to create the required Azure resources at the start of your SAP project. When all additions, moves, and changes are finished and the SAP on Azure deployment is operational, all resources can be locked. Thereafter, only a super administrator can unlock a resource and permit it (such as a VM) to be modified.

Secure authentication

Single-sign-on (SSO) provides the foundation for integrating SAP and Microsoft products, and for years Kerberos tokens from Microsoft Active Directory have been enabling this capability for both SAP GUI and web-browser based applications when combined with third party security products.

When a user logs onto their workstation and successfully authenticates against Microsoft Active Directory, they are issued a Kerberos token. The token can then be used by a third-party security product to handle authentication to the SAP application without the user having to re-authenticate. Additionally, data in transit from the user’s front end to the SAP application can be encrypted by integrating the security product with secure network communications (SNC) for DIAG (SAP GUI) and RFC, and SPNEGO for HTTPS.

Azure Active Directory (Azure AD) with SAML 2.0 can also be used to provide SSO to a range of SAP applications and platforms such as SAP NetWeaver, SAP HANA and the SAP Cloud Platform.

This video demonstrates the end-to-end enablement of SSO between Azure AD and SAP NetWeaver.

Protecting your application and data from network vulnerabilities

Network security groups (NSGs) contain a list of security rules that allow or deny network traffic to resources within your Azure VNet. NSGs can be associated with subnets or with individual network interfaces attached to VMs. Security rules can be configured based on source/destination, port, and protocol.

NSGs influence network traffic for the SAP system. In the diagram below, three subnets are implemented, each with an NSG assigned: FE (front end), App, and DB.

  1. A public internet user can reach the SAP Web Dispatcher over port 443.
  2. The SAP Web Dispatcher can reach the SAP application server over port 443.
  3. The App subnet accepts traffic on port 443 from 10.0.0.0/24.
  4. The SAP application server sends traffic on port 30015 to the SAP DB server.
  5. The DB subnet accepts traffic on port 30015 from 10.0.1.0/24.
  6. Public internet access is blocked on both the App subnet and the DB subnet.

A diagram showing three subnets with NSGs influencing network traffic.
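The numbered flow above can be modeled as ordered allow/deny rules evaluated first-match-wins, as NSG priorities are. A minimal sketch, assuming the subnet prefixes FE 10.0.0.0/24 and App 10.0.1.0/24 from the diagram (the rule semantics are simplified):

```python
import ipaddress

# Minimal model of the NSG rules behind the numbered flow above.
# Each rule is (action, source prefix, destination port); None = any port.
APP_NSG = [
    ("Allow", "10.0.0.0/24", 443),    # 3. App accepts 443 from the FE subnet
    ("Deny",  "0.0.0.0/0",   None),   # 6. everything else, incl. internet
]
DB_NSG = [
    ("Allow", "10.0.1.0/24", 30015),  # 5. DB accepts 30015 from the App subnet
    ("Deny",  "0.0.0.0/0",   None),   # 6. everything else
]

def evaluate(rules, src_ip: str, port: int) -> str:
    """First matching rule wins, like NSG rule priorities."""
    src = ipaddress.ip_address(src_ip)
    for action, prefix, rule_port in rules:
        if src in ipaddress.ip_network(prefix) and rule_port in (None, port):
            return action
    return "Deny"  # implicit deny

print(evaluate(APP_NSG, "10.0.0.4", 443))       # Allow: Web Dispatcher -> App
print(evaluate(DB_NSG,  "10.0.1.4", 30015))     # Allow: App server -> DB
print(evaluate(DB_NSG,  "203.0.113.9", 30015))  # Deny: internet -> DB
```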

SAP deployments using the Azure Virtual Datacenter architecture are implemented using a hub-and-spoke model. The hub VNet is the central point for connectivity, where an Azure Firewall or another type of network virtual appliance (NVA) is implemented to inspect and control the routing of traffic to the spoke VNets where your SAP applications reside.

Within your SAP on Azure project, it is recommended to validate that inspection devices and NSG security rules are working as desired; this ensures that your SAP resources are appropriately shielded against network vulnerabilities.

Maintaining data integrity through encryption methods

Azure Storage service encryption is enabled by default on your Azure Storage account and cannot be disabled. Customer data at rest on Azure Storage is therefore secured by default, with data encrypted and decrypted transparently using 256-bit AES. The encryption process has no impact on Azure Storage performance and incurs no extra cost. You can have Microsoft manage the encryption keys, or you can manage your own keys with Azure Key Vault. Azure Key Vault can also be used to manage the SSL/TLS certificates that secure interfaces and internal communications within the SAP system.

Azure also offers virtual machine disk encryption using BitLocker for Windows and DM-Crypt for Linux to provide volume encryption for virtual machine operating system and data disks. Disk encryption is not enabled by default.

Our recommended approach to encrypting your SAP data at rest is as follows:

  • Azure Disk Encryption for SAP Application servers – operating system disk and data disks.
  • Azure Disk Encryption for SAP Database servers – operating system disks and those data disks not used by the DBMS.
  • SAP Database servers - leverage Transparent Data Encryption offered by the DBMS provider to secure your data and log files and to ensure the backups are also encrypted.

Hardening the operating system

Security is a shared responsibility between Microsoft and you as a customer where your customer specific security controls need to be applied to the operating system, database, and the SAP application layer. For example, you need to ensure the operating system is hardened to eradicate vulnerabilities which could lead to attacks on the SAP database.

Windows, SUSE Linux, Red Hat Linux, and others are supported for running SAP applications on Azure, and various images of these operating systems are available in the Azure Marketplace. You can further harden these images to comply with the security policies of your enterprise and with the guidance in the Center for Internet Security (CIS) Microsoft Azure Foundations Benchmark.

Enterprises generally have operational processes in place for updating and patching their IT software, including the operating system. Once an operating system vulnerability has been disclosed, it is published in security advisories and usually remediated quickly. Operating system vendors regularly provide security updates and patches. You can use the Update Management solution in Azure Automation to manage operating system updates for your Windows and Linux VMs in Azure. A best practice is to install security updates for the operating system on a regular cadence and to install other updates, such as new features, during maintenance windows.

Learn more

Within this blog we have touched upon a selection of security topics as they relate to deploying SAP on Azure. Incorporating solid security practices will lead to a secure SAP deployment on Azure.

Azure Security Center is the place to learn about best practices for securing and monitoring your Azure deployments. Please also read the Azure Security Center technical documentation, along with Azure Sentinel, to understand how to detect vulnerabilities, generate alerts when exploits occur, and get guidance on remediation.

In the second blog in our series, we will cover designing for performance and scalability.

An image of the Security Center Overview page.


Windows 10 SDK Preview Build 18980 available now!


Today, we released a new Windows 10 Preview Build of the SDK to be used in conjunction with Windows 10 Insider Preview (Build 18980 or greater). The Preview SDK Build 18980 contains bug fixes and under-development changes to the API surface area.

The Preview SDK can be downloaded from the developer section on Windows Insider.

For feedback and updates to the known issues, please see the developer forum. For new developer feature requests, head over to our Windows Platform UserVoice.

Things to note:

  • This build works in conjunction with previously released SDKs and Visual Studio 2017 and 2019. You can install this SDK and still also continue to submit your apps that target Windows 10 build 1903 or earlier to the Microsoft Store.
  • The Windows SDK will now formally only be supported by Visual Studio 2017 and greater. You can download Visual Studio 2019 here.
  • This build of the Windows SDK will install only on Windows 10 Insider Preview builds.
  • To assist with script access to the SDK, the ISO can also be accessed through the following static URL: https://software-download.microsoft.com/download/sg/Windows_InsiderPreview_SDK_en-us_18980_1.iso.

Tools Updates:

Message Compiler (mc.exe)

  • Now detects the Unicode byte order mark (BOM) in .mc files. If the .mc file starts with a UTF-8 BOM, it will be read as a UTF-8 file. Otherwise, if it starts with a UTF-16LE BOM, it will be read as a UTF-16LE file. If the -u parameter was specified, it will be read as a UTF-16LE file. Otherwise, it will be read using the current code page (CP_ACP).
  • Now avoids one-definition-rule (ODR) problems in MC-generated C/C++ ETW helpers caused by conflicting configuration macros (e.g. when two .cpp files with conflicting definitions of MCGEN_EVENTWRITETRANSFER are linked into the same binary, the MC-generated ETW helpers will now respect the definition of MCGEN_EVENTWRITETRANSFER in each .cpp file instead of arbitrarily picking one or the other).
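The detection order described in the first bullet can be sketched in Python (using `mbcs`, the Windows ANSI code page codec, as a stand-in for CP_ACP):

```python
import codecs

def detect_mc_encoding(data: bytes, u_flag: bool = False) -> str:
    """Encoding choice mirroring the mc.exe behavior described above:
    UTF-8 BOM -> UTF-8; UTF-16LE BOM -> UTF-16LE; -u flag -> UTF-16LE;
    otherwise the current ANSI code page (CP_ACP)."""
    if data.startswith(codecs.BOM_UTF8):
        return "utf-8-sig"      # consume the BOM while decoding
    if data.startswith(codecs.BOM_UTF16_LE):
        return "utf-16-le"
    if u_flag:
        return "utf-16-le"      # -u forces UTF-16LE for BOM-less files
    return "mbcs"               # CP_ACP; Windows-only codec, named as a stand-in

print(detect_mc_encoding(codecs.BOM_UTF8 + b"MessageId=1"))  # utf-8-sig
print(detect_mc_encoding(b"MessageId=1", u_flag=True))       # utf-16-le
```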

Windows Trace Preprocessor (tracewpp.exe)

  • Now supports Unicode input (.ini, .tpl, and source code) files. Input files starting with a UTF-8 or UTF-16 byte order mark (BOM) will be read as Unicode. Input files that do not start with a BOM will be read using the current code page (CP_ACP). For backwards-compatibility, if the -UnicodeIgnore command-line parameter is specified, files starting with a UTF-16 BOM will be treated as empty.
  • Now supports Unicode output (.tmh) files. By default, output files will be encoded using the current code page (CP_ACP). Use command-line parameters -cp:UTF-8 or -cp:UTF-16 to generate Unicode output files.
  • Behavior change: tracewpp now converts all input text to Unicode, performs processing in Unicode, and converts output text to the specified output encoding. Earlier versions of tracewpp avoided Unicode conversions and performed text processing assuming a single-byte character set. This may lead to behavior changes in cases where the input files do not conform to the current code page. In cases where this is a problem, consider converting the input files to UTF-8 (with BOM) and/or using the -cp:UTF-8 command-line parameter to avoid encoding ambiguity.
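A rough Python model of the input and output encoding rules described above; `cp1252` stands in for the current code page (CP_ACP), and the file handling is simplified to bytes in, string out.

```python
import codecs

def read_tracewpp_input(data: bytes, unicode_ignore: bool = False) -> str:
    """Input handling as described above: BOM'd files are read as Unicode;
    files without a BOM use the current code page; with -UnicodeIgnore,
    UTF-16 BOM files are treated as empty for backwards compatibility."""
    if data.startswith(codecs.BOM_UTF16_LE) or data.startswith(codecs.BOM_UTF16_BE):
        return "" if unicode_ignore else data.decode("utf-16")
    if data.startswith(codecs.BOM_UTF8):
        return data.decode("utf-8-sig")
    return data.decode("cp1252")  # stand-in for CP_ACP

def write_tmh(text: str, cp: str = "cp1252") -> bytes:
    """Output defaults to the code page; -cp:UTF-8 / -cp:UTF-16 pick Unicode."""
    return text.encode(cp)

src = "FUNC=DoTrace".encode("utf-16")  # Python's utf-16 codec emits a BOM
print(read_tracewpp_input(src))                       # 'FUNC=DoTrace'
print(read_tracewpp_input(src, unicode_ignore=True))  # ''
```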

TraceLoggingProvider.h

  • Now avoids one-definition-rule (ODR) problems caused by conflicting configuration macros (e.g. when two .cpp files with conflicting definitions of TLG_EVENT_WRITE_TRANSFER are linked into the same binary, the TraceLoggingProvider.h helpers will now respect the definition of TLG_EVENT_WRITE_TRANSFER in each .cpp file instead of arbitrarily picking one or the other).
  • In C++ code, the TraceLoggingWrite macro has been updated to enable better code sharing between similar events using variadic templates.

Signing your apps with Device Guard Signing

  • We are making it easier for you to sign your app. Device Guard signing is a Device Guard feature that is available in Microsoft Store for Business and Education. Signing allows enterprises to guarantee every app comes from a trusted source. Our goal is to make signing your MSIX package easier. Documentation on Device Guard Signing can be found here: https://docs.microsoft.com/windows/msix/package/signing-package-device-guard-signing

Breaking Changes:

Removal of api-ms-win-net-isolation-l1-1-0.lib

In this release api-ms-win-net-isolation-l1-1-0.lib has been removed from the Windows SDK. Apps that were linking against api-ms-win-net-isolation-l1-1-0.lib can switch to OneCoreUAP.lib as a replacement.

Removal of IRPROPS.LIB

In this release irprops.lib has been removed from the Windows SDK. Apps that were linking against irprops.lib can switch to bthprops.lib as a drop-in replacement.

API Updates, Additions and Removals:

The following APIs have been added to the platform since the release of Windows 10 SDK, version 1903, build 18362.

Additions

 

namespace Windows.AI.MachineLearning {
  public sealed class LearningModelSessionOptions {
    bool CloseModelOnSessionCreation { get; set; }
  }
}
namespace Windows.ApplicationModel {
  public sealed class AppInfo {
    public static AppInfo Current { get; }
    Package Package { get; }
    public static AppInfo GetFromAppUserModelId(string appUserModelId);
    public static AppInfo GetFromAppUserModelIdForUser(User user, string appUserModelId);
  }
  public interface IAppInfoStatics
  public sealed class Package {
    StorageFolder EffectiveExternalLocation { get; }
    string EffectiveExternalPath { get; }
    string EffectivePath { get; }
    string InstalledPath { get; }
    bool IsStub { get; }
    StorageFolder MachineExternalLocation { get; }
    string MachineExternalPath { get; }
    string MutablePath { get; }
    StorageFolder UserExternalLocation { get; }
    string UserExternalPath { get; }
    IVectorView<AppListEntry> GetAppListEntries();
    RandomAccessStreamReference GetLogoAsRandomAccessStreamReference(Size size);
  }
}
namespace Windows.ApplicationModel.AppService {
  public enum AppServiceConnectionStatus {
    AuthenticationError = 8,
    DisabledByPolicy = 10,
    NetworkNotAvailable = 9,
    WebServiceUnavailable = 11,
  }
  public enum AppServiceResponseStatus {
    AppUnavailable = 6,
    AuthenticationError = 7,
    DisabledByPolicy = 9,
    NetworkNotAvailable = 8,
    WebServiceUnavailable = 10,
  }
  public enum StatelessAppServiceResponseStatus {
    AuthenticationError = 11,
    DisabledByPolicy = 13,
    NetworkNotAvailable = 12,
    WebServiceUnavailable = 14,
  }
}
namespace Windows.ApplicationModel.Background {
  public sealed class BackgroundTaskBuilder {
    void SetTaskEntryPointClsid(Guid TaskEntryPoint);
  }
  public sealed class BluetoothLEAdvertisementPublisherTrigger : IBackgroundTrigger {
    bool IncludeTransmitPowerLevel { get; set; }
    bool IsAnonymous { get; set; }
    IReference<short> PreferredTransmitPowerLevelInDBm { get; set; }
    bool UseExtendedFormat { get; set; }
  }
  public sealed class BluetoothLEAdvertisementWatcherTrigger : IBackgroundTrigger {
    bool AllowExtendedAdvertisements { get; set; }
  }
}
namespace Windows.ApplicationModel.ConversationalAgent {
  public sealed class ActivationSignalDetectionConfiguration
  public enum ActivationSignalDetectionTrainingDataFormat
  public sealed class ActivationSignalDetector
  public enum ActivationSignalDetectorKind
  public enum ActivationSignalDetectorPowerState
  public sealed class ConversationalAgentDetectorManager
  public sealed class DetectionConfigurationAvailabilityChangedEventArgs
  public enum DetectionConfigurationAvailabilityChangeKind
  public sealed class DetectionConfigurationAvailabilityInfo
  public enum DetectionConfigurationTrainingStatus
}
namespace Windows.ApplicationModel.DataTransfer {
  public sealed class DataPackage {
    event TypedEventHandler<DataPackage, object> ShareCanceled;
  }
}
namespace Windows.Devices.Bluetooth {
  public sealed class BluetoothAdapter {
    bool IsExtendedAdvertisingSupported { get; }
    uint MaxAdvertisementDataLength { get; }
  }
}
namespace Windows.Devices.Bluetooth.Advertisement {
  public sealed class BluetoothLEAdvertisementPublisher {
    bool IncludeTransmitPowerLevel { get; set; }
    bool IsAnonymous { get; set; }
    IReference<short> PreferredTransmitPowerLevelInDBm { get; set; }
    bool UseExtendedAdvertisement { get; set; }
  }
  public sealed class BluetoothLEAdvertisementPublisherStatusChangedEventArgs {
    IReference<short> SelectedTransmitPowerLevelInDBm { get; }
  }
  public sealed class BluetoothLEAdvertisementReceivedEventArgs {
    BluetoothAddressType BluetoothAddressType { get; }
    bool IsAnonymous { get; }
    bool IsConnectable { get; }
    bool IsDirected { get; }
    bool IsScannable { get; }
    bool IsScanResponse { get; }
    IReference<short> TransmitPowerLevelInDBm { get; }
  }
  public enum BluetoothLEAdvertisementType {
    Extended = 5,
  }
  public sealed class BluetoothLEAdvertisementWatcher {
    bool AllowExtendedAdvertisements { get; set; }
  }
  public enum BluetoothLEScanningMode {
    None = 2,
  }
}
namespace Windows.Devices.Bluetooth.Background {
  public sealed class BluetoothLEAdvertisementPublisherTriggerDetails {
    IReference<short> SelectedTransmitPowerLevelInDBm { get; }
  }
}
namespace Windows.Devices.Display {
  public sealed class DisplayMonitor {
    bool IsDolbyVisionSupportedInHdrMode { get; }
  }
}
namespace Windows.Devices.Input {
  public sealed class PenButtonListener
  public sealed class PenDockedEventArgs
  public sealed class PenDockListener
  public sealed class PenTailButtonClickedEventArgs
  public sealed class PenTailButtonDoubleClickedEventArgs
  public sealed class PenTailButtonLongPressedEventArgs
  public sealed class PenUndockedEventArgs
}
namespace Windows.Devices.Sensors {
  public sealed class Accelerometer {
    AccelerometerDataThreshold ReportThreshold { get; }
  }
  public sealed class AccelerometerDataThreshold
  public sealed class Barometer {
    BarometerDataThreshold ReportThreshold { get; }
  }
  public sealed class BarometerDataThreshold
  public sealed class Compass {
    CompassDataThreshold ReportThreshold { get; }
  }
  public sealed class CompassDataThreshold
  public sealed class Gyrometer {
    GyrometerDataThreshold ReportThreshold { get; }
  }
  public sealed class GyrometerDataThreshold
  public sealed class Inclinometer {
    InclinometerDataThreshold ReportThreshold { get; }
  }
  public sealed class InclinometerDataThreshold
  public sealed class LightSensor {
    LightSensorDataThreshold ReportThreshold { get; }
  }
  public sealed class LightSensorDataThreshold
  public sealed class Magnetometer {
    MagnetometerDataThreshold ReportThreshold { get; }
  }
  public sealed class MagnetometerDataThreshold
}
namespace Windows.Foundation.Metadata {
  public sealed class AttributeNameAttribute : Attribute
  public sealed class FastAbiAttribute : Attribute
  public sealed class NoExceptionAttribute : Attribute
}
namespace Windows.Globalization {
  public sealed class Language {
    string AbbreviatedName { get; }
    public static IVector<string> GetMuiCompatibleLanguageListFromLanguageTags(IIterable<string> languageTags);
  }
}
namespace Windows.Graphics.Capture {
  public sealed class GraphicsCaptureSession : IClosable {
    bool IsCursorCaptureEnabled { get; set; }
  }
}
namespace Windows.Graphics.DirectX {
  public enum DirectXPixelFormat {
    SamplerFeedbackMinMipOpaque = 189,
    SamplerFeedbackMipRegionUsedOpaque = 190,
  }
}
namespace Windows.Graphics.Holographic {
  public sealed class HolographicFrame {
    HolographicFrameId Id { get; }
  }
  public struct HolographicFrameId
  public sealed class HolographicFrameRenderingReport
  public sealed class HolographicFrameScanoutMonitor : IClosable
  public sealed class HolographicFrameScanoutReport
  public sealed class HolographicSpace {
    HolographicFrameScanoutMonitor CreateFrameScanoutMonitor(uint maxQueuedReports);
  }
}
namespace Windows.Management.Deployment {
  public sealed class AddPackageOptions
  public enum DeploymentOptions : uint {
    StageInPlace = (uint)4194304,
  }
  public sealed class PackageManager {
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> AddPackageByUriAsync(Uri packageUri, AddPackageOptions options);
    IIterable<Package> FindProvisionedPackages();
    PackageStubPreference GetPackageStubPreference(string packageFamilyName);
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> RegisterPackageByNameAsync(string name, RegisterPackageOptions options);
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> RegisterPackageByUriAsync(Uri manifestUri, RegisterPackageOptions options);
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> RegisterPackagesByFullNameAsync(IIterable<string> packageFullNames, DeploymentOptions deploymentOptions);
    void SetPackageStubPreference(string packageFamilyName, PackageStubPreference useStub);
    IAsyncOperationWithProgress<DeploymentResult, DeploymentProgress> StagePackageByUriAsync(Uri packageUri, StagePackageOptions options);
  }
  public enum PackageStubPreference
  public enum PackageTypes : uint {
    All = (uint)4294967295,
  }
  public sealed class RegisterPackageOptions
  public enum RemovalOptions : uint {
    PreserveRoamableApplicationData = (uint)128,
  }
  public sealed class StagePackageOptions
  public enum StubPackageOption
}
namespace Windows.Media.Audio {
  public sealed class AudioPlaybackConnection : IClosable
  public sealed class AudioPlaybackConnectionOpenResult
  public enum AudioPlaybackConnectionOpenResultStatus
  public enum AudioPlaybackConnectionState
}
namespace Windows.Media.Capture {
  public sealed class MediaCapture : IClosable {
    MediaCaptureRelativePanelWatcher CreateRelativePanelWatcher(StreamingCaptureMode captureMode, DisplayRegion displayRegion);
  }
  public sealed class MediaCaptureInitializationSettings {
    Uri DeviceUri { get; set; }
    PasswordCredential DeviceUriPasswordCredential { get; set; }
  }
  public sealed class MediaCaptureRelativePanelWatcher : IClosable
}
namespace Windows.Media.Capture.Frames {
  public sealed class MediaFrameSourceInfo {
    Panel GetRelativePanel(DisplayRegion displayRegion);
  }
}
namespace Windows.Media.Devices {
  public sealed class PanelBasedOptimizationControl
}
namespace Windows.Media.MediaProperties {
  public static class MediaEncodingSubtypes {
    public static string Pgs { get; }
    public static string Srt { get; }
    public static string Ssa { get; }
    public static string VobSub { get; }
  }
  public sealed class TimedMetadataEncodingProperties : IMediaEncodingProperties {
    public static TimedMetadataEncodingProperties CreatePgs();
    public static TimedMetadataEncodingProperties CreateSrt();
    public static TimedMetadataEncodingProperties CreateSsa(byte[] formatUserData);
    public static TimedMetadataEncodingProperties CreateVobSub(byte[] formatUserData);
  }
}
namespace Windows.Networking.BackgroundTransfer {
  public sealed class DownloadOperation : IBackgroundTransferOperation, IBackgroundTransferOperationPriority {
    void RemoveRequestHeader(string headerName);
    void SetRequestHeader(string headerName, string headerValue);
  }
  public sealed class UploadOperation : IBackgroundTransferOperation, IBackgroundTransferOperationPriority {
    void RemoveRequestHeader(string headerName);
    void SetRequestHeader(string headerName, string headerValue);
  }
}
namespace Windows.Networking.Connectivity {
  public enum NetworkAuthenticationType {
    Owe = 12,
  }
}
namespace Windows.Networking.NetworkOperators {
  public sealed class NetworkOperatorTetheringAccessPointConfiguration {
    TetheringWiFiBand Band { get; set; }
    bool IsBandSupported(TetheringWiFiBand band);
    IAsyncOperation<bool> IsBandSupportedAsync(TetheringWiFiBand band);
  }
  public sealed class NetworkOperatorTetheringManager {
    public static void DisableNoConnectionsTimeout();
    public static IAsyncAction DisableNoConnectionsTimeoutAsync();
    public static void EnableNoConnectionsTimeout();
    public static IAsyncAction EnableNoConnectionsTimeoutAsync();
    public static bool IsNoConnectionsTimeoutEnabled();
  }
  public enum TetheringWiFiBand
}
namespace Windows.Networking.PushNotifications {
  public static class PushNotificationChannelManager {
    public static event EventHandler<PushNotificationChannelsRevokedEventArgs> ChannelsRevoked;
  }
  public sealed class PushNotificationChannelsRevokedEventArgs
  public sealed class RawNotification {
    IBuffer ContentBytes { get; }
  }
}
namespace Windows.Security.Authentication.Web.Core {
  public sealed class WebAccountMonitor {
    event TypedEventHandler<WebAccountMonitor, WebAccountEventArgs> AccountPictureUpdated;
  }
}
namespace Windows.Security.Isolation {
  public sealed class IsolatedWindowsEnvironment
  public enum IsolatedWindowsEnvironmentActivator
  public enum IsolatedWindowsEnvironmentAllowedClipboardFormats : uint
  public enum IsolatedWindowsEnvironmentAvailablePrinters : uint
  public enum IsolatedWindowsEnvironmentClipboardCopyPasteDirections : uint
  public struct IsolatedWindowsEnvironmentContract
  public struct IsolatedWindowsEnvironmentCreateProgress
  public sealed class IsolatedWindowsEnvironmentCreateResult
  public enum IsolatedWindowsEnvironmentCreateStatus
  public sealed class IsolatedWindowsEnvironmentFile
  public static class IsolatedWindowsEnvironmentHost
  public enum IsolatedWindowsEnvironmentHostError
  public sealed class IsolatedWindowsEnvironmentLaunchFileResult
  public enum IsolatedWindowsEnvironmentLaunchFileStatus
  public sealed class IsolatedWindowsEnvironmentOptions
  public static class IsolatedWindowsEnvironmentOwnerRegistration
  public sealed class IsolatedWindowsEnvironmentOwnerRegistrationData
  public sealed class IsolatedWindowsEnvironmentOwnerRegistrationResult
  public enum IsolatedWindowsEnvironmentOwnerRegistrationStatus
  public sealed class IsolatedWindowsEnvironmentProcess
  public enum IsolatedWindowsEnvironmentProcessState
  public enum IsolatedWindowsEnvironmentProgressState
  public sealed class IsolatedWindowsEnvironmentShareFolderRequestOptions
  public sealed class IsolatedWindowsEnvironmentShareFolderResult
  public enum IsolatedWindowsEnvironmentShareFolderStatus
  public sealed class IsolatedWindowsEnvironmentStartProcessResult
  public enum IsolatedWindowsEnvironmentStartProcessStatus
  public sealed class IsolatedWindowsEnvironmentTelemetryParameters
  public static class IsolatedWindowsHostMessenger
  public delegate void MessageReceivedCallback(Guid receiverId, IVectorView<object> message);
}
namespace Windows.Storage {
  public static class KnownFolders {
    public static IAsyncOperation<StorageFolder> GetFolderAsync(KnownFolderId folderId);
    public static IAsyncOperation<KnownFoldersAccessStatus> RequestAccessAsync(KnownFolderId folderId);
    public static IAsyncOperation<KnownFoldersAccessStatus> RequestAccessForUserAsync(User user, KnownFolderId folderId);
  }
  public enum KnownFoldersAccessStatus
  public sealed class StorageFile : IInputStreamReference, IRandomAccessStreamReference, IStorageFile, IStorageFile2, IStorageFilePropertiesWithAvailability, IStorageItem, IStorageItem2, IStorageItemProperties, IStorageItemProperties2, IStorageItemPropertiesWithProvider {
    public static IAsyncOperation<StorageFile> GetFileFromPathForUserAsync(User user, string path);
  }
  public sealed class StorageFolder : IStorageFolder, IStorageFolder2, IStorageFolderQueryOperations, IStorageItem, IStorageItem2, IStorageItemProperties, IStorageItemProperties2, IStorageItemPropertiesWithProvider {
    public static IAsyncOperation<StorageFolder> GetFolderFromPathForUserAsync(User user, string path);
  }
}
namespace Windows.Storage.Provider {
  public sealed class StorageProviderFileTypeInfo
  public sealed class StorageProviderSyncRootInfo {
    IVector<StorageProviderFileTypeInfo> FallbackFileTypeInfo { get; }
  }
  public static class StorageProviderSyncRootManager {
    public static bool IsSupported();
  }
}
namespace Windows.System {
  public sealed class UserChangedEventArgs {
    IVectorView<UserWatcherUpdateKind> ChangedPropertyKinds { get; }
  }
  public enum UserWatcherUpdateKind
}
namespace Windows.UI.Composition.Interactions {
  public sealed class InteractionTracker : CompositionObject {
    int TryUpdatePosition(Vector3 value, InteractionTrackerClampingOption option, InteractionTrackerPositionUpdateOption posUpdateOption);
  }
  public enum InteractionTrackerPositionUpdateOption
}
namespace Windows.UI.Input {
  public sealed class CrossSlidingEventArgs {
    uint ContactCount { get; }
  }
  public sealed class DraggingEventArgs {
    uint ContactCount { get; }
  }
  public sealed class GestureRecognizer {
    uint HoldMaxContactCount { get; set; }
    uint HoldMinContactCount { get; set; }
    float HoldRadius { get; set; }
    TimeSpan HoldStartDelay { get; set; }
    uint TapMaxContactCount { get; set; }
    uint TapMinContactCount { get; set; }
    uint TranslationMaxContactCount { get; set; }
    uint TranslationMinContactCount { get; set; }
  }
  public sealed class HoldingEventArgs {
    uint ContactCount { get; }
    uint CurrentContactCount { get; }
  }
  public sealed class ManipulationCompletedEventArgs {
    uint ContactCount { get; }
    uint CurrentContactCount { get; }
  }
  public sealed class ManipulationInertiaStartingEventArgs {
    uint ContactCount { get; }
  }
  public sealed class ManipulationStartedEventArgs {
    uint ContactCount { get; }
  }
  public sealed class ManipulationUpdatedEventArgs {
    uint ContactCount { get; }
    uint CurrentContactCount { get; }
  }
  public sealed class RightTappedEventArgs {
    uint ContactCount { get; }
  }
  public sealed class SystemButtonEventController : AttachableInputObject
  public sealed class SystemFunctionButtonEventArgs
  public sealed class SystemFunctionLockChangedEventArgs
  public sealed class SystemFunctionLockIndicatorChangedEventArgs
  public sealed class TappedEventArgs {
    uint ContactCount { get; }
  }
}
namespace Windows.UI.Input.Inking {
  public sealed class InkModelerAttributes {
    bool UseVelocityBasedPressure { get; set; }
  }
}
namespace Windows.UI.Text {
  public enum RichEditMathMode
  public sealed class RichEditTextDocument : ITextDocument {
    void GetMath(out string value);
    void SetMath(string value);
    void SetMathMode(RichEditMathMode mode);
  }
}
namespace Windows.UI.ViewManagement {
  public sealed class ApplicationView {
    bool CriticalInputMismatch { get; set; }
    bool TemporaryInputMismatch { get; set; }
    void ApplyApplicationUserModelID(string value);
  }
  public sealed class UISettings {
    event TypedEventHandler<UISettings, UISettingsAnimationsEnabledChangedEventArgs> AnimationsEnabledChanged;
    event TypedEventHandler<UISettings, UISettingsMessageDurationChangedEventArgs> MessageDurationChanged;
  }
  public sealed class UISettingsAnimationsEnabledChangedEventArgs
  public sealed class UISettingsMessageDurationChangedEventArgs
}
namespace Windows.UI.ViewManagement.Core {
  public sealed class CoreInputView {
    event TypedEventHandler<CoreInputView, CoreInputViewHidingEventArgs> PrimaryViewHiding;
    event TypedEventHandler<CoreInputView, CoreInputViewShowingEventArgs> PrimaryViewShowing;
  }
  public sealed class CoreInputViewHidingEventArgs
  public enum CoreInputViewKind {
    Symbols = 4,
  }
  public sealed class CoreInputViewShowingEventArgs
  public sealed class UISettingsController
}
namespace Windows.UI.Xaml.Controls {
  public class HandwritingView : Control {
    UIElement HostUIElement { get; set; }
    public static DependencyProperty HostUIElementProperty { get; }
    CoreInputDeviceTypes InputDeviceTypes { get; set; }
    bool IsSwitchToKeyboardButtonVisible { get; set; }
    public static DependencyProperty IsSwitchToKeyboardButtonVisibleProperty { get; }
    double MinimumColorDifference { get; set; }
    public static DependencyProperty MinimumColorDifferenceProperty { get; }
    bool PreventAutomaticDismissal { get; set; }
    public static DependencyProperty PreventAutomaticDismissalProperty { get; }
    bool ShouldInjectEnterKey { get; set; }
    public static DependencyProperty ShouldInjectEnterKeyProperty { get; }
    event TypedEventHandler<HandwritingView, HandwritingViewCandidatesChangedEventArgs> CandidatesChanged;
    event TypedEventHandler<HandwritingView, HandwritingViewContentSizeChangingEventArgs> ContentSizeChanging;
    void SelectCandidate(uint index);
    void SetTrayDisplayMode(HandwritingViewTrayDisplayMode displayMode);
  }
  public sealed class HandwritingViewCandidatesChangedEventArgs
  public sealed class HandwritingViewContentSizeChangingEventArgs
  public enum HandwritingViewTrayDisplayMode
}
namespace Windows.UI.Xaml.Core.Direct {
  public enum XamlEventIndex {
    HandwritingView_ContentSizeChanging = 321,
  }
  public enum XamlPropertyIndex {
    HandwritingView_HostUIElement = 2395,
    HandwritingView_IsSwitchToKeyboardButtonVisible = 2393,
    HandwritingView_MinimumColorDifference = 2396,
    HandwritingView_PreventAutomaticDismissal = 2397,
    HandwritingView_ShouldInjectEnterKey = 2398,
  }
}
 

The post Windows 10 SDK Preview Build 18980 available now! appeared first on Windows Developer Blog.

HDInsight support in Azure CLI now out of preview


We are pleased to share that support for HDInsight in Azure CLI is now generally available. The addition of the az hdinsight command group allows you to easily manage your HDInsight clusters using simple commands while taking advantage of all that Azure CLI has to offer, such as cross-platform support and tab completion.

Key Features

  • Cluster CRUD: Create, delete, list, resize, show properties, and update tags for your HDInsight clusters.
  • Script actions: Execute script actions, list and delete persistent script actions, promote ad-hoc script executions to persistent script actions, and show the execution history of script action runs.
  • Manage Azure Monitor integration: Enable, disable, and show the status of Azure Monitor integration on HDInsight clusters.
  • Applications: Create, delete, list, and show properties for applications on your HDInsight clusters.
  • Core usage: View available core counts by region before deploying large clusters.

A gif showing the creation of an HDInsight cluster using a single, simple Azure CLI command.

Create an HDInsight cluster using a single, simple Azure CLI command

Azure CLI benefits

  • Cross platform: Use Azure CLI on Windows, macOS, Linux, or the Azure Cloud Shell in a browser to manage your HDInsight clusters with the same commands and syntax across platforms.
  • Tab completion and interactive mode: Autocomplete command and parameter names as well as subscription-specific details like resource group names, cluster names, and storage account names. Don't remember your 88-character storage account key off the top of your head? Azure CLI can autocomplete that as well!
  • Customize output: Make use of Azure CLI's globally available arguments to show verbose or debug output, filter output using the JMESPath query language, and change the output format among JSON, tab-separated values, ASCII tables, and more.

Getting started

You can get up and running with Azure CLI to manage your HDInsight clusters in three easy steps.

  1. Install Azure CLI for Windows, macOS, or Linux. Alternatively, you can use Azure Cloud Shell to use Azure CLI in a browser.
  2. Log in using the az login command.
  3. Take a look at the az hdinsight reference documentation, or run az hdinsight -h to see a full list of supported HDInsight commands with descriptions, and start using Azure CLI to manage your HDInsight clusters.
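As a quick sketch of the command group (the resource names, password, and cluster type below are placeholder assumptions, exact flags can vary by CLI version, and running these requires an Azure subscription), creating and then inspecting a cluster looks roughly like this:

```shell
# Illustrative only: all names and the password are placeholders.
az login

az hdinsight create \
    --name MyCluster \
    --resource-group MyResourceGroup \
    --type spark \
    --http-password 'SomeStrongPassword1!' \
    --storage-account mystorageaccount

# Inspect what was created.
az hdinsight list --resource-group MyResourceGroup --output table
az hdinsight show --name MyCluster --resource-group MyResourceGroup
```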

About HDInsight

Azure HDInsight is an easy, cost-effective, enterprise-grade service for open source analytics that enables customers to easily run popular open source frameworks, such as Apache Hadoop, Spark, Kafka, and more. The service is available in 28 public regions and Azure Government Clouds in the US, Germany, and China. Azure HDInsight powers mission-critical applications in a wide variety of sectors and enables a wide range of use cases including ETL, streaming, and interactive querying.

Azure Files premium tier gets zone redundant storage


Azure Files premium tier is now zone redundant!

We're excited to announce the general availability of zone redundant storage (ZRS) for Azure Files premium tier. Azure Files premium tier with ZRS replication enables highly performant, highly available file services built on solid-state drives (SSDs).

Azure Files ZRS premium tier should be considered for managed file services where performance and regional availability are critical for the business. ZRS provides high availability by synchronously writing three replicas of your data across three different Azure Availability Zones, thereby protecting your data from cluster, datacenter, or entire zone outage. Zonal redundancy enables you to read and write data even if one of the availability zones is unavailable.

With the release of ZRS for Azure Files premium tier, the premium tier now offers two durability options to meet your storage needs: zone redundant storage (ZRS) for intra-region high availability, and locally redundant storage (LRS) for lower-cost, single-region durable storage.

Getting started

You can create a ZRS Azure premium files account through the Azure Portal, Azure CLI, or Azure PowerShell.

Azure Files premium tier requires FileStorage as the account kind. To create a ZRS account in the Azure Portal, set the following properties:

An image showing the account settings.

Currently, the ZRS option for Azure Files premium tier is available in West Europe, and we will be gradually expanding the regional coverage. Stay up to date on premium tier ZRS region availability through the Azure documentation.

Migration from an LRS premium files account to a ZRS premium files account requires manually copying or moving data from the existing LRS account to a new ZRS account. Live account migration on request is not yet supported. Please check the migration documentation for the latest information.

Refer to the pricing page for the latest pricing information.

To learn more about premium tier, visit Azure Files premium tier documentation. Give it a try and share your feedback on the Azure Storage forum or email us at azurefiles@microsoft.com.

Happy sharing!

Why banks are adopting a modern approach to cybersecurity—the Zero Trust model

How to debug and profile any EXE with Visual Studio


Have you ever needed to debug or profile an executable (.exe file) that you don’t have source for or can’t build? Then the least known Visual Studio project type, the EXE project, is for you!

In Visual Studio you can open any EXE as a ‘project’. Just go to File->Open->Project/Solution and browse to the .exe file, as you would for a .sln file. Visual Studio will then open that EXE as a project. This feature has been around for a long time. It works on all currently supported Visual Studio versions, and the docs for it are at ‘Debug an app that isn’t part of a Visual Studio solution‘.

Opening an exe file with Visual Studio 2019

Debugging

Just as with a normal project you can start debugging with F5, which will launch the EXE and attach the debugger. If you want to debug startup you can launch with F11, which will launch the EXE and stop on the first line of user code. Both of these options are available on the context menu for the EXE project in Solution Explorer window as illustrated below:

Starting debugging from the solution node

For debugging you will need symbols (PDB files) for the EXE and any DLLs you need to debug. Visual Studio will follow the same process to try to obtain symbols as it does when debugging a normal project. Since it’s not likely that the PDB files were distributed alongside the EXE, you might want to locate them from a build drop or, better yet, from a symbol server. More information and best practices for symbols can be found in this blog.

To effectively debug you’ll also need the source code that was used to build the EXE, even for just a few files that you care about. You’ll need to locate those files and open them in Visual Studio. If the source code isn’t exactly the same as the source code that was used to build the EXE, Visual Studio will warn you when you try to insert a breakpoint and the breakpoint won’t bind. That behavior can be overridden from the Breakpoint Settings peek window. In the settings peek window, click on the Must match source link text and then check the box to allow mismatched source, as illustrated below. Of course, with mismatched source you never really know what’s going to happen, so use at your peril.

Setting the breakpoint to not require exact source.

If the EXE was built with SourceLink enabled then information about the source will be included in the PDBs and Visual Studio will try to download the source automatically. This is a really good reason to use SourceLink with your projects. Even if you have a local enlistment you might not have the same version that was used to build the binary. SourceLink is your sure-fire way to make sure that the right source is linked with the right binary.

If you can’t obtain any source code you still have a couple of options:

  1. Use a tool to decompile the assemblies back into C#, which you can recompile into a new assembly to patch the old one. ILSpy is a great choice for this, but there are plenty of other good paid and free tools out there.
  2. Use the Disassembly tool window in Visual Studio. The Source Not Found document has a link to view disassembly. Be warned: if you’re used to debugging C# code, the disassembly view is a tool of last resort.

Lastly, if you need to pass any arguments to the EXE being debugged, you can configure them along with other options in the Project Properties page (Right Click->Properties on the project node in Solution Explorer).

Project properties page for the exe project.

Profiling

You can also use the profiling tools with the EXE by launching them from Debug -> Performance Profiler. From the launch page of the profiling tools you can select which tools to use against the EXE. More information on profiling can be found in the docs (https://docs.microsoft.com/en-us/visualstudio/profiling/profiling-feature-tour?view=vs-2019).

Launching the performance profiler for an exe project.

Conclusion

That’s it. A brief overview of how you can use Visual Studio to debug and profile applications that you aren’t building and might not even have source for. So, next time you need to debug or profile an EXE don’t forget you can open it as a Solution in Visual Studio!

 

 

The post How to debug and profile any EXE with Visual Studio appeared first on The Visual Studio Blog.


Introducing HP Elite Dragonfly—the ultralight PC designed for the mobile workforce

Three ways to leverage composite indexes in Azure Cosmos DB


Composite indexes were introduced in Azure Cosmos DB at Microsoft Build 2019. With our latest service update, additional query types can now leverage composite indexes. In this post, we’ll explore composite indexes and highlight common use cases.

Index types in Azure Cosmos DB

Azure Cosmos DB currently has the following index types that are used for the following types of queries:

Range indexes:

  • Equality queries
  • Range queries
  • ORDER BY queries on a single property
  • JOIN queries

Spatial indexes:

  • Geospatial functions

Composite indexes:

  • ORDER BY queries on multiple properties
  • Queries with a filter as well as an ORDER BY clause
  • Queries with a filter on two or more properties

Composite index use cases

By default, Azure Cosmos DB will create a range index on every property. For many workloads, these indexes are enough, and no further optimizations are necessary. Composite indexes can be added in addition to the default range indexes. Composite indexes have both a path and order (ASC or DESC) defined for each property within the composite index.

ORDER BY queries on multiple properties

If a query has an ORDER BY clause with two or more properties, a composite index is required. For example, the following query requires a composite index defined on age and name (age ASC, name ASC):

SELECT * FROM c ORDER BY c.age ASC, c.name ASC

This query will sort all results in ascending order by the value of the age property. If two documents have the same age value, the query will sort the documents by name.
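The multi-property ORDER BY semantics can be sketched outside the database with plain Python (the sample documents below are made up for illustration; this mimics the sort order, not the Cosmos DB engine):

```python
# Illustrative only: mimics ORDER BY c.age ASC, c.name ASC on sample documents.
docs = [
    {"name": "Tim", "age": 30},
    {"name": "Ann", "age": 25},
    {"name": "Bob", "age": 30},
]

# Sort by age first; ties are broken by name, matching the composite index order.
ordered = sorted(docs, key=lambda d: (d["age"], d["name"]))

print([(d["age"], d["name"]) for d in ordered])
```

Ann (age 25) comes first; Bob and Tim share age 30, so they are returned in name order, which is exactly why the composite index must define both a path and an order for each property.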

Queries with a filter as well as an ORDER BY clause

If a query has a filter as well as an ORDER BY clause on different properties, a composite index will improve performance. For example, the following query will require fewer request units (RUs) if a composite index on name and age is defined and the query is updated to include the name in the ORDER BY clause:

Original query utilizing range index:

SELECT * FROM c WHERE c.name = "Tim" ORDER BY c.age ASC

Revised query utilizing a composite index on name and age:

SELECT * FROM c WHERE c.name = "Tim" ORDER BY c.name ASC, c.age ASC

While a composite index will significantly improve query performance, you can still run the original query successfully without a composite index. When you run the revised query with a composite index, it will sort documents by the age property. Since all documents matching the filter have the same name value, the query will return them in ascending order by age.

Queries with a filter on multiple properties

If a query has a filter with two or more properties, adding a composite index will improve performance.

Consider the following query:

SELECT * FROM c WHERE c.name = "Tim" AND c.age > 18

In the absence of a composite index on (name ASC, age ASC), this query will utilize a range index. We can improve its efficiency by creating a composite index on name and age.

Queries with multiple equality filters and a maximum of one range filter (such as >, <, <=, >=, !=) will utilize the composite index. In some cases, if a query can’t fully utilize a composite index, it will use a combination of the defined composite indexes and range indexes. For more information, reference our indexing policy documentation.

Composite index performance benefits

We can run some sample queries to highlight the performance benefits of composite indexes. We will use a nutrition dataset that is used in Azure Cosmos DB labs.

In this example, we will optimize a query that has a filter as well as an ORDER BY clause. We will start with the default indexing policy which indexes all properties with a range index. Executing the following query as referenced in the image below in the Azure Portal, we observe the query metrics:

Query metrics:

Query which uses range index and consumes 21.8 RUs.

This query, with the default indexing policy, required 21.8 RUs.

Adding a composite index on foodGroup and _ts and updating the query text to include foodGroup in the ORDER BY clause significantly reduced the query’s RU charge.

Query metrics:

Query which uses composite index and consumes 4.07 RUs.

After adding a composite index, the query’s RU charge decreased from 21.8 RUs to only 4.07 RUs. This query optimization will be particularly impactful as the total data size increases. The benefits of a composite index are significant when the properties in the ORDER BY clause have high cardinality.

Creating composite indexes

You can learn more about creating composite indexes in this documentation. It’s simple to update the indexing policy directly through the Azure Portal. While creating a composite index for data that’s already in Azure Cosmos DB, the index update will utilize the RUs left over from normal operations. After the new indexing policy is defined, Azure Cosmos DB will automatically index properties with a composite index as they’re written.
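As a minimal sketch, the composite index used in the earlier name/age queries can be declared in the container's indexing policy like this (the default wildcard included path is kept; property paths match the examples above):

```json
{
  "indexingMode": "consistent",
  "automatic": true,
  "includedPaths": [
    { "path": "/*" }
  ],
  "excludedPaths": [],
  "compositeIndexes": [
    [
      { "path": "/name", "order": "ascending" },
      { "path": "/age", "order": "ascending" }
    ]
  ]
}
```

Each inner array defines one composite index, with an explicit order (ascending or descending) per property.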

Explore whether composite indexes will improve RU utilization for your existing workloads on Azure Cosmos DB.


Microsoft releases 18M building footprints in Uganda and Tanzania to enable AI Assisted Mapping


In the last ten years, 2 billion people were affected by disasters, according to the World Disasters Report 2018. In 2017, 201 million people needed humanitarian assistance and 18 million were displaced due to weather-related disasters. Many of these disaster-prone areas are literally “missing” from the map, making it harder for first responders to prepare and deliver relief efforts.

Since the inception of Tasking Manager, the Humanitarian OpenStreetMap Team (HOT) community has mapped at an incredible rate with 11 million square kilometers mapped in Africa alone. However, large parts of Africa with populations prone to disasters still remain unmapped — 60% of the 30 million square kilometers.

Under Microsoft’s AI for Humanitarian Action program, Bing Maps together with Microsoft Philanthropies is partnering with HOT on an initiative to bring AI Assistance as a resource in open map building. The initiative focuses on incorporating design updates, integrating machine learning, and bringing new open building datasets into Tasking Manager.

The Bing Maps team has been harnessing the power of Computer Vision to identify map features at scale. Building upon their work in the United States and Canada, Bing Maps is now releasing country-wide open building footprints datasets in Uganda and Tanzania. This will be one of the first open building datasets in Africa and will be available for use within OpenStreetMap (OSM).

In Tasking Manager specifically, the dataset will be used to help in task creation with the goal of improving task completion rates. Tasking Manager relies on ‘ML Enabler’ to connect with building datasets through an API. This API-based integration makes it convenient to access not just the Africa building footprints, but all open building footprint datasets from Bing Maps through ML Enabler, and thus the OpenStreetMap ecosystem.

“Machine learning datasets for OSM need to be open. We need to go beyond identifying roads and buildings and open datasets allow us to experiment and uncover new opportunities. Open Building Dataset gives us the ability to not only explore quality and validation aspects, but also advance how ML data assists mapping.”
- Tyler Radford (Executive Director, Humanitarian OpenStreetMap Team)

Africa presented several challenges: a stark difference in landscape from the United States or Canada, unique settlements such as Tukuls, dense urban areas with connected structures, imagery quality and vintage, and a lack of training data in rural areas. The team identified areas with poor recall by leveraging population estimates from CIESIN. Subsequent targeted labeling efforts across Bing Maps and HOT improved model recall, especially in rural areas. A two-step process with semantic segmentation followed by polygonization resulted in 18M building footprints: 7M in Uganda and 11M in Tanzania.
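The segmentation-then-extraction idea can be caricatured in a few lines of plain Python: a toy pass that thresholds a per-pixel "building probability" grid into a mask and then groups connected pixels into footprint candidates. This is purely illustrative; the production pipeline uses trained segmentation models and true polygonization, not bounding boxes.

```python
# Toy sketch of segmentation -> footprint extraction (illustrative only).

def extract_footprints(probs, threshold=0.5):
    """Threshold a 2D grid of per-pixel building probabilities, then group
    4-connected pixels into components and return their bounding boxes
    as (min_row, min_col, max_row, max_col) tuples."""
    rows, cols = len(probs), len(probs[0])
    mask = [[p >= threshold for p in row] for row in probs]
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one connected component with an explicit stack.
                stack, comp = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                ys = [y for y, _ in comp]
                xs = [x for _, x in comp]
                boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return boxes

# Two separated blobs -> two footprint candidates.
grid = [
    [0.9, 0.9, 0.0, 0.0],
    [0.9, 0.9, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.8],
    [0.0, 0.0, 0.0, 0.8],
]
print(extract_footprints(grid))  # → [(0, 0, 1, 1), (2, 3, 3, 3)]
```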

Extractions in Musoma, Tanzania

Bing Maps is making this data open for download free of charge and usable for research, analysis and, of course, OSM. In OpenStreetMap there are currently 14M building footprints in Uganda and Tanzania (as of our team's last count). We are working to determine overlaps.

We will be making the data available on Github to download. The CNTK toolkit developed by Microsoft is open source and available on GitHub as well. The ResNet3 model is also open source and available on GitHub. The Bing Maps computer vision team will be presenting the work in Uganda and Tanzania at the annual International State of the Map conference in Heidelberg, Germany and at the HOT Summit.

- Bing Maps Team

Team Safety with the Fist to Five method


The success of any agile team is built on the shoulders of its individuals and their ability to work as a team. Those individuals need to feel safe to express their ideas and thoughts without other members passing judgement on them. For every great idea your team has, there are dozens of other ideas that don’t stick. If team members are not in a safe and inclusive environment, those ideas will never be proposed. This stifles team growth and innovation. Having a team culture where individuals feel good about expressing ideas and thoughts is how a high-functioning agile team is built and is able to excel.

But how do you know if your team environment is a safe one? Just because a person is quiet does not mean they don’t feel safe. Maybe the individual who dominates the team meetings feels like they have to continuously defend themselves. Let’s be realistic: if you are not on a healthy team, you probably don’t feel safe enough to talk about why it isn’t safe. So how do you gauge how healthy your team is?

Some of our teams inside Microsoft have started using a method that we’ve been calling the ‘Team Safety Fist to Five’ and we wanted to share this approach.

What is the Fist to Five method?

The Fist to Five method is a long-time Agile practice to help teams come to a consensus on a topic where a decision must be made. Think of it as a democracy for decision making. For the decision at hand, each team member is asked to put up their fist and show a number of fingers.

  • 1 Finger, you don’t agree

  • 3 Fingers, you are not sure of the decision, but you will go along with the team

  • 5 Fingers, you strongly agree

In a typical Fist to Five session, if anyone is under a 3, the decision does not go through. You then talk it through and vote again until a decision is reached.
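The decision rule itself fits in a few lines of Python (a toy sketch; the real value of the exercise is the discussion it triggers):

```python
def fist_to_five(votes):
    """Classic Fist to Five consensus check: the decision passes only
    if every vote is 3 or higher; otherwise the team discusses and
    re-votes. A fist counts as 0, five fingers as 5."""
    if not all(0 <= v <= 5 for v in votes):
        raise ValueError("votes must be between 0 (fist) and 5")
    return min(votes) >= 3

# A single vote under 3 blocks the decision and sends the team
# back to discussion.
print(fist_to_five([5, 4, 3, 3]))  # → True
print(fist_to_five([5, 4, 2, 3]))  # → False
```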

Revised for team safety

For Fist to Five to work when assessing team safety, the process needs to be revised. First, instead of a decision, you start with the simple question…

“How safe do you feel right now?”

Meaning, how comfortable do you feel about voicing your honest opinion within the confines of the team?

Obviously, if you don’t feel safe, you are not going to be honest if everyone knows your response. So instead of raising your fist, everyone on the team writes down their safety number on a piece of paper and puts it into a hat. This keeps your answer anonymous.

  • 1, Don’t ask me, and if you do, I probably won’t tell the truth

  • 3, Cautious and reserved. I will be careful on what opinions I express to the team

  • 5, I feel good and I will speak up and give you my honest opinion
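Because the scores go into a hat rather than onto raised hands, what gets read back to the team is a distribution, not names. A small Python sketch of the tally (a hypothetical helper for illustration, not part of any tool):

```python
from collections import Counter

def summarize_safety_poll(responses):
    """Summarize anonymous safety scores (1-5) without tying any score
    to a person: report the distribution, the minimum, and the average."""
    dist = Counter(responses)
    return {
        "distribution": dict(sorted(dist.items())),
        "lowest": min(responses),
        "average": round(sum(responses) / len(responses), 1),
    }

summary = summarize_safety_poll([5, 4, 2, 5, 3])
print(summary)  # even a single low score is worth a team conversation
```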

Fist to five image

Once the responses are collected they are read out loud. Now everyone on the team understands how the rest of the team is feeling. If there is a problem, the team can pull together and fix it. It is important to understand that this exercise is about determining and acknowledging if a safety problem exists within your team. Culture and safety issues can penetrate your team in a variety of different ways. Some examples include…

  • Low team autonomy where decisions are handed down rather than made by the team
  • Poor team leadership
  • Difficult customer engagement
  • Burn out or increase in stress
  • Team members who dominate conversations or who are being perceived as combative
  • Individuals that carry baggage from previous teams
  • Conflicting individual personalities
  • New team and/or new team members

Fixing team safety

Unfortunately, fixing team safety and culture issues is much harder than identifying them. The solutions can be unique to the individuals and the team themselves. Although we don’t have a one-size-fits-all solution, we have found a couple of strategies that have successfully started the process of improving team culture. In many ways, a lot of this goes back to the basics of leadership (i.e. not telling people what to do, but driving clarity and building trust and rapport).

Encourage regular 1 on 1’s

Start meeting with your team members for 1 on 1 time on a regular basis. This includes not only the team leader meeting with everyone, but also the different team members meeting with each other. You will gain insights and perspectives you would not normally get in a group setting. It allows you to build a better relationship with your co-workers. Better relationships build trust and safety.

Do better team building

Team building is a great way to get to know your co-workers. Stay away from the cliché events like laser tag, rock walls, and bowling. Instead, find an excuse to get together in person, grab a room, some food and drinks, and just get to know each other and have a proper conversation. Ask questions and talk about things you care about, things you are passionate about. Start with topics that are personal and outside of work. Then ease into light discussions about work topics and some of the product challenges you are trying to solve. This will stir ideas and build trust across the team, putting yourselves into a situation where everyone is being heard and your co-workers really do care.

Summary

Team and psychological safety are a key component for any team to work well together and be successful. The Fist to Five method for team safety is an interesting approach to identifying safety and culture issues among your team. It’s probably a terrible name as it doesn’t even involve a fist any more, but we would love to know what you think and learn about the ways that you have been trying to encourage team safety. Please let us know in the comments or catch me on Twitter.

The post Team Safety with the Fist to Five method appeared first on Azure DevOps Blog.

Patching the new Cascadia Code to include Powerline Glyphs and other Nerd Fonts for the Windows Terminal


Microsoft released a nice new ligature-friendly open source font this week called Cascadia Code. It'll eventually be shipped with the open source Windows Terminal (you can get the Terminal from the Store for free) but for now you can just download and install the TTF.

I've blogged about Fira Code and Monospaced Programming Fonts with Ligatures before. Just like keyboards, mice, monitors, text editors, and all the other things that we as developers put in our toolkits, fonts are a very personal thing. Lots of folks have tweeted me, "why is this better than <font I use>?" I dunno. Try it. Coke vs. Pepsi. If it makes you happy, use it.

I use Cascadia Code for my terminals and I use Fira Code for my code editor. ¯\_(ツ)_/¯

That said, one important thing that you may want to know about is that you have FULL control of your fonts! Lots of folks want certain glyphs, or a fancy bash prompt, or they use posh-git, or PowerLine, or all of the above.

Right now Cascadia Code doesn't include every glyph in the world, but don't let that hold you back. Fix it.

For example, if I go install "Oh my Posh" and spice up my PowerShell Core prompt, it might look like this with Cascadia Code today.

Cascadia Code with no Nerd Fonts

But if I patch Cascadia Code on my own machine to include Nerd Fonts and other glyphs, I'll get this lovely prompt in Windows Terminal:

Cascadia Code with Nerd Fonts and PowerLine

So you have the power to do a lot of things. Don't be satisfied. Nest, and make your prompt your own! There are lots of Nerd Fonts but I want to patch Cascadia Code today (I'm sure they'll do it themselves one day, but I'm impatient) and make it look the way I want. You can too!

Starting with FontForge in Ubuntu under WSL

Using WSL2 and Ubuntu, I installed the Nerd Fonts Patcher and ran it on my downloaded version of Cascadia code like this:

scott@IRONHEART:/mnt/d/github/nerd-fonts$ fontforge -script font-patcher /mnt/c/Users/scott/Downloads/Cascadia.ttf

Copyright (c) 2000-2014 by George Williams. See AUTHORS for Contributors.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
with many parts BSD <http://fontforge.org/license.html>. Please read LICENSE.
Based on sources from 11:21 UTC 24-Sep-2017-ML-D.
Based on source from git with hash:
The following table(s) in the font have been ignored by FontForge
Ignoring 'DSIG' digital signature table
Warning: Mac string is a subset of the Windows string in the 'name' table
for the License string in the English (US) language.
Adding 53 Glyphs from Seti-UI + Custom Set
╢████████████████████████████████████████╟ 100%
Adding 198 Glyphs from Devicons Set
╢████████████████████████████████████████╟ 100%

Done with Patch Sets, generating font...

Generated: Cascadia Code Nerd Font

Cool! I could even go nuts and add -c and add thousands of glyphs. It just depends on what I need. I could just go --powerline and --fontawesome and call it a day. It's up to you! Salt your Fonts to taste!

Now I can install my local modified TTF like any other, then go into my profile.json in Windows Terminal and set the font face to my new personal custom "CascadiaCode Nerd Font!" Boom. All set.

UPDATE:  Alistair has created a forked version with the added glyphs. You may (or may not) be able to download his forked and renamed version from this Github comment. Slick!

Please also check out my YouTube video on blinging out your PowerShell prompt in the Windows Terminal!

Check out my YouTubes


Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!



© 2019 Scott Hanselman. All rights reserved.
     

Top Stories from the Microsoft DevOps Community – 2019.09.20


Do tools facilitate the adoption of DevOps culture, or is it the other way around? How important are the technical tools in helping us deliver value to our end-users with higher quality and less friction?

This week, our community shared how Azure DevOps and its integrations are helping them collaborate and deliver software in various environments. If you have stories about tools helping or hindering culture improvements, please share them with me on Twitter!

Our DevOps Culture in Action: A Case Study of DevOps Standardization at Predica
There is an age-old debate on the interdependency between tools and culture. I believe that quality tools can help shape culture by making productive habits as easy as possible. In this post, Daniel Krzyczkowski walks through how Predica leverages the full suite of Azure DevOps capabilities, and, in particular, the Azure Boards, Dashboards, Azure DevOps Wiki, and our Microsoft Teams integration to foster team collaboration, and strive for high quality and transparency. It is always inspiring to see the real world examples!

Using VS Code Extension for Azure Pipeline – Part1
And isn’t it great when the tools you love integrate with each other? This post from Chaminda Chandrasekara covers configuring YAML pipelines in VSCode using the recently released Azure Pipelines extension Visual Studio Code extension. Thank you, Chaminda!

Automate Building, Testing and Deploying SharePoint Framework Projects with Azure Pipelines in Four Steps
I never suspected a day would come when I would say this, but – you can now implement CI/CD for SharePoint projects using YAML Pipelines in Azure DevOps. This great post from Andrew Connell covers how to use the public template repo to Build, Test and Deploy your SharePoint project using Azure Pipelines. And, according to Andrew, you could be done before your coffee is ready!

Deploying a React Application to Azure Storage via Azure DevOps
Deploying static websites should definitely be among the things you can get done before your coffee is ready. This post from Napoleon Jones covers configuring continuous integration and deployment of a simple React application into Azure Storage using a YAML pipeline in Azure DevOps. Simple and powerful!

Run Azure DevOps Unit Tests with the Azure Storage Emulator on Hosted build agents
And, on the topic of Azure Storage, this post from Tobias Zimmergren explains how to run the Azure Storage Emulator on an Azure DevOps Hosted Agent for the purposes of Unit Testing. Thank you Tobias!

If you’ve written an article about Azure DevOps or find some great content about DevOps on Azure, please share it with the #AzureDevOps hashtag on Twitter!

The post Top Stories from the Microsoft DevOps Community – 2019.09.20 appeared first on Azure DevOps Blog.


Come meet Microsoft at DjangoCon 2019


DjangoCon 2019 is happening next week in San Diego and Microsoft is pleased to be supporting the conference as Gold sponsors this year. We will have some members from our Python team, our Azure Cloud Advocates, and the PostgreSQL team.

Be sure to check out our talks from our team during the regular conference, all on Tuesday Sept 24th:

After the conference we’ll update this post with links to slides and content for the talks, so be sure to check back here!

If you are at the conference, come by our booth to meet the team and ask us your questions about Python, Django, and PostgreSQL in VS Code and Azure. If you cannot make it, or want more information after the conference, here are some helpful resources that cover a subset of what we’ll be sharing:

 

The post Come meet Microsoft at DjangoCon 2019 appeared first on Python.

What’s the difference between a console, a terminal, and a shell?


I see a lot of questions that are close but the questions themselves show an underlying misunderstanding of some important terms.

  • Why would I use Windows Terminal over PowerShell?
  • I don't need WSL for bash, I use Cygwin.
  • Can I use conemu with PowerShell Core or do I need to use Windows Terminal?

Let's start with a glossary and clarify some words first.

Terminal

The word Terminal comes from terminate, indicating that it's the terminating end or "terminal" end of a communications process. You'll often hear "dumb terminal" when referring to a text-based environment where the computer you are sitting next to is just taking input and showing text while the real work happens at the other end in a mainframe or large computer.

TTY or "teletypewriter" was the first kind of terminal. Rather than a screen you'd have a literal typewriter in front of you. When you type on it, you're seeing the text on a piece of paper AND inputting that text into a computer. When that computer replies, you'll see the typewriter automatically type on the same paper.

Photo by Texas.713 used under CC

When we refer to a Terminal in the software sense, we're referring to a literal software version of a TTY or Terminal. The Windows Terminal is that. It's really good at displaying textual output. It can take input and pass it on. But the Terminal isn't smart. It doesn't actually process your input, it doesn't look at your files or think.

Console

Folks in the mid 20th century would have a piece of furniture in their living room called a console or console cabinet. A Console in the context of computers is a console or cabinet with a screen and keyboard combined inside it. But, it's effectively a Terminal. Technically the Console is the device and the Terminal is now the software program inside the Console.


In the software world a Terminal and a Console are, for all intents and purposes, synonymous.

Shell

A shell is the program that the terminal sends user input to. The shell generates output and passes it back to the terminal for display. Here's some examples of Shells:

  • bash, fish, zsh, ksh, sh, tsch
  • PowerShell, pwsh
  • cygwin
  • cmd, yori, 4dos, command.com

Here's an important point that should make more sense now that you have these terms - your choice of shell doesn't and shouldn't dictate your choice of terminal application.

Aside: WSL and WSL2 (the Windows Subsystem for Linux) are a complete local Linux (or many Linuxes) that run on Windows 10. They are full and real. WSL2 ships a real Linux kernel and runs it on Windows. Cygwin is NOT Linux. Cygwin is a large collection of GNU and open source tools which provide functionality similar to Linux on Windows - but it is not Linux. It's a simulacrum. It's GNU utils compiled against Win32. It's great, but it's important for you to know what the difference is. Cygwin may let you run your shell scripts but it will NOT run Apache, Docker, or other real ELF binaries and Linux apps.

Your Choice of Windows Consoles?

There are a number of shells that ship with Windows. Here's a few I'm running now. Note the "chrome" or the border and title around them? Those shells are all hosted by the legacy Windows console you've likely never heard of, called conhost.exe. You can go to the command prompt and type powershell, cmd, or ubuntu and any number of shells will run. Conhost does the work of input and output.

Now, forget that conhost exists, because it sucks - it's super old.

Shells that come with Windows

Pseudo Console, Pseudo Terminal, PTY, Pseudo TTY (ConPTY)

Pseudo terminals are terminal emulators or software interfaces that emulate terminals. They pretend to be terminals like the ones above. *Nix systems have long had a pseudo-terminal (PTY) infrastructure and now Windows has a pseudoconsole (ConPTY) as well.
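On *nix (including WSL), that PTY infrastructure is directly scriptable. A minimal sketch using Python's standard pty module shows the two ends of a pseudo-terminal pair: bytes written to the "slave" end (where a shell would sit) come out of the "master" end (where a terminal emulator would sit):

```python
import os
import pty

# Create a pseudo-terminal pair: the master fd is what a terminal
# emulator would hold; the slave fd is what a shell would treat as
# its controlling TTY.
master_fd, slave_fd = pty.openpty()

# A program writing to the slave end (its "stdout") produces bytes
# that the terminal emulator reads from the master end and renders.
os.write(slave_fd, b"hello from the shell side\n")
output = os.read(master_fd, 1024)
print(output)

os.close(master_fd)
os.close(slave_fd)
```

Note the kernel's line discipline sits between the two ends; for example, the trailing \n typically comes back as \r\n. ConPTY gives Windows terminal applications an equivalent interface.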

Windows' new ConPTY interface is the future of consoles and terminals on Windows. If you choose a 3rd party (non-built-in) console application for Windows, make sure it supports ConPTY and it'll be a better experience than some of the older consoles that use screen scraping or other hacks.


Back to your choice of Windows Consoles

Remembering there's a lot of shells you can use in Windows, there's a lot of 3rd party consoles you can use if you don't like conhost.exe (and you shouldn't).

All of these Terminals support ALL the shells above and any shells I've missed. Because a shell isn't a terminal. Pick the one that makes you happy. I use PowerShell Core and Ubuntu in WSL2 in the Windows Terminal.

Windows Terminal

Hope this helps clear things up.



The Marco Polo Network uses Azure and Corda blockchain to modernize trade finance


The Marco Polo Network is now generally available on Azure to help both trade banks and corporations take advantage of the R3 Corda distributed ledger to better facilitate global trade in this ever-changing world. Regardless of what headlines may lead you to believe, international trade is the lifeblood of the modern global economy. Each year, hundreds of trillions of dollars in goods, assets, credit, and money change hands to keep the engine of global trade running. When a multinational corporation (acting as a seller or exporter) sends goods to its customers (acting as buyers or importers), the corporation often doesn’t receive payment for 30-90 days. This problem can be exacerbated by variables such as tariffs or new customs duties. To manage cash flow while waiting for payment, sellers often resort to taking out short-term loans from trade banks. But trade banks find it difficult to keep pace, having to rely on aging systems and siloed data that increase cost and process friction for all involved.

The disadvantages of disconnected trade

If global trade is an engine, financing is the fuel. But many trade banks rely on decades-old, paper-based processes that slow trade flow and add complexity, with antiquated financing tools that make onboarding expensive, reconciliation cumbersome, and the customer experience poor.

Furthermore, as global providers of trade and supply chain finance, trade banks must manage transactions between sellers and buyers while navigating increasingly complex regulatory processes that vary across national boundaries. Due to these global regulations, banks can be forced to use different financing platforms for each geolocation, leading to an overabundance of disconnected management tools.

Without a network to exchange data and a platform for viewing and managing transactions, banks have tremendous difficulty processing and executing their clients’ trade and supply chain financing transactions. At the same time, buyers and sellers can lack awareness of their own financial health due to paper-based trade contracts which aren’t immediately understood across the organization. Furthermore, many small- and medium-sized import and export businesses are unable to scale due to staggering overhead costs.

A cloud-based network to streamline global trade

To improve efficiency in global trade finance, technology firms TradeIX and R3 partnered together with leading banks to create the Marco Polo Network. Launched in 2017, Marco Polo provides a digital, distributed technology platform that allows trading parties to automate and streamline their trade and supply chain finance activities. Applications are built and deployed on top of the platform that allow banks and corporations to perform specific product and trade orchestrations. Trading parties (buyers, sellers, logistics providers, insurers, banks, and other key stakeholders) are able to exchange trade data and assets securely, in real time, and peer to peer using an open and distributed network powered by Corda. Importantly, the network and platform are open, meaning third parties can build, develop, and deploy their own solutions on the network and platform.

The Marco Polo Network, a platform built by TradeIX using 18 distinct Azure services and R3’s Corda distributed ledger technology, is revolutionizing trade finance. TradeIX packaged Corda and the Marco Polo Network application stack, or node, for deployment using Azure Container Instances and the Azure Container Registry. This gave participating banks and corporations the flexibility to pursue one of two different hosting options: run a Marco Polo node inside the TradeIX Azure tenant, or pull down the application binaries as Docker images from an Azure Container Registry where they could then be deployed within the bank’s Azure tenant. The result is a transformational technology and distributed platform that enables the world’s leading trade banks and their corporate clients to exchange data in real time, resulting in streamlined, automated business activities that increase efficiency and transparency for receivables financing and cash flow management. TradeIX built these exciting new collaboration capabilities into the Marco Polo Network using an innovative, integrated application stack comprising Corda, Azure SQL, Cosmos DB, and Microsoft Dynamics 365 technologies.

One of the more novel features of the Marco Polo Network is the use of the R3 Corda distributed ledger to ensure that all of the counterparties involved in a financing request have a medium by which they can securely and seamlessly exchange trade data, contracts, and financial assets that are critical to completing a supply chain finance transaction. By hosting this platform in the cloud, TradeIX delivers an improved customer experience by providing a single infrastructure for banks and clients to manage their transactions—regardless of geolocation, currency, type of transaction, and industry. Because it’s an open, cloud-native network, Marco Polo Network members can share best practices, run pilot programs, and adjust the platform to meet their specific needs. However, this openness should not come at the expense of the security and compliance fundamentals required by the world’s leading banks and corporations. Microsoft and TradeIX implemented a host of Azure security controls such as Log Analytics, Security Center, Application Gateway, and DDoS Protection to ensure that the Marco Polo Network would be well-positioned to maintain the highest levels of trust, transparency, and standards conformance for all members across the network.

In the near future, the Marco Polo Network will also provide corporate treasurers with an ERP-embedded Marco Polo App supported by Dynamics 365, that allows companies to manage their trade finance directly within their own ERP system. The TradeIX – Dynamics 365 interface enables corporations to submit requests for finance directly to their trade bank of choice where it will be automatically acknowledged, received, and processed by the bank’s Corda instance resulting in a free exchange of data without the need for manual reconciliation.

Reducing expenses, improving revenue

An important objective of the Marco Polo Network is to obtain all trade data necessary for a transaction as directly as possible, from the original data source. This also includes external third parties such as logistic providers. Imagine a scenario where two companies (a buyer and a seller) and their corresponding banks, exchange order and delivery data via the Marco Polo Network. Payment terms would then be secured by an irrevocable payment commitment, triggered through automated matching of trade data. This would then be followed by an automatic matching of trade data achieved with involvement of the executing logistics provider, which enters the relevant transport details directly into the network. The ability for the third-party logistics provider to automatically trigger a payment from buyer to supplier following goods delivery with data reconciliation flowing across multiple banks simultaneously demonstrates the real-world value of the Marco Polo Network.
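The matching-and-trigger flow described above can be sketched as plain data comparison (a deliberately simplified model with hypothetical field names; on the real network this logic runs as Corda contract code across the parties' nodes):

```python
def match_trade(order, delivery):
    """Compare the buyer's order data against the logistics provider's
    delivery data; a full match on the shared fields triggers the
    irrevocable payment commitment."""
    fields = ("order_id", "sku", "quantity")
    mismatches = [f for f in fields if order[f] != delivery[f]]
    if mismatches:
        # Any discrepancy is surfaced for review instead of paying out.
        return {"status": "exception", "mismatched_fields": mismatches}
    return {"status": "payment_committed",
            "amount": order["quantity"] * order["unit_price"]}

order = {"order_id": "PO-1001", "sku": "WIDGET-9",
         "quantity": 500, "unit_price": 2.40}
delivery = {"order_id": "PO-1001", "sku": "WIDGET-9", "quantity": 500}
print(match_trade(order, delivery))
# → {'status': 'payment_committed', 'amount': 1200.0}
```

The point of the distributed ledger is that every counterparty runs the same deterministic check against the same data, so no manual reconciliation is needed.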

A growing network, built with business in mind

Because the Marco Polo Network is governed by member banks, the model promotes an atmosphere of collaboration across the global trade industry. This formalized governance framework has helped the Marco Polo Network onboard trade banks and corporations across Africa, Asia, Europe, the Middle East, as well as North and South America. Companies of all sizes will benefit from better visibility into trading relationships and easier access to financing options, beyond point to point relationships, to a global network of trading parties.

“I’m very pleased to see Microsoft’s Azure team is pushing the boundaries of banking and technology innovation with their partnership with the Marco Polo Network built by TradeIX. These two solutions coupled with Corda create a very compelling and modern proposition for any smart business looking to take advantage of the benefits that distributed architecture offers.” - Andrew Speers, Director, Product and Innovation at NatWest and Board Director at the Corda Network Foundation.

“International trade is indeed the lifeblood of the economy, which is why R3 is so proud to be a part of the Marco Polo Network. Together, Corda and Microsoft Azure are enabling TradeIX’s mission to transform trade finance, by bringing much needed efficiencies to this market, which holds hidden treasure in the hunt for high yields. We are honored to be part of the ecosystem that will build trade finance solutions on blockchain, and are excited to see what’s next” - Ricardo Correia, Head of Partners at R3.

“It is exciting to be part of the growing ecosystem building trade finance solutions on blockchain. Microsoft is honoured to be providing our global scale cloud as a foundation to R3 and TradeIX to speed this solution to market,”  - Michael Glaros, Azure Blockchain Engineering, Microsoft.

“One of the founding technology decisions that were made for the Marco Polo Network was to use the infrastructure provided by Microsoft Azure. We firmly believe that our partnership with Microsoft provides Marco Polo members with the best infrastructure and highest security and transparency standards combined with improved customer experience.” - Oliver Belin, CMO, TradeIX.

Extending the power of Azure AI to business users


Today, Alysa Taylor, Corporate Vice President of Business Applications and Industry, announced several new AI-driven insights applications for Microsoft Dynamics 365.

Powered by Azure AI, these tightly integrated AI capabilities will empower every employee in an organization to make AI real for their business today. Millions of developers and data scientists around the world are already using Azure AI to build innovative applications and machine learning models for their organizations. Now business users will also be able to directly harness the power of Azure AI in their line of business applications.

What is Azure AI?

Azure AI is a set of AI services built on Microsoft’s breakthrough innovation from decades of world-class research in vision, speech, language processing, and custom machine learning. What I find particularly exciting is that Azure AI provides our customers with access to the same proven AI capabilities that power Xbox, HoloLens, Bing, and Office 365.

Azure AI helps organizations:

  • Develop machine learning models that can help with scenarios such as demand forecasting, recommendations, or fraud detection using Azure Machine Learning.
  • Incorporate vision, speech, and language understanding capabilities into AI applications and bots, with Azure Cognitive Services and Azure Bot Service.
  • Build knowledge-mining solutions to make better use of untapped information in their content and documents using Azure Search.
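To give a concrete flavor of the Cognitive Services piece, the Text Analytics REST API accepts a batch of documents as JSON. A minimal sketch of building that request body (the endpoint URL and subscription key mentioned in the comments are placeholders you would take from your own Azure resource):

```python
import json

def build_sentiment_request(texts, language="en"):
    """Build the JSON body for a Text Analytics sentiment call:
    a batch of documents, each with an id, language, and text."""
    return {
        "documents": [
            {"id": str(i + 1), "language": language, "text": t}
            for i, t in enumerate(texts)
        ]
    }

body = build_sentiment_request(["The new dashboard is fantastic.",
                                "Checkout keeps timing out."])
# POST this as JSON to your Cognitive Services endpoint
# (https://<your-resource>.cognitiveservices.azure.com/...), passing
# your subscription key in the Ocp-Apim-Subscription-Key header.
print(json.dumps(body, indent=2))
```

The service responds with per-document sentiment scores, which an application (or a Dynamics 365 insights app on its users' behalf) can then act on.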

Bringing the power of AI to Dynamics 365 and the Power Platform

The release of the new Dynamics 365 insights apps, powered by Azure AI, will enable Dynamics 365 users to apply AI in their line of business workflows. Specifically, they benefit from the following built-in Azure AI services:

  • Azure Machine Learning which powers personalized customer recommendations in Dynamics 365 Customer Insights, analyzes product telemetry in Dynamics 365 Product Insights, and predicts potential failures in business-critical equipment in Dynamics 365 Supply Chain Management.
  • Azure Cognitive Services and Azure Bot Service that enable natural interactions with customers across multiple touchpoints with Dynamics 365 Virtual Agent for Customer Service.
  • Azure Search which allows users to quickly find critical information in records such as accounts, contacts, and even in documents and attachments such as invoices and faxes in all Dynamics 365 insights apps.

Furthermore, since Dynamics 365 insights apps are built on top of Azure AI, business users can now work with their development teams using Azure AI to add custom AI capabilities to their Dynamics 365 apps.

The Power Platform, comprised of three services - Power BI, PowerApps, and Microsoft Flow, also benefits from Azure AI innovations. While each of these services is best-of-breed individually, their combination as the Power Platform is a game-changer for our customers.

Azure AI enables Power Platform users to uncover insights, develop AI applications, and automate workflows through low-code, point-and-click experiences. Azure Cognitive Services and Azure Machine Learning empower Power Platform users to:

  • Extract key phrases in documents, detect sentiment in content such as customer reviews, and build custom machine learning models in Power BI.
  • Build custom AI applications that can predict customer churn, automatically route customer requests, and simplify inventory management through advanced image processing with PowerApps.
  • Automate tedious tasks such as invoice processing with Microsoft Flow.

The tight integration between Azure AI, Dynamics 365, and the Power Platform will enable business users to collaborate effortlessly with data scientists and developers on a common AI platform that not only has industry leading AI capabilities but is also built on a strong foundation of trust. Microsoft is the only company that is truly democratizing AI for businesses today.

And we’re just getting started. You can expect even deeper integration and more great apps and experiences that are built on Azure AI as we continue this journey.

We’re excited to bring those to market and eager to tell you all about them!

Navigating the intelligent edge: answers to top questions


Over the past ten years, Microsoft has seen embedded IoT devices get progressively smarter and more connected, running software intelligence near the point where the data is being generated within a network. And having memory and compute capabilities at the intelligent edge solves multiple conundrums related to connectivity, bandwidth, latencies, and privacy/security.

Of course, each device that connects to a network brings the challenge of how to secure, provision, and manage it. It raises issues of privacy requirements, data regulations, bandwidth, and transfer protocols. And when you have thousands of devices connecting to each other and to broader systems like the cloud, all this can get very complex, very quickly.

Here are some of the most frequent questions around the intelligent edge and examples of how Azure solutions can help simplify securing, provisioning, and managing it. To hear more in-depth thoughts on this topic, join Olivier Bloch on October 10 as he speaks at the IoT in Action event in Santa Clara.

Securing the intelligent edge

“How do I ensure the devices that are connected are the ones they say they are, and that they are authenticating to the back end and securing data in an encrypted way?”

Each device that gets installed on a network provides one more potential network doorway for bad actors. No one wants their car radio, scale, or vending machine hacked. No one wants customer data stolen. We’ve already seen too much of that in the news. Securing the intelligent edge is rightfully a key concern for customers interested in IoT technology.

The key is to start simple by building on top of solutions that have addressed these important concerns. Microsoft intelligent edge and intelligent cloud solutions have been designed to complement each other, which makes it much easier to create secure IoT solutions that you can trust.

Azure Sphere is a great place to start. It provides a turnkey IoT solution that builds on decades of Microsoft experience, ensuring comprehensive, multi-layer security from the microcontroller unit (MCU) to the operating system to the cloud.

It begins with Azure Sphere-certified MCUs from our hardware partners, with a Microsoft hardware root of trust embedded in the silicon. The operating system (OS) provides defense in depth that guards against hackers and enables automated OS and security updates. The Azure Sphere Security Service safeguards every device with the seven properties of highly secured, internet-connected devices. Azure Sphere only runs signed, authentic software, reducing the risk of malware or application tampering. Even devices that are already installed can be secured with Azure Sphere guardian modules, with little or no redesign required.

Provisioning and managing the intelligent edge

“Connecting one device manually to the cloud is part of the story. But what if I need to provision and then manage a whole bunch of devices at scale?”

You want to ensure devices are easy to provision, update, and manage. You want to be able to roll out new devices, and when the time comes, retire devices. You want to provision and manage devices like you would a fleet of PCs without having to manually update software and firmware.

Again, Microsoft has solutions that simplify all of this.

Azure IoT Hub enables you to connect, manage, and scale devices to the edge with per-device authentication and scaled provisioning. Azure IoT Edge, which is an intelligent edge runtime managed and configured from Azure IoT Hub, enables you to deploy cloud workloads to run on edge devices using standard containers. IoT Edge secures the communications between IoT applications and your edge devices, enabling you to power and remotely configure the devices. Built-in device management and provisioning capabilities enable you to connect and manage devices at scale.
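The per-device authentication mentioned above is commonly done with a shared access signature (SAS): the device signs its resource URI and an expiry time with its device key using HMAC-SHA256. As a rough sketch (the hub name, device ID, and key below are made-up placeholders, not real credentials):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote_plus

def generate_sas_token(resource_uri: str, device_key_b64: str, expiry_epoch: int) -> str:
    """Build an IoT Hub-style SAS token: sign '<url-encoded-uri>\n<expiry>'
    with the base64-decoded device key, then base64- and URL-encode the result."""
    sr = quote_plus(resource_uri)
    string_to_sign = f"{sr}\n{expiry_epoch}".encode("utf-8")
    key = base64.b64decode(device_key_b64)
    signature = base64.b64encode(
        hmac.new(key, string_to_sign, hashlib.sha256).digest()
    ).decode("utf-8")
    return (f"SharedAccessSignature sr={sr}"
            f"&sig={quote_plus(signature)}&se={expiry_epoch}")

# Hypothetical hub, device, and key purely for illustration.
fake_key = base64.b64encode(b"example-device-key").decode("utf-8")
token = generate_sas_token(
    "example-hub.azure-devices.net/devices/device-001", fake_key, 1700000000
)
print(token)
```

In practice the Azure IoT device SDKs generate and refresh these tokens for you; this sketch just shows the shape of the credential that backs per-device authentication.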

To implement scaled provisioning, Azure IoT Hub is paired with the Device Provisioning Service (DPS) which streamlines the enrollment process by allowing you to register and provision all your devices to IoT Hub without any human intervention. DPS takes advantage of hardware-secured modules where secure seeds are planted by silicon manufacturers and confidential compute is possible, all to establish a trusted connection and authentication with a global endpoint (DPS). This, in turn, can be configured to not only provide IoT Hub device identity and credentials back to devices, but it also can deliver a first configuration at provisioning time. It’s a powerful and scalable way to manage IoT devices during their whole life cycle from the first connection to retirement, including transfers of ownership.
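With DPS symmetric-key group enrollments, devices in a group don't share one literal key: each device uses a key derived from the group enrollment key by computing an HMAC-SHA256 over its registration ID. A minimal sketch (the group key and registration ID below are invented for illustration):

```python
import base64
import hashlib
import hmac

def derive_device_key(group_key_b64: str, registration_id: str) -> str:
    """Derive a per-device key from a DPS group enrollment key:
    HMAC-SHA256(base64-decoded group key, registration ID), base64-encoded."""
    group_key = base64.b64decode(group_key_b64)
    mac = hmac.new(group_key, registration_id.encode("utf-8"), hashlib.sha256)
    return base64.b64encode(mac.digest()).decode("utf-8")

# Hypothetical enrollment-group key and registration ID.
fake_group_key = base64.b64encode(b"example-group-enrollment-key").decode("utf-8")
device_key = derive_device_key(fake_group_key, "device-001")
print(device_key)
```

The device then authenticates to the global DPS endpoint with its derived key, so the group key itself never needs to be installed on any device.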

Learn more about the intelligent edge at an IoT in Action event

Microsoft continues to innovate with solutions that help streamline and simplify securing, provisioning, and managing the intelligent edge. To learn more about how you can best leverage this technology, be sure to register for the upcoming Santa Clara IoT in Action event on October 10. As part of the event, I will be leading a panel discussion focused on how customers and partners are simplifying IoT and solving industry problems. 

If you can’t make it to the Santa Clara event, there will also be one-day events held in cities around the world, including Warsaw, Frankfurt, Toronto, Auckland, Taipei, Shenzhen, and more. These events are a valuable opportunity to get all your questions answered and build connections with potential IoT partners. Through interactive sessions, Microsoft will share how various solutions and accelerators can help simplify IoT so you can get secure solutions out the door faster and more cost effectively.

Prefer a virtual event? Browse the IoT in Action webinar series which features IoT industry experts discussing real-life solution use cases. You can also get started on further advancing your technical IoT skills by watching the IoT Show, joining the IoT Tech Community, and learning at IoT School.
