


Azure Marketplace new offers – Volume 32


We continue to expand the Azure Marketplace ecosystem. From January 16 to January 31, 2019, 70 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

Admin Password Manager for Enterprise

Admin Password Manager for Enterprise: Admin Password Manager for Enterprise simplifies password management while helping customers implement recommended defenses against possible cyberattacks.

Appiyo BPM - Simple lightweight process engine

Appiyo BPM - Simple lightweight process engine: Appiyo's compute engine helps you implement simple business processes and API integration scenarios. It can be used for enterprise linkage to conversational bots, as well as for IoT scenarios and document management.

Attendize

Attendize Open-source ticket selling system: Attendize offers a wide array of ticket and event management features, including mobile-friendly event pages, attendee management, data export, real-time event statistics, and support for multiple currencies.

AVReporter Azure

AVReporter Azure: The AVReporter energy management software contains desktop, web, and mobile interfaces; reports and graphical elements; alerts; ready-to-use dashboards; and more.

Celebrus Enterprise Customer Data Platform

Celebrus Enterprise Customer Data Platform: Celebrus captures a complete picture of customer behavior and experience, creating events and profiles in real time for 1-to-1 personalization and streaming analytics.

DNS Safety Filter

DNS Safety Filter: This solution is a DNS server with extensive filtering capabilities. It allows you to filter access to domain names by categories and block access to specified domains, and it provides access policies for different groups of machines in your network.

EDLIGO

EDLIGO: EDLIGO is a fully integrated solution that delivers real-time insights for data-driven decisions in education. It features easy-to-use dashboards and advanced predictive, causal, and prescriptive analytics.

IOTA Full Node

IOTA Full Node: A full node is a program that fully validates transactions. Setting up your own full node saves you from relying on third parties, giving you more financial control.

Jamcracker CSB Service Provider Version 7.0

Jamcracker CSB Service Provider Version 7.0: Jamcracker CSB, a purpose-built appliance for service providers, is a cloud brokerage solution for Software-as-a-Service and Infrastructure-as-a-Service products. It automates order management, provisioning, and billing.

JFrog Artifactory VM

JFrog Artifactory VM: This virtual machine comes with Java 8, JFrog Artifactory, and Nginx.

Joomla on Windows Server 2016

Joomla on Windows Server 2016: Joomla is a free and open-source content management system for building websites and powerful online apps. With Joomla, you can connect your sites to databases like MySQL, MySQLi, or PostgreSQL to manage content and delivery.

Joomla on Windows Server 2019

Joomla on Windows Server 2019: Joomla is a free and open-source content management system for building websites and powerful online apps. With Joomla, you can connect your sites to databases like MySQL, MySQLi, or PostgreSQL to manage content and delivery.

MediaWiki on Windows Server 2016

MediaWiki on Windows Server 2016: MediaWiki is a powerful, free, and open-source wiki engine written in the PHP programming language. MediaWiki software is fully customizable, with more than 1,800 extensions available for enabling features.

MediaWiki on Windows Server 2019

MediaWiki on Windows Server 2019: MediaWiki is a powerful, free, and open-source wiki engine written in the PHP programming language. MediaWiki software is fully customizable, with more than 1,800 extensions available for enabling features.

NVIDIA Quadro Virtual Workstation - Ubuntu 18.04

NVIDIA Quadro Virtual Workstation - Ubuntu 18.04: Spin up a GPU-accelerated virtual workstation in minutes, without having to manage endpoints or back-end infrastructure. NVIDIA Tesla GPUs in the cloud power high-performance simulation, rendering, and design.

NVIDIA Quadro Virtual Workstation - WinServer 2016

NVIDIA Quadro Virtual Workstation - WinServer 2016: Spin up a GPU-accelerated virtual workstation in minutes, without having to manage endpoints or back-end infrastructure. NVIDIA Tesla GPUs in the cloud power high-performance simulation, rendering, and design.

WordPress on Windows Server 2016

WordPress on Windows Server 2016: WordPress is a free and open-source website management system for blogs, applications, business sites, portfolios, and more. It's written in the PHP programming language.

WordPress on Windows Server 2019

WordPress on Windows Server 2019: WordPress is a free and open-source website management system for blogs, applications, business sites, portfolios, and more. It's written in the PHP programming language.

Web applications

Avere vFXT for Azure ARM Template

Avere vFXT for Azure ARM Template: The Avere vFXT provides scalability, flexibility, and easy access to cloud-based or file-based storage locations for users tasked with managing critical high-performance computing (HPC) workloads.

Cisco CSR 1000V DMVPN Transit VNET

Cisco CSR 1000V DMVPN Transit VNET: A transit VNet is a common strategy to connect geographically dispersed VNets and remote networks. It simplifies network management and minimizes the number of connections required to connect VNets and remote networks.

Citrix SD-WAN Standard Edition 10.2

Citrix SD-WAN Standard Edition 10.2: Citrix SD-WAN Standard Edition for Azure logically bonds network links into a single, secure, logical virtual path. Organizations can leverage broadband, MPLS, 4G/LTE, satellite, and other connections.

CloudMigrator Azure Environment

CloudMigrator: Use CloudMigrator to securely migrate email, contacts, calendars, and files from enterprise sources to Microsoft Office 365. CloudMigrator is highly configurable, allowing you to complete the most complex and demanding migrations with ease.

Customer-Facing Anti-Phishing

Customer-Facing Anti-Phishing: Segasec specializes in helping organizations mitigate the risk of their customers becoming victims of online fraud and phishing scams. Segasec's solution requires zero onboarding and no integration, so companies can start immediately.

Discovery Hub with SQL MI and AAS

Discovery Hub® with SQL MI and AAS: Discovery Hub Application Server for Azure is a high-performance data management platform that accelerates your time to data insights.

FortiWeb Web Application Firewall - HA

FortiWeb Web Application Firewall - HA: Whether to simply meet compliance standards or to protect mission-critical hosted applications, FortiWeb's web application firewalls (WAFs) provide advanced features and AI-based machine learning detection engines.

HPC Azure Cluster Management Service

HPC Azure Cluster Management Service: This self-hosted service can help you manage your HPC clusters on Azure. The service provides cluster diagnostics (including benchmark and MPI diagnostics), monitoring, and management features.

JFrog Artifactory Enterprise ARM Template

JFrog Artifactory Enterprise ARM Template: JFrog Artifactory Enterprise delivers end-to-end automation and management of your binaries and artifacts. It is a scalable, universal binary repository manager that integrates with your DevOps tools and platforms.

MSPControl

MSPControl: The powerful MSPControl platform gives users simple multi-tenant point-and-click control over Windows Server applications, including Microsoft IIS, Microsoft SQL Server, and Microsoft Exchange.

TimeXtender Discovery Hub with SQL Managed Instance

TimeXtender Discovery Hub with SQL Managed Instance: Discovery Hub supports core analytics, the modern data warehouse, artificial intelligence, and the internet of things (IoT). This offering deploys the Discovery Hub application server and Azure SQL Managed Instance.

Waves MultiNode

Waves MultiNode: Waves MultiNode is a ready-to-use blockchain software solution for those who want to launch their own private blockchain network with a specific business logic, or to start development of a decentralized application.

Waves Node

Waves Node: Waves Node is a ready-to-use blockchain software solution for those who want to help maintain the Waves network without any hardware or specialist experience.

Container solutions

Entitystream Custodian

Entitystream Custodian: Custodian is a full-stack master data management solution that enables its users to connect data from different parts of their organization without the need for complex data management projects.

EntityStream Mars

EntityStream Mars: Mars is a microservice that allows you to present it with two records and have it index, standardize, and compare them so you can understand how similar they are in real-life scenarios.

Jsonnet Container Image

Jsonnet Container Image: Jsonnet is a data templating language for application and tool developers. It’s based on JSON.

Kubeapps AppRepository Controller Container Image

Kubeapps AppRepository Controller Container Image: Kubeapps AppRepository Controller is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. This controller monitors resources.

Kubeapps Chart Repo Container Image

Kubeapps Chart Repo Container Image: Kubeapps Chart Repo is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. It scans a chart repository and populates its metadata.

Kubeapps Chartsvc Container Image

Kubeapps Chartsvc Container Image: Kubeapps Chartsvc is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. This service reads metadata about the repositories.

Kubeapps Tiller Proxy Container Image

Kubeapps Tiller Proxy Container Image: Kubeapps Tiller Proxy is one of the main components of Kubeapps, a web-based application deployment and management tool for Kubernetes clusters. This proxy provides a secure way to authenticate users.

Redis Enterprise Software

Redis Enterprise Software: With Redis Enterprise Software, a dataset can grow beyond the largest node in the cluster and be processed by any number of cores.

Scribendi Accelerator

Scribendi Accelerator: The Scribendi Accelerator is an advanced grammatical error correction tool designed to support professional editors during the editing process and increase their productivity.

TensorFlow ResNet Container Image

TensorFlow ResNet Container Image: TensorFlow ResNet is a client utility for use with TensorFlow Serving and ResNet models.

Consulting services

Accelerate Advanced Analytics- 4-week Assessment

Accelerate Advanced Analytics: 4-week Assessment: This engagement by Applied Cloud Systems involves an assessment of your analytics capabilities, a period of incubation in Azure, and a path forward. This is a four-week assessment for small to medium-sized organizations.

Accelerate Advanced Analytics- 8-week Assessment

Accelerate Advanced Analytics: 8-week Assessment: This engagement by Applied Cloud Systems involves an assessment of your analytics capabilities, a period of incubation in Azure, and a path forward. This is an eight-week assessment for small to medium-sized organizations.

Accelerate Azure- 4-week Assessment

Accelerate Azure: 4-week Assessment: Applied Cloud Systems will provide direction and velocity for Azure adoption, innovation, and operations in this four-week assessment intended for small to medium-sized organizations.

Accelerate Azure- 8-week Assessment

Accelerate Azure: 8-week Assessment: Applied Cloud Systems will provide direction and velocity for Azure adoption, innovation, and operations in this eight-week assessment intended for small to medium-sized organizations.

Accelerate DevOps- 4-week Assessment

Accelerate DevOps: 4-week Assessment: This four-week assessment by Applied Cloud Systems will guide your organization through application modernization and delivery leveraging Azure DevOps.

Accelerate DevOps- 8-week Assessment

Accelerate DevOps: 8-week Assessment: This eight-week assessment by Applied Cloud Systems will guide your organization through application modernization and delivery leveraging Azure DevOps.

AgileIdentity- Azure AD- 3-week Implementation

AgileIdentity: Azure AD: 3-week Implementation: Easily manage identities across thousands of apps and platforms with Azure Active Directory and Azure AD Premium consulting by Agile IT.

Application Containerization- 3-wk Assessment

Application Containerization: 3-wk Assessment: For enterprises looking to host their applications on Azure Container Services, DXC Technology uses tools and expertise to determine the technical feasibility, business suitability, and transformation path for each app.

Application Containerization Quickstart- 4-wk POC

Application Containerization Quickstart: 4-wk POC: For enterprises looking to improve the elasticity of their applications, DXC Technology provides this service to containerize applications and deploy them to Azure Container Services.

Application Services Quickstart- 4-wk POC

Application Services Quickstart: 4-wk POC: For enterprises looking to host their applications on Microsoft Azure, DXC Technology provides a comprehensive migration solution.

Azure DevOps4Dynamics- 3-Day Maturity Assessment

Azure DevOps4Dynamics: 3-Day Maturity Assessment: Assess your IT organization’s DevOps maturity and generate an improvement roadmap to ensure you can effectively manage the application lifecycle of your enterprise software investment on Azure.

Azure4DevOps- 3 Day DevOps Maturity Assessment

Azure4DevOps: 3 Day DevOps Maturity Assessment: This assessment by Testhouse will identify your DevOps maturity and provide a roadmap for moving your IT organization toward the highest level.

Azure DevOps Migration- 10 Day Engagement

Azure DevOps Migration: 10 Day Engagement: Readify’s Azure DevOps engagement offers you the opportunity to work with our experts to evaluate current barriers and begin to implement new tools, services, and practices in Azure DevOps.

Azure Disaster Recovery- 1-Day Workshop

Azure Disaster Recovery: 1-Day Workshop: In this session, one of InsITe’s skilled engineers will walk you through Azure Backup and Azure Site Recovery services, demonstrating how you can immediately begin leveraging active datacenter replication.

Azure Integration Services- 1-wk POC

Azure Integration Services: 1-wk POC: This limited implementation by VNB Consulting is designed to evaluate if Azure Integration Services is right for your organization’s hybrid or cloud integration platform initiative.

Azure Migration- 1-day Assessment

Azure Migration: 1-day Assessment: CHISW Development LTD's assessment will cover an environment review, premigration requirements, migration, post-migration requirements and enhancements, and a security check.

Azure Security- 3-Day Workshop

Azure Security: 3-Day Workshop: This workshop by Catapult Systems is designed to help customers understand the basic security requirements and services available in Azure.

Azure Site Recovery-1 Server- 1-day Implementation

Azure Site Recovery-1 Server: 1-day Implementation: Forsyte will take the time to understand your disaster recovery needs, define an Azure Site Recovery strategy, configure a server, configure the target environment in Azure, create a replication, and initiate the server replication.

BizTalk Health Check- Upgrade Azure Assessment 1-Wk

BizTalk Health Check/Upgrade Azure Assessment 1-Wk: The BizTalk Health Check by TwoConnect is the first step toward considering an Azure migration and achieving optimal performance for your BizTalk system.

BizTalk Support and Azure Managed Services 3-Day PoC

BizTalk Support & Azure Managed Services 3-Day PoC: TwoConnect’s award-winning BizTalk support and Azure Managed Services team will focus on supporting, maintaining, and adapting your integration solutions to ensure the seamless continuity of your business operations.

BizTalk to Azure Migration- 2-day Assessment

BizTalk to Azure Migration: 2-day Assessment: This assessment by VNB Consulting will be held at your facility or conducted remotely, and it will involve an evaluation of your BizTalk environment that results in a detailed plan for a BizTalk-to-Azure migration.

Cloud Migration - Transformation- 1-Hour Briefing

Cloud Migration - Transformation: 1-Hour Briefing: Learn how Wintellisys Inc.'s capabilities, approach, and methodologies can help customers accelerate their journey to the cloud. Get an overview of the migration suite and a demonstration of a few of the key components.

Disaster Recovery- 2-Week Implementation

Disaster Recovery: 2-Week Implementation: Catapult Systems will discover, assess, and define the architecture and recovery process for two on-premises workloads to Azure.

EDI on Azure LogicApps- 2-day POC

EDI on Azure LogicApps: 2-day POC: This limited implementation by VNB Consulting is designed to evaluate if Azure Integration Services (Azure Logic Apps and Integration Account) is right for your organization’s EDI-in-the-cloud initiative.

Govern Azure- 8-week Assessment

Govern Azure: 8-week Assessment: Applied Cloud Systems' assessment will include all aspects of the governance process, including architecture, acceptable usage, security, monitoring, and cost management.

TFS-Azure DevOps Migration - 4-week Implementation

TFS-Azure DevOps Migration: 4-week Implementation: Canarys will help you each step of the way on your journey, from acquiring the licenses required to use Azure DevOps to migrating projects from Team Foundation Server to Azure DevOps (formerly VSTS).

TIC Modernization 1-Hour Briefing

TIC Modernization 1-Hour Briefing: Practical Solutions Inc. will discuss how any federal agency can remove barriers to the cloud and modern technology adoption. We will also show how agencies can meet the OMB Trusted Internet Connections (TIC) initiative.

Website - App Migration- 3 Day Assessment

Website / App Migration: 3 Day Assessment: BUI will analyze your IIS/Linux installation and identify which sites can be migrated to Microsoft Azure, highlighting any elements that cannot be migrated or are unsupported on the platform.

Windows Server and SQL 2008 EOS Migration- 6 Wk Imp.

Windows Server & SQL 2008 EOS Migration: 6 Wk Imp.: This implementation by ANS Group is a packaged migration service that combines assessing, planning, and transitioning stand-alone services into Microsoft Azure.

Azure Stack IaaS – part two


This blog post was co-authored by David Armour, Principal Program Manager, Azure Stack.

Start with what you already have

Every organization has a unique journey to the cloud, based on its history, business specifics, culture, and, maybe most importantly, its starting point. While it can be hard to say goodbye to a current virtualization environment and way of doing things, the journey to the cloud provides many options, features, and opportunities to improve existing governance and operations and to implement new ones. It also provides the opportunity to redesign applications and take advantage of cloud architecture. Additionally, Microsoft Azure gives you the option to host your virtual machines (VMs) in the public cloud or in your own facility with Azure Stack.

In most cases, this journey starts with a lift and shift of the existing servers, whether virtual machines or physical servers. Because Azure Stack at its core is an infrastructure-as-a-service (IaaS) platform, the right way to think about this first phase of the journey is as a lift and optimize process. Moving the servers should be the first step toward enabling modern operations across your workloads. That could mean something as simple as selecting the right size for your VMs so that you “pay for what you use,” enabling self-service by doing it yourself, automating deployments, or building on the success of others.

What to think about when migrating

The Azure migration center provides a good model to help you start the assessment, make sure you have the right stakeholders involved, and create the proper framing for your migration.

As you start this assessment, there are several factors you can use to identify the best-suited platform for your workload, whether that is Azure or Azure Stack:

  • Cost
  • Connectivity requirements
  • Potential regulations and data gravity requirements
  • High availability and regional requirements

Flowchart displaying migration from suited platforms to Azure

After you complete the assessment and planning, you will need to select the right tool for the migration.

Our partner ecosystem includes ISVs that have built solutions ranging from simple migrations to “as a service” solutions. There are also Microsoft migration options that require manual steps to implement but offer potentially lower cost.

Partner migration options and third party tools available

Partner options

Azure Stack has ISV solutions for every stage of application migration, from envisioning and discovery to modernization by leveraging PaaS capabilities. Each has its own capabilities and improves the process in its own way.

  • Carbonite – Offers server migration, backup, high availability of Windows Servers, and enterprise protection for Microsoft Windows endpoints.
  • Cloudbase – Offers a migration-as-a-service solution called Coriolis, which integrates with Azure Migrate and uses it for the initial assessment as well as the VM-size mapping.
    • Coriolis will be available as a trial version in the Azure Stack Marketplace, offering free VM migrations to validate the process and make sure it is the right solution.
  • Commvault – Complements migration, management, protection, and activation of data on Microsoft Azure Stack and other hybrid cloud infrastructure solutions. Commvault helps enterprises increase agility, reduce costs, and discover valuable insights.
    • Commvault is available in the Azure Stack Marketplace and offers a 60-day free trial that can be upgraded in place to the full version.

    • This isn’t really a migration tool; instead, it can offer fault tolerance and high availability for your solution.
    • This solution can also help with creating fault tolerance and high availability across multiple stamps. Please see our demo of an application running across two Azure Stack stamps.
    • It is also available in the Azure Stack Marketplace and offers a 30-day free trial.

Microsoft migration options

The Storage Migration Service makes it easier to migrate servers to target VMs in Azure Stack. You can use the graphical tool to inventory data on servers and then transfer that data and configuration to VMs already deployed on Azure Stack. The service works without apps or users having to change anything. Depending on the assessment, some of these workloads might instead go to Azure IaaS or Azure Files.

Use the Storage Migration Service when you have a server, or many servers, that you want to migrate to Azure Stack virtual machines. The Storage Migration Service is designed to help by doing the following:

  • Inventory multiple servers and their data.
  • Rapidly transfer files, file shares, and security configuration from the source servers.
  • Optionally take over the identity of the source servers, also known as cutting over, so that users and apps don't have to change anything to access existing data.
  • Manage one or multiple migrations from the Windows Admin Center user interface.

Typically, your migration journey will use a mixture of tools, so you will need to understand the available options in order to select the right tool for each specific workload.

Learn more

In this blog series

We hope you come back to read future posts in this blog series.

Instantly restore your Azure Virtual Machines using Azure Backup


Today, we are delighted to share the release of the Azure Backup Instant Restore capability for Azure Virtual Machines (VMs). Instant Restore helps Azure Backup customers quickly recover VMs from the snapshots stored along with the disks. In addition, users get complete flexibility in configuring the retention range of snapshots at the backup policy level, depending on the requirements and criticality of the associated virtual machines, giving users more granular control over their resources.

Key benefits

  • Instant recovery point: Snapshots taken as part of the backup job are stored along with the disk and are available for recovery instantly. This eliminates the wait time for snapshots to copy to the vault before a restore can be triggered.
  • In-place restore capability: With Instant Restore, users also get the ability to perform an in-place restore, overwriting the data in the original disk rather than creating a copy of the disk at an alternate location. This is particularly useful in scenarios where you need to roll back a patch: once the snapshot phase is done, you can use the local snapshot to restore if the patch goes bad.
  • Flexibility to choose the retention range for snapshots at the backup policy level: Depending on the operational recovery requirements of your VMs, you can configure the snapshot retention range at the VM backup policy level. The snapshot retention range applies to all VMs associated with the policy and can be between one and five days, with two days being the default.

In addition, users get Azure Backup support for Standard SSD disks and for disks up to 4 TB in size.

How to change the snapshot retention period?

We are enabling this experience starting today and rolling it out region by region. You can check the availability in your region today.

Portal:

Users can change the snapshot retention to any value between one and five days from the default value of two days.
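If you prefer to script this setting rather than change it in the portal, the same value can be updated through the Azure SDK. The following is a minimal, illustrative Python sketch, assuming the azure-mgmt-recoveryservicesbackup package and its instant_rp_retention_range_in_days policy property; all resource names are placeholders.

```python
# Illustrative sketch only: update a VM backup policy's snapshot retention.
# Assumes the azure-mgmt-recoveryservicesbackup package; resource names are
# placeholders, and the portal flow described above is the documented path.
from azure.identity import DefaultAzureCredential
from azure.mgmt.recoveryservicesbackup import RecoveryServicesBackupClient

client = RecoveryServicesBackupClient(DefaultAzureCredential(), "<subscription-id>")

# Fetch the existing VM backup policy from the Recovery Services vault.
policy = client.protection_policies.get(
    vault_name="<vault-name>",
    resource_group_name="<resource-group>",
    policy_name="<policy-name>",
)

# Snapshot retention can be any value between 1 and 5 days (the default is 2).
policy.properties.instant_rp_retention_range_in_days = 5

client.protection_policies.create_or_update(
    vault_name="<vault-name>",
    resource_group_name="<resource-group>",
    policy_name="<policy-name>",
    parameters=policy,
)
```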

Instant Restore

Next steps

Announcing Azure Integration Service Environment for Logic Apps


A new way to integrate with resources in your virtual network

With every service, we strive to provide experiences that significantly improve development in the cloud. We’re always looking for common pain points that everybody building software in the cloud deals with, and once we find those pain points, we build best-in-class software to address the need.

In critical business scenarios, you need confidence that your data is flowing between all the moving parts. The core Logic Apps offering is a great, multi-faceted service for integrating data sources and services, but sometimes it is necessary to have a dedicated service to ensure that your integration processes are as performant as can be. That’s why we developed the Integration Service Environment (ISE), a fully isolated integration environment.

What is an Integration Service Environment?

An Integration Service Environment is a fully isolated and dedicated environment for all enterprise-scale integration needs. When you create a new Integration Service Environment, it is injected into your Azure virtual network, which allows you to deploy Logic Apps as a service on your VNET.

  • Direct, secure access to your virtual network resources. Enables Logic Apps to have secure, direct access to private resources, such as virtual machines, servers, and other services in your virtual network, including Azure services with service endpoints and on-premises resources via an ExpressRoute or site-to-site VPN connection.
  • Consistent, highly reliable performance. Eliminates the noisy-neighbor issue, removing the fear of intermittent slowdowns that can impact business-critical processes, with a dedicated runtime in which only your Logic Apps execute.
  • Isolated, private storage. Sensitive data subject to regulation is kept private and secure, opening new integration opportunities.
  • Predictable pricing. Provides a fixed monthly cost for Logic Apps. Each Integration Service Environment includes the free usage of one Standard integration account and one Enterprise connector. If you run more than 50 million Logic Apps action executions per month, the Integration Service Environment could provide better value.

Integration Service Environments are available in every region that Logic Apps is currently available in, with the exception of the following locations:

  • West Central US
  • Brazil South
  • Canada East

Logic Apps is great for customers who require a highly reliable, private integration service for all their data and services. You can try the public preview by signing up for an Azure account. If you’re an existing customer, you can find out how to get started by visiting our documentation, “Connect to Azure virtual networks from Azure Logic Apps by using an integration service environment.”

Running Cognitive Services on Azure IoT Edge


This blog post is co-authored by Emmanuel Bertrand, Senior Program Manager, Azure IoT.

We recently announced Azure Cognitive Services in containers for Computer Vision, Face, Text Analytics, and Language Understanding. You can read more about Azure Cognitive Services containers in this blog, “Bringing AI to the edge.”

Today, we are happy to announce support for running the Azure Cognitive Services Text Analytics and Language Understanding containers on edge devices with Azure IoT Edge. This means that all your workloads can run locally, where your data is being generated, while keeping the simplicity of the cloud to manage them remotely, securely, and at scale.

Image displaying containers that align with Vision and Language

Whether you don’t have a reliable internet connection, want to save on bandwidth costs, have very low latency requirements, or are dealing with sensitive data that needs to be analyzed on-site, Azure IoT Edge with the Cognitive Services containers gives you consistency with the cloud. This allows you to run your analysis on-site and gives you a single pane of glass to operate all your sites.

These container images are directly available to try as IoT Edge modules on the Azure Marketplace:

  • Key Phrase Extraction extracts key talking points and highlights in text either from English, German, Spanish, or Japanese.
  • Language Detection detects the natural language of text with a total of 120 languages supported.
  • Sentiment Analysis detects the level of positive or negative sentiment for input text using a confidence score across a variety of languages.
  • Language Understanding applies custom machine learning intelligence to a user’s conversational and natural language text to predict overall meaning and pull out relevant and detailed information.

Please note, the Face and Recognize Text containers are still gated behind a preview and thus are not yet available via the marketplace. However, you can deploy them manually by first signing up for the preview to get access.

In this blog, we describe how to provision the Language Detection container on your edge device and how to manage it through Azure IoT Hub.

Set up an IoT Edge device and its IoT Hub

Follow the first steps in this quick-start for setting up your IoT Edge device and your IoT Hub.

It first walks you through creating an IoT Hub and then registering an IoT Edge device to your IoT Hub. Here is a screenshot of a newly created edge device called “LanguageDetection” under the IoT Hub called “CSContainers”. Select the device, copy its primary connection string, and save it for later.

Screenshot of a newly created Edge device

Next, it guides you through setting up the IoT Edge device. If you don’t have a physical edge device, it is recommended to deploy the Ubuntu Server 16.04 LTS and Azure IoT Edge runtime virtual machine (VM) which is available on the Azure Marketplace. It is an Azure Virtual Machine that comes with IoT Edge pre-installed.

The last step is to connect your IoT Edge device to your IoT Hub by giving it the connection string created above. To do that, edit the device configuration file /etc/iotedge/config.yaml and update the connection string, as shown in the excerpt below. After the connection string is updated, restart the edge device with sudo systemctl restart iotedge.
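For reference, the relevant provisioning section of /etc/iotedge/config.yaml looks roughly like the excerpt below; the connection string is a placeholder assembled from the example hub and device names above.

```yaml
# Excerpt of /etc/iotedge/config.yaml (manual provisioning).
# The connection string below is a placeholder for your device's actual value.
provisioning:
  source: "manual"
  device_connection_string: "HostName=CSContainers.azure-devices.net;DeviceId=LanguageDetection;SharedAccessKey=<key>"
```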

Provisioning a Cognitive Service (Language Detection IoT Edge module)

The images are directly available as IoT Edge modules from the IoT Hub marketplace.

Screenshot of Language Detection Container offering in the Azure Marketplace

Here we’re using the Language Detection image as an example; other images work the same way. To download the image, search for it and select Get it now. This will take you to the Azure portal’s “Target Devices for IoT Edge Module” page. Select the subscription with your IoT Hub, select Find Device and choose your IoT Edge device, then click the Select and Create buttons.

Screenshot of target devices for IoT Edge modules

Screenshot for selecting device to deploy on and creating

Configuring your Cognitive Service

Now you’re almost ready to deploy the Cognitive Service to your IoT Edge device. But in order to run the container, you need to get a valid API key and billing endpoint, then pass them as environment variables in the module details.

Screenshot of setting deployment modules

Go to the Azure portal and open the Cognitive Services blade. If you don’t have a Cognitive Service that matches the container, in this case a Text Analytics service, select Add and create one. Once you have a Cognitive Service, get the endpoint and API key; you’ll need these to fire up the container:

Screenshot showing the retrieval of the endpoint and API key

The endpoint is used strictly for billing; no customer data ever flows that way. Copy your billing endpoint value to the “billing” environment variable and copy your API key value to the “apikey” environment variable.

Deploy the container

All required info is now filled in and you only need to complete the IoT Edge deployment. Select Next and then Submit. Verify that the deployment is happening properly by refreshing the IoT Edge device details section.


Screenshot of IoT Edge device details section

Trying it out

To try things out, we’ll make an HTTP call to the IoT Edge device that has the Cognitive Service container running.

For that, we’ll first need to make sure that port 5000 on the edge device is open. If you’re using the pre-built Ubuntu with IoT Edge Azure VM as an edge device, go to the VM details, then Settings > Networking, and add an inbound security rule to open port 5000. Also copy the public IP address of your device.

Now you should be able to query the Cognitive Service running on your IoT Edge device from any machine with a browser. Open your favorite browser and go to http://your-iot-edge-device-ip-address:5000.

Now, select Service API Description or jump directly to http://your-iot-edge-device-ip-address:5000/swagger. This will give you a detailed description of the API.

Screenshot showing a detailed description of the Language Detection Cognitive Service API

Select Try it out and then Execute; you can change the input value as you like.

Screenshot of executing the API

The result will show up further down on the page and should look something like the following image:

Screenshot of results after executing the API
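If you’d rather call the service from code than from the Swagger UI, the sketch below shows the same request in Python. It assumes the container exposes the standard Text Analytics v2.0 language-detection route on port 5000; the device IP address is a placeholder.

```python
# Minimal sketch: call the Language Detection container running on the edge device.
# Assumes the standard Text Analytics v2.0 route; the IP address is a placeholder.
import requests

url = "http://<your-iot-edge-device-ip-address>:5000/text/analytics/v2.0/languages"
payload = {
    "documents": [
        {"id": "1", "text": "Hello world"},
        {"id": "2", "text": "Bonjour tout le monde"},
    ]
}

response = requests.post(url, json=payload)
response.raise_for_status()

# Each document comes back with its detected language(s) and a confidence score.
for doc in response.json()["documents"]:
    print(doc["id"], doc["detectedLanguages"])
```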

Next steps

You are now up and running! You are running the Cognitive Services on your own IoT Edge device, remotely managed via your central IoT Hub. You can use this setup to manage millions of devices in a secure way.

You can play around with the various Cognitive Services already available in the Azure Marketplace and try out various scenarios. Have fun!

How to port desktop applications to .NET Core 3.0


In this post, I will describe how to port a desktop application from .NET Framework to .NET Core. I picked a WinForms application as an example. The steps for a WPF application are similar, and I’ll describe what needs to be done differently for WPF as we go.

The post How to port desktop applications to .NET Core 3.0 appeared first on .NET Blog.

Dependency Autocompletion, Performance Improvements and More for Java on Visual Studio Code


Welcome to the February update of Java on Visual Studio Code! We’d like to share a few new improvements to further enhance your productivity, including:

  • Dependency auto-completion and more Maven updates
  • Performance improvements
  • Standalone file support
  • Multiple source folders support
  • Easy launch for multi-main-class projects
  • Hide temporary files
  • Bulk generate getters and setters
  • Test configuration and report update
  • Including IntelliCode in the Java Extension Pack

Try these new features by installing Java Extension Pack with Visual Studio Code.

The post Dependency Autocompletion, Performance Improvements and More for Java on Visual Studio Code appeared first on The Visual Studio Blog.



Latest enhancements now available for Cognitive Services’ Computer Vision


This blog was co-authored by Lei Zhang, Principal Research Manager, Computer Vision.

You can now extract more insights and unlock new workflows from your images with the latest enhancements to Cognitive Services’ Computer Vision service.

1. Enrich insights with expanded tagging vocabulary

Computer Vision has more than doubled the types of objects, situations, and actions it can recognize per image.

Before

Black and white picture of New York City before Computer Vision

Now

Black and white picture of New York City now, after Computer vision

2. Automate cropping with new object detection feature

Easily automate cropping and conduct basic counting of what you need from an image with the new object detection feature. Detect thousands of real life or man-made objects in images. Each object is now highlighted by a bounding box denoting its location in the image.

Picture of subway train and detecting real life or man-made objects

3. Monitor brand presence with new brand detection feature

You can now track logo placement of thousands of global brands from the consumer electronics, retail, manufacturing, and entertainment industries.

Picture of Microsoft logo and the brand detection feature
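To get a feel for what these enhancements return, here is a minimal, illustrative Python sketch against the Computer Vision v2.0 Analyze Image REST API, requesting tags, objects, and brands in one call. The endpoint region, subscription key, and image URL are placeholders.

```python
# Illustrative sketch: request tags, objects, and brands for an image URL.
# The endpoint region, key, and image URL are placeholders.
import requests

endpoint = "https://westus.api.cognitive.microsoft.com"
analyze_url = endpoint + "/vision/v2.0/analyze"

headers = {"Ocp-Apim-Subscription-Key": "<your-computer-vision-key>"}
params = {"visualFeatures": "Tags,Objects,Brands"}
body = {"url": "https://example.com/subway-train.jpg"}

response = requests.post(analyze_url, headers=headers, params=params, json=body)
response.raise_for_status()
analysis = response.json()

# Objects and brands each carry a bounding box locating them in the image.
for obj in analysis.get("objects", []):
    print(obj["object"], obj["rectangle"])
for brand in analysis.get("brands", []):
    print(brand["name"], brand["rectangle"])
```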

With these enhancements, you can:

  • Do at-scale image and video-frame indexing, making your media content searchable. If you’re in media, entertainment, advertising, or stock photography, rich image and video metadata can unlock productivity for your business.
  • Derive insights from social media and advertising campaigns by understanding the content of images and videos and detecting logos of interest at scale. Businesses like digital agencies have found this capability useful for tracking the effectiveness of advertising campaigns. For example, if your business launches an influencer campaign, you can apply Custom Vision to automatically generate brand inclusion metrics pulling from influencer-generated images and videos.

In some cases, you may need to further customize the image recognition capabilities beyond what the enhanced Computer Vision service now provides by adding specific tagging vocabulary or object types that are relevant to your use case. Custom Vision service allows you to easily customize and deploy your model without requiring machine-learning expertise.

See it in action through the Computer Vision demo. If you’re ready to start building to unlock these insights, visit our documentation pages for image tagging, object detection, and brand detection.

Announcing the general availability of Azure Lab Services


Today, we are very excited to announce the general availability of Azure Lab Services - your computer labs in the cloud.

With Azure Lab Services, you can easily set up and provide on-demand access to preconfigured virtual machines (VMs) to teach a class, train professionals, run hackathons or hands-on labs, and more. Simply input what you need in a lab and let the service roll it out to your audience. Your users go to a single place to access all their VMs across multiple labs, and connect from there to learn, explore, and innovate.

Since our preview announcement, we have had many customers use the service to conduct classes, training sessions, boot camps, hands on labs, and more! For classroom or professional training, you can provide students with a lab of virtual machines configured with exactly what you need for class and give each student a specified number of hours to use the VMs for homework or personal projects. You can run a hackathon or a hands-on lab at conferences or events and scale up to hundreds of virtual machines for your attendees. You can also create an invite-only private lab of virtual machines installed with your prerelease software to give preview customers access to early trials or set up interactive sales demos.

Top three reasons customers use Azure Lab Services

Automatic management of Azure infrastructure and scale

Azure Lab Services is a managed service, which means that provisioning and management of a lab’s underlying infrastructure is handled automatically by the service. You can just focus on preparing the right lab experience for your users. Let the service handle the rest and roll out your lab’s virtual machines to your audience. Scale your lab to hundreds of virtual machines with a single click.

Publishing a template in Azure Lab Services

Simple experience for your lab users

Users who are invited to your lab get immediate access to the resources you give them inside your labs. They just need to sign in to see the full list of virtual machines they have access to across multiple labs. They can click on a single button to connect to the virtual machines and start working. Users don’t need Azure subscriptions to use the service.

Screen shot displaying a sample virtual machine in Azure Lab Services

Cost optimization and tracking 

Keep your budget in check by controlling exactly how many hours your lab users can use the virtual machines. Set up schedules in the lab to allow users to use the virtual machines only during designated time slots, or set up recurring auto-shutdown and start times. Keep track of individual users’ usage and set limits.

An example screenshot of how to set up schedules in Azure Lab Services

Get started now

Try Azure Lab Services today! Get started by creating a lab account for your organization or team. All labs are managed under a lab account. You can give permissions to people in your organization to create labs in your lab account.

To learn more, visit the Azure Lab Services documentation. Ask any questions you have on Stack Overflow. Last of all, don’t forget to subscribe to our Service Updates and view other Azure Lab Services posts on the Azure blog to get the latest news.

General availability pricing

Azure Lab Services GA pricing goes into effect on May 1, 2019. Until then, you will continue to be billed based on the preview pricing. Please see the Azure Lab Services pricing page for complete details.

What’s next

We continue to listen to our customers to prioritize and ship new features and updates. Several key features will be enabled in the coming months:

  • Ability to reuse and share custom virtual machine images across labs
  • Feature to enable connections between a lab and on-premises resources
  • Ability to create GPU virtual machines inside the labs

We always welcome any feedback and suggestions. You can make suggestions or vote on priorities on our UserVoice feedback forum.


An update to C# versions and C# tooling


Starting with Visual Studio 2019 Preview 4, we’ll be adjusting how C# versions are treated in .NET tooling.

Summary of changes

Firstly, we’re adding two new Language Version (LangVersion) values: LatestMajor and Preview. Here’s how they stack up with the currently supported list of values:

LangVersion      Meaning
LatestMajor      Latest supported major C# language version
Preview          Latest available preview C# language version
Latest           Latest supported C# language version (including minor version)
ISO-1            C# 1.0/1.2
ISO-2            C# 2.0
3                C# 3.0
4                C# 4.0
5                C# 5.0
6                C# 6.0
7                C# 7.0
7.1              C# 7.1
7.2              C# 7.2
7.3              C# 7.3
8.0              C# 8.0
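For reference, LangVersion is set in the project file. A minimal csproj excerpt, using one of the values from the table above:

```xml
<!-- Opt a project into a specific C# language version. -->
<PropertyGroup>
  <LangVersion>Preview</LangVersion>
</PropertyGroup>
```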

 

The post An update to C# versions and C# tooling appeared first on .NET Blog.

Open Existing CMake Caches in Visual Studio


Visual Studio typically manages all the details of CMake for you, under the hood, when you open a project. However, some development workflows require more fine-grained control over how CMake is invoked. The latest Visual Studio 2019 Preview lets you have complete control over CMake if your project needs more flexibility. You can now give your custom or preferred tools complete control of your project’s CMake cache and build tree instead of letting Visual Studio manage it for you.

In Visual Studio 2019 Preview 3, you can open CMake caches by opening a CMakeCache.txt file. This feature may sound familiar if you ever used the previous functionality that would attempt to clone an existing CMake cache. Opening an existing cache is much more powerful; instead of cloning the cache, Visual Studio will operate with the existing cache in place.

When you open an existing cache, Visual Studio will not attempt to manage your cache and build tree for you. Instead, your tools have complete control. This feature supports caches created by any recent version of CMake (3.8 or higher) instead of just the one that ships with Visual Studio.

Why Open an Existing CMake Cache?

When you open a CMake project, Visual Studio manages CMake for you behind the scenes. It will automatically configure the project’s cache and build tree and regenerate the cache when it is out of date due to changes to the project or CMakeSettings. This lets developers get to their code as quickly as possible without needing to worry about the details of CMake itself. Sometimes, however, giving users more control over what CMake is doing makes sense. That is where Open Existing Cache comes in.

Open Existing Cache gives you complete control over how CMake configures your project. This is essential if you use custom build frameworks, meta-build systems, or external tools to drive CMake in your development workflow. For example, if you use a script to set up your project’s environment and build tree, you can now just run that script and open the cache that it generated in Visual Studio. When you open an existing cache, Visual Studio defers to your tooling to manage the cache instead of driving CMake directly.

There are other reasons why you might want to open existing caches as well. For complex projects, the cache might be slow to generate. Now, instead of opening the project and waiting for the cache to configure once again, you can point Visual Studio towards a cache you have already configured. This also can simplify your workflow if you are working with multiple editors or prefer external tools such as CMakeGui to Visual Studio’s CMakeSettings.json to configure your CMake projects.

How Open Existing Cache Works

To get started with an existing cache, first you will need to generate one outside of the IDE. How you do this will depend completely on your development workflow. For example, you might use custom build scripts specific to your project – or a meta-build system that invokes CMake – or maybe external tools such as CMakeGui. If you are interested in trying this out with a sample project, creating a cache with CMakeGui is a good way to get started.

Once you have generated the cache, you can open it directly with “File > Open > CMake” by navigating to the CMakeCache.txt file. You will notice that there is no longer an “Import Existing Cache” wizard because the cache is opened directly.

Alternatively, if you have already opened the project in Visual Studio, you can add an existing cache to it the same way you add a new configuration (click “Add Configuration” on the CMakeSettings.json file in the Solution Explorer or right click anywhere in the file’s editor pane):

Once you have done this, the project should behave exactly as if you had opened it in Visual Studio and let it manage the cache for you. You should have full IntelliSense, build, and debug support. One thing you may notice, however, is that cache generation is disabled:

Generate isn’t available because Visual Studio doesn’t know how to regenerate the cache yet. If you want Visual Studio to do this for you instead of managing it yourself, it is possible to configure Visual Studio to invoke an external tool to regenerate the cache automatically by creating a task.

Configuring External Caches

External caches can be managed in CMakeSettings.json just like any other configuration. When you first open a cache, a CMakeSettings.json file will be generated that looks something like this:
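A minimal sketch of such a configuration, assuming the cache path below is a placeholder for your cache’s actual location:

```json
{
  "configurations": [
    {
      "name": "Existing Cache",
      "cacheRoot": "C:\\path\\to\\your\\build"
    }
  ]
}
```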

Cache configurations are simple since they mostly defer to other tools to set up the build. The simplest configuration just points to the CMake cache directory with the “cacheRoot” parameter.

You can also mix and match external caches with ones managed by Visual Studio by creating additional configurations.

Remote Caches

You may have noticed that there is also a template to create a configuration for a remote external cache. This allows you to connect to an existing cache on a remote Linux machine. This can be a little trickier to set up than working with one on Windows (we’re still refining this), but it is possible as long as you have the source code on both the local and remote machines.

To open an external cache on a remote machine, first open the source directory in Visual Studio on the local Windows machine. Next, create a remote external cache configuration from the template and edit it with the path of the cache on the remote machine. By default, Visual Studio will not automatically synchronize the source code from the local machine to the remote one when working with external caches, but you can change this by setting “remoteCopySources” to “true”.

Send Us Feedback

This feature is under active development so please tell us what works well for your projects and what doesn’t. The best way to get in touch with us is to create a feedback or suggestion ticket on Developer Community. Once a ticket is created you can track our progress in fixing the issue. For other feedback or questions feel free to leave a comment or email us at cmake@microsoft.com.

The post Open Existing CMake Caches in Visual Studio appeared first on C++ Team Blog.

CUDA 10.1 available now, with support for latest Microsoft Visual Studio 2019 versions


We are pleased to echo NVIDIA’s announcement of CUDA 10.1 today, and we are particularly excited about CUDA 10.1’s continued compatibility with Visual Studio. CUDA 10.1 will work with the RC, RTW, and future updates of Visual Studio 2019. To stay committed to our promise of a pain-free upgrade to any version of Visual Studio 2017, a promise that also carries forward to Visual Studio 2019, we partnered closely with NVIDIA over the past few months to make sure CUDA users can easily migrate between Visual Studio versions. Congratulations to NVIDIA on this milestone, and thank you for a great collaboration!

A Bit of Background

In various updates of Visual Studio 2017 (e.g., 15.5) and even earlier major Visual Studio versions, we discovered that some of the library headers became incompatible with CUDA’s NVCC compiler in 9.x versions. The crux of the problem is two C++ compilers adding modern C++ standard features at different paces while having to work with a common set of C++ headers (e.g., STL headers). We heard from many of you that this issue was forcing you to stay behind on older versions of Visual Studio. Thank you for that feedback. Together with NVIDIA, we have a solution in place that enables all Visual Studio 2017 updates and Visual Studio 2019 versions to work with CUDA 10.0+ tools. This is also reinforced by both sides adding tests and validation processes as part of release quality gates. For example, we continue to run NVIDIA/Cutlass, a CUDA C++ project, as part of our MSVC Real-World Testing repository as a requirement for shipping. We also have targeted unit tests for PR build validations to guard against potential incompatibility issues.

In closing

We’d love for you to download Visual Studio 2019 RC and try out all the new C++ features and improvements. As always, we welcome your feedback. We can be reached via the comments below or via email (visualcpp@microsoft.com). If you encounter other problems with MSVC in Visual Studio 2019 please let us know through Help > Report A Problem in the product, or via Developer Community. Let us know your suggestions through UserVoice. You can also find us on Twitter (@VisualC) and Facebook (msftvisualcpp).

The post CUDA 10.1 available now, with support for latest Microsoft Visual Studio 2019 versions appeared first on C++ Team Blog.



Introducing Microsoft Azure Sentinel, intelligent security analytics for your entire enterprise


Security can be a never-ending saga—a chronicle of increasingly sophisticated attacks, volumes of alerts, and long resolution timeframes where today’s Security Information and Event Management (SIEM) products can’t keep pace.

SecOps teams are inundated with a very high volume of alerts and spend far too much time on tasks like infrastructure setup and maintenance. As a result, many legitimate threats go unnoticed. An expected shortfall of 3.5 million security professionals by 2021 will further increase the challenges for security operations teams. You need a solution that empowers your existing SecOps team to see threats more clearly and eliminate distractions.

That’s why we reimagined the SIEM tool as a new cloud-native solution called Microsoft Azure Sentinel. Azure Sentinel provides intelligent security analytics at cloud scale for your entire enterprise. Azure Sentinel makes it easy to collect security data across your entire hybrid organization, from devices to users to apps to servers, on any cloud. It uses the power of artificial intelligence to ensure you are identifying real threats quickly, and it unleashes you from the burden of traditional SIEMs by eliminating the need to spend time setting up, maintaining, and scaling infrastructure. Since it is built on Azure, it offers nearly limitless cloud scale and speed to address your security needs. Traditional SIEMs have also proven expensive to own and operate, often requiring you to commit upfront and incur high costs for infrastructure maintenance and data ingestion. With Azure Sentinel, there are no upfront costs; you pay for what you use.

Many enterprises are using Office 365 and are increasingly adopting the advanced security and compliance offerings included in Microsoft 365. There are many cases when you want to combine security data from users and end point applications with information from your infrastructure environment and third-party data to understand a complete attack.

It would be ideal if you could do this all within the compliance boundaries of a single cloud provider. Today we are announcing that you can bring your Office 365 activity data to Azure Sentinel for free. It takes just a few clicks and you retain the data within the Microsoft cloud.

“With Microsoft Azure Sentinel, we can better address the main SIEM landscape challenges for our clients, along with simplifying data residency and GDPR concerns.”

Andrew Winkelmann, Global Security Consulting Practice Lead, Accenture

Azure Sentinel overview dashboard

Let’s look at how Azure Sentinel will help you deliver cloud-native security operations:

Collect data across your enterprise easily – With Azure Sentinel, you can aggregate all security data with built-in connectors, native integration of Microsoft signals, and support for industry-standard log formats like Common Event Format and syslog. In just a few clicks, you can import your Microsoft Office 365 data for free and combine it with other security data for analysis. Azure Sentinel uses Azure Monitor, which is built on a proven and scalable log analytics database that ingests more than 10 petabytes every day and provides a very fast query engine that can sort through millions of records in seconds.
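Because Azure Sentinel builds on Azure Monitor, anything you ingest can also be queried programmatically through the Log Analytics query API. Below is a minimal, illustrative Python sketch; the workspace ID, bearer token, and the SecurityEvent table are placeholders that depend on which connectors you have enabled.

```python
# Illustrative sketch: run a KQL query against the Log Analytics workspace
# backing Azure Sentinel. Workspace ID, token, and table name are placeholders.
import requests

workspace_id = "<your-log-analytics-workspace-id>"
token = "<bearer-token-for-api.loganalytics.io>"

url = f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"
query = {"query": "SecurityEvent | where TimeGenerated > ago(1h) | take 10"}

response = requests.post(url, headers={"Authorization": f"Bearer {token}"}, json=query)
response.raise_for_status()

# Results come back as tables of columns and rows.
for table in response.json()["tables"]:
    print([column["name"] for column in table["columns"]])
    for row in table["rows"]:
        print(row)
```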

We continue to collaborate with many partners in the Microsoft Intelligent Security Association. Azure Sentinel connects to popular solutions including Palo Alto Networks, F5, Symantec, Fortinet, and Check Point, with many more to come. Azure Sentinel also integrates with the Microsoft Graph Security API, enabling you to import your own threat intelligence feeds and customize threat detection and alert rules. Custom dashboards give you a view optimized for your specific use case.

Adam Geller, Senior Vice President, SaaS, virtualization, and cloud-delivered security of Palo Alto Networks said, “We’re pleased with our ongoing collaboration with Microsoft and the work we’re doing to deliver greater security orchestration for our joint customers. This latest integration allows customers to forward their physical and virtualized next generation firewall logs to Azure Sentinel and use custom dashboards and artificial intelligence to rapidly uncover potential security incidents. Palo Alto Networks customers can also extend AutoFocus and other third-party threat intelligence to Azure Sentinel via our new integration between MineMeld and the Microsoft Graph Security API.”

Analyze and detect threats quickly with AI on your side – Security analysts face a huge triage burden as they sift through a sea of alerts and correlate alerts from different products manually or with a traditional correlation engine. That’s why Azure Sentinel uses state-of-the-art, scalable machine learning algorithms to correlate millions of low-fidelity anomalies into a few high-fidelity security incidents for the analyst. ML technologies help you quickly get value from the large amounts of security data you are ingesting and connect the dots for you. For example, you can quickly see a compromised account that was used to deploy ransomware in a cloud application. This drastically reduces noise; in fact, we have seen an overall reduction of up to 90 percent in alert fatigue during evaluations. Early adopters are already seeing the benefits of threat detection with AI. Reed M. Wiedower, CTO of New Signature, said, “We see a huge value with Azure Sentinel because of its ability to generate insights across a vast array of different pieces of infrastructure.”

These built-in machine learning models are based on what the Microsoft security team has learned over many years of defending our customers’ cloud assets. You do not need to be a data scientist to benefit from them; you just turn them on. Of course, if you are a data scientist and want to customize and enrich the detections, you can bring your own models to Azure Sentinel using the built-in Azure Machine Learning service. Additionally, Azure Sentinel can connect to user activity and behavior data from Microsoft 365 security products, which can be combined with other sources to provide visibility into an entire attack sequence.

Investigate and hunt for suspicious activities – Graphical and AI-based investigation will reduce the time it takes to understand the full scope of an attack and its impact. You can visualize the attack and take quick actions in the same dashboard.  

Graphical and AI-based investigation workflow

Proactively hunting for suspicious activities is another critical task for security analysts. The process by which SecOps teams collect and analyze data is often repeatable and can be automated. Today, Azure Sentinel provides two capabilities that enable you to automate your analysis: hunting queries and Azure Notebooks, which are based on Jupyter notebooks. We have developed a set of queries and Azure Notebooks modeled on the proactive hunting that Microsoft’s incident response and threat analyst teams perform. As the threat landscape evolves, so will these queries and notebooks, and we will provide new ones via the Azure Sentinel GitHub community.

Automate common tasks and threat response – While AI sharpens your focus on finding problems, once you have solved a problem you don’t want to keep rediscovering it; you want to automate the response. Azure Sentinel provides built-in automation and orchestration with predefined or custom playbooks to handle repetitive tasks and respond to threats quickly. Azure Sentinel will augment existing enterprise defense and investigation tools, including best-of-breed security products, homegrown tools, and other systems such as HR management applications and workflow management systems like ServiceNow.

Built-in automation with pre-defined or custom playbooks in Azure Sentinel

Microsoft’s unparalleled threat intelligence, informed by the analysis of more than 6.5 trillion signals daily and by decades of security expertise at cloud scale, will help you modernize your security operations.

“Azure Sentinel provides a proactive and responsive cloud-native SIEM that will help customers simplify their security operations and scale as they grow.”

Richard Diver, Cloud Security Architect, Insight Enterprises

Security doesn’t have to be an endless saga. Instead, put the cloud and large-scale intelligence to work. Make your threat protection smarter and faster with artificial intelligence. Import Microsoft Office 365 data for security analytics for free. Get started with Microsoft Azure Sentinel.

Microsoft Azure Sentinel is available in preview today in the Azure portal.

Cognitive Services Speech SDK 1.3 – February update


Developers can now access the latest Cognitive Services Speech SDK, which supports:

  • Selection of the input microphone through the AudioConfig class
  • Expanded support for Debian 9
  • Unity in C# (beta)
  • Additional sample code

Read the updated Speech Services documentation to get started today.


What’s new

The Speech SDK now supports selecting the input microphone through the AudioConfig class, meaning you can stream audio data to the Speech Service from a non-default microphone. For more details, see the documentation and the how-to guide on selecting an audio input device with the Speech SDK. This capability is not yet available from JavaScript.

The Speech SDK now also supports Unity in a beta version. Since this is new functionality, please provide feedback through the issue section of the GitHub sample repository. This release supports Unity on Windows x86 and x64 (desktop or Universal Windows Platform applications) and Android (ARM32/64, x86). More information is available in our Unity quickstart.

Samples

The following new content is available in our sample repository.

  • Samples for AudioConfig.FromMicrophoneInput.
  • Python samples for intent recognition and translation.
  • Samples for using the Connection object in iOS.
  • Java samples for translation with audio output.
  • New sample for use of the Batch Transcription REST API.

Improvements and changes

A number of improvements and changes have been made since our last release, including:

  • Python
    • Improved parameter verification and error messages in SpeechConfig
    • Added support for the Connection object
    • Support for 32-bit Python (x86) on Windows
    • The Speech SDK for Python is out of beta
  • iOS
    • The SDK is now built against iOS SDK version 12.1 and supports iOS versions 9.2 and later
    • Improved reference documentation and fixed several property names
  • JavaScript
    • Added support for the Connection object (see the sketch after this list)
    • Added type definition files for bundled JavaScript
    • Initial support and implementation for phrase hints
    • Returned properties collection with service JSON for recognition
  • Windows DLLs now contain a version resource.
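
To make the new Connection support more concrete, here is a minimal JavaScript sketch of subscribing to connection notifications. It assumes the microsoft-cognitiveservices-speech-sdk npm package in a browser context and uses placeholder subscription values; treat it as an illustration rather than official sample code.

```javascript
// Minimal sketch (assumed package: microsoft-cognitiveservices-speech-sdk).
// The subscription key and region below are placeholders.
import * as sdk from "microsoft-cognitiveservices-speech-sdk";

const speechConfig = sdk.SpeechConfig.fromSubscription("<subscription-key>", "<region>");
const audioConfig = sdk.AudioConfig.fromDefaultMicrophoneInput(); // browser default mic
const recognizer = new sdk.SpeechRecognizer(speechConfig, audioConfig);

// The Connection object is obtained from an existing recognizer and raises
// events when the underlying connection to the Speech Service changes state.
const connection = sdk.Connection.fromRecognizer(recognizer);
connection.connected = () => console.log("Connected to the Speech Service.");
connection.disconnected = () => console.log("Disconnected from the Speech Service.");

recognizer.recognizeOnceAsync(
  result => { console.log(`Recognized: ${result.text}`); recognizer.close(); },
  error => { console.error(error); recognizer.close(); }
);
```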

Bug fixes

  • Empty proxy usernames and passwords were not handled correctly previously. With this release, if you set the proxy username and password to an empty string, they will not be submitted when connecting to the proxy.
  • Session IDs created by the SDK were not always truly random for some languages and environments. Random generator initialization has been added to fix this.
  • Improved handling of authorization tokens. If you want to use an authorization token, specify it in the SpeechConfig and leave the subscription key empty, then create the recognizer as usual (see the sketch after this list).
  • In some cases, the Connection object wasn't released correctly. This has been fixed.
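
For the authorization-token flow described above, a minimal JavaScript sketch might look like the following. The token source, its value, and the region are placeholders, and fetchTokenFromYourService is a hypothetical helper standing in for your own token-issuing endpoint.

```javascript
// Minimal sketch of token-based authentication (no subscription key).
import * as sdk from "microsoft-cognitiveservices-speech-sdk";

async function recognizeWithToken() {
  // Hypothetical helper: call your own service that issues short-lived
  // authorization tokens; refresh them before they expire.
  const authToken = await fetchTokenFromYourService();

  // Specify the token in the SpeechConfig and leave the subscription key empty.
  const speechConfig = sdk.SpeechConfig.fromAuthorizationToken(authToken, "<region>");

  // Then create the recognizer as usual.
  const recognizer = new sdk.SpeechRecognizer(speechConfig);
  recognizer.recognizeOnceAsync(
    result => { console.log(`Recognized: ${result.text}`); recognizer.close(); },
    error => { console.error(error); recognizer.close(); }
  );
}
```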

For more details and examples of how your business can benefit from the new functionality for Speech Services, check out the release notes and samples in the GitHub sample repository for Speech Services.

Creating IoT applications with Azure Database for PostgreSQL


There are numerous IoT use cases across industries, with common categories like predictive maintenance, connected vehicles, anomaly detection, asset monitoring, and many others. For example, in water treatment facilities in the state of California, IoT devices can be installed in water pumps to measure horsepower, flow rate, and electric usage. The events emitted from these devices are sent to an IoT hub every 30 seconds for aggregation and processing. A water treatment company could build a dashboard to monitor the pumps and set up notifications that alert the maintenance team when event data crosses a certain threshold, for example to repair a pump whose flow rate is dangerously low. This is a typical proactive maintenance IoT use case.

Azure IoT is a complete stack of IoT solutions: a collection of Microsoft-managed cloud services that connect, monitor, and control billions of IoT assets. The common set of components in the Azure IoT core subsystem includes:

  • IoT devices that stream the events
  • A cloud gateway, most often an IoT hub, that enables communication to and from devices and edge devices
  • Stream processing that ingests events from the devices and triggers actions based on the output of the analysis. A common workflow takes input telemetry encoded in Avro and returns output telemetry encoded in JSON for storage
  • Storage, usually a database used to store IoT event data for reporting and visualization purposes

Let’s take a look at how to implement an end-to-end Azure IoT solution that uses Azure Database for PostgreSQL to store IoT event data in the JSONB format. Using PostgreSQL as the NoSQL data store has its own advantages: strong native JSON processing, indexing capabilities, and the plv8 extension, which integrates the JavaScript V8 engine with SQL. Besides the managed service capabilities and lower cost, one of the key advantages of Azure Database for PostgreSQL is its native integration with the Azure ecosystem, which enables modern applications and improves developer productivity.

In this implementation, we use Azure Database for PostgreSQL with the plv8 extension as the persistent layer for the IoT telemetry stream, serving storage, analytics, and reporting. The high-speed streaming data is first loaded into the PostgreSQL database (master server). The master server handles high-speed data ingestion, while read replicas serve reporting and downstream data processing to drive data-driven actions. Azure IoT Hub acts as the event processing hub, and an Azure Function triggers the processing steps, extracting what’s needed from the emitted events and storing it in Azure Database for PostgreSQL.


In this post, we’ll walk through the high-level implementation to get you started. Our GitHub repository has sample applications and a detailed QuickStart tutorial with step-by-step instructions for implementing the solution below. The QuickStart uses Node.js applications to send telemetry to the IoT Hub.

Step 1: Create an Azure IoT Hub and register a device with the Hub

In this implementation, IoT sensor simulators constantly emit temperature and humidity data to the cloud. The first step is to create an Azure IoT Hub in the Azure portal using these instructions. Next, register the device name with the IoT Hub so that it can receive and process telemetry from registered devices.

In GitHub, you will find sample scripts to register the device using the CLI and export the IoT Hub service connection string.
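
As a rough sketch of what the simulated device looks like (the quickstart in the GitHub repository is the authoritative version), the following Node.js snippet uses the azure-iot-device and azure-iot-device-mqtt packages; the connection string, device name, and telemetry shape are placeholders.

```javascript
// Minimal sketch of a simulated device, following the Node.js quickstart
// pattern. The connection string comes from the device registration above.
const { Mqtt } = require('azure-iot-device-mqtt');
const { Client, Message } = require('azure-iot-device');

const connectionString = '<device-connection-string>'; // placeholder
const client = Client.fromConnectionString(connectionString, Mqtt);

// Emit simulated temperature and humidity telemetry every 30 seconds.
setInterval(() => {
  const telemetry = {
    deviceId: 'water-pump-001',            // hypothetical device name
    temperature: 20 + Math.random() * 15,  // degrees Celsius
    humidity: 60 + Math.random() * 20      // percent
  };
  const message = new Message(JSON.stringify(telemetry));
  client.sendEvent(message, (err) => {
    if (err) console.error(`Send error: ${err.toString()}`);
    else console.log(`Sent: ${message.getData()}`);
  });
}, 30000);
```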

Step 2: Create an Azure Database for PostgreSQL server and a demo database to store the telemetry data stream

Provision an Azure Database for PostgreSQL server of an appropriate size. You can use the Azure portal or the Azure CLI to provision it.

In the database, you will enable the plv8 extension and create a sample plv8 function that can be used in queries to extract a temperature column from the JSON documents. You can use a JSON table to store the IoT telemetry data. The script to create the database and table and to enable the plv8 extension is located in GitHub; a minimal sketch follows below.
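
The script in GitHub is the authoritative version, but a sketch of the database setup might look like the following. The table and function names are illustrative, and it assumes plv8 2.x, which passes jsonb arguments to JavaScript as objects (the plv8 function body is JavaScript).

```sql
-- Illustrative sketch only; the authoritative script is in the GitHub repo.
CREATE EXTENSION IF NOT EXISTS plv8;

-- A simple table that stores each telemetry event as a JSONB document.
CREATE TABLE iot_telemetry (
    id       bigserial PRIMARY KEY,
    payload  jsonb NOT NULL,
    ingested timestamptz NOT NULL DEFAULT now()
);

-- A plv8 function (JavaScript body) that extracts the temperature value
-- from a JSON document. Assumes plv8 2.x jsonb-to-object conversion.
CREATE OR REPLACE FUNCTION get_temperature(payload jsonb)
RETURNS numeric AS $$
    return payload.temperature;
$$ LANGUAGE plv8 IMMUTABLE;

-- Example query: average temperature per device.
SELECT payload->>'deviceId' AS device,
       avg(get_temperature(payload)) AS avg_temperature
FROM iot_telemetry
GROUP BY 1;
```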

Step 3: Create an Azure Function with an Event Hub trigger to extract messages and store them in PostgreSQL

Next, you will create a JavaScript Azure Function with Event Hub trigger bindings to the Azure IoT Hub created in Step 1. Use the JavaScript index.js sample to create this function. The function is triggered for each incoming message in the IoT Hub stream; it extracts the JSON message and inserts the data into the PostgreSQL database created in Step 2. A minimal sketch is shown below.
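
The index.js sample in GitHub is the authoritative version; the sketch below assumes a function.json eventHubTrigger binding (cardinality "many") pointed at the IoT Hub's Event Hub-compatible endpoint, the pg npm package, connection settings supplied through app settings, and the illustrative table name from Step 2.

```javascript
// Minimal sketch of index.js for an Event Hub-triggered Azure Function.
const { Pool } = require('pg');

const pool = new Pool({
  host: process.env.PG_HOST,         // e.g. <server>.postgres.database.azure.com
  user: process.env.PG_USER,
  password: process.env.PG_PASSWORD,
  database: process.env.PG_DATABASE, // the database created in Step 2
  ssl: true
});

module.exports = async function (context, eventHubMessages) {
  // Each invocation can deliver a batch of IoT Hub messages.
  for (const telemetry of eventHubMessages) {
    await pool.query(
      'INSERT INTO iot_telemetry (payload) VALUES ($1)', // illustrative table
      [JSON.stringify(telemetry)]
    );
  }
  context.log(`Inserted ${eventHubMessages.length} telemetry message(s).`);
};
```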

Getting started by running the IoT solution end to end

We recommend that you implement this solution using the sample application in our GitHub repository. There you will find steps for running the Node.js application to simulate the generation of event data, creating an IoT Hub with device registration, sending the event data to the IoT Hub, deploying the Azure Function to extract the data from the JSON messages, and inserting it into Azure Database for PostgreSQL.

After implementing all the steps in GitHub, you will be able to query and analyze the data using reporting tools like Power BI, which let you build real-time dashboards.


We hope that you enjoy working with the latest features and functionality available in Azure Database for PostgreSQL. Be sure to share your feedback via UserVoice for PostgreSQL.

If you need any help or have questions, please check out the Azure Database for PostgreSQL documentation.

Acknowledgements

Special thanks to Qingqing Yuan, Bassu Hiremath, Parikshit Savjani, Anitah Cantele, and Rachel Agyemang for their contributions to this post.

New device modeling experience in Azure IoT Central


On the Azure IoT Central team, we are constantly talking with our customers to understand how we can continue to provide more value. One of our top pieces of product feedback has been a request for a clearer device modeling experience that separates the device instance from the device template. Previously, viewing a device and editing its device template took place on the same page through an “Edit Template” button, making it unclear whether a change applied to a single device or to all devices based on that template. Recently we began a flighted rollout of a new device modeling experience that directly addresses this feedback.

Device modeling experience in Azure IoT Central

For app builder roles, we have introduced a new “Device Templates” navigation tab that replaces the existing “Application Builder” tab and updates how you view and edit your device templates. To edit your device templates, visit the “Device Templates” tab. To view or interact with a device instance, go to the “Explorer” tab. We’re excited to get this first set of changes into your hands so that device templates and the device explorer can evolve independently, in order to best support how our users interact with their devices. These changes optimize the operator experience of viewing and interacting with devices and streamline the builder workflow of creating or modifying a template.

These changes are an important first step toward optimizing your device workflow for easier management and greater clarity. Please leave us feedback at Azure IoT Central UserVoice as we continue to invest in understanding and solving our customers’ needs.

To learn more, please visit our documentation, “Set up a device template.”
