Anomaly detection using built-in machine learning models in Azure Stream Analytics


Built-in machine learning (ML) models for anomaly detection in Azure Stream Analytics significantly reduce the complexity and costs associated with building and training machine learning models. This feature is now available for public preview worldwide.

What is Azure Stream Analytics?

Azure Stream Analytics is a fully managed, serverless PaaS offering on Azure that enables customers to analyze and process fast-moving streams of data and deliver real-time insights for mission-critical scenarios. Developers can use a simple SQL language (extensible to include custom code) to author and deploy powerful analytics processing logic that can scale up and scale out to deliver insights with millisecond latencies.

Traditional way to incorporate anomaly detection capabilities in stream processing

Many customers use Azure Stream Analytics to continuously monitor massive amounts of fast-moving streams of data in order to detect issues that do not conform to expected patterns and prevent catastrophic losses. This in essence is anomaly detection.

For anomaly detection, customers traditionally relied on either sub-optimal methods of hard coding control limits in their queries, or used custom machine learning models. Development of custom learning models not only requires time, but also high levels of data science expertise along with nuanced data pipeline engineering skills. Such high barriers to entry precluded adoption of anomaly detection in streaming pipelines despite the associated value for many Industrial IoT sites.

Built-in machine learning functions for anomaly detection in Stream Analytics

With built-in machine learning based anomaly detection capabilities, Azure Stream Analytics reduces the complexity of building and training custom machine learning models to simple function calls. Two new unsupervised machine learning functions are being introduced to detect the two most commonly occurring classes of anomalies, namely temporary and persistent.

  • AnomalyDetection_SpikeAndDip function to detect temporary or short-lasting anomalies such as spikes or dips. This is based on the well-documented kernel density estimation algorithm.
  • AnomalyDetection_ChangePoint function to detect persistent or long-lasting anomalies such as bi-level changes and slowly increasing or decreasing trends. This is based on another well-known algorithm called exchangeability martingales.

Example

SELECT
    sensorid,
    System.Timestamp AS time,
    temperature AS temp,
    AnomalyDetection_SpikeAndDip(temperature, 95, 120, 'spikesanddips')
        OVER(PARTITION BY sensorid LIMIT DURATION(second, 120)) AS SpikeAndDipScores
FROM input

In the example above, the AnomalyDetection_SpikeAndDip function helps monitor a set of sensors for spikes or dips in the temperature readings. The underlying ML model uses a user-supplied confidence level of 95 percent to set the model sensitivity, and a training event count of 120, which corresponds to a 120-second sliding window, is supplied as a function parameter. Note that the job is partitioned by sensorid, which results in multiple ML models being trained under the hood, one for each sensor, all within the same single query.
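The function call returns a record value. A common next step, which the example above leaves implicit, is to pull the anomaly score and the 0/1 anomaly flag out of that record, for example with GetRecordPropertyValue. The following sketch assumes the same hypothetical input stream and column names as above, and that the returned record exposes Score and IsAnomaly properties:

WITH AnomalyDetectionStep AS (
    SELECT
        sensorid,
        System.Timestamp AS time,
        temperature AS temp,
        AnomalyDetection_SpikeAndDip(temperature, 95, 120, 'spikesanddips')
            OVER(PARTITION BY sensorid LIMIT DURATION(second, 120)) AS SpikeAndDipScores
    FROM input
)
SELECT
    sensorid,
    time,
    temp,
    -- extract the raw anomaly score and the 0/1 anomaly flag from the returned record
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'Score') AS float) AS SpikeAndDipScore,
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'IsAnomaly') AS bigint) AS IsSpikeAndDipAnomaly
FROM AnomalyDetectionStep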
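A persistent-anomaly query follows the same shape with AnomalyDetection_ChangePoint, which takes a confidence level and a history size but no mode argument. The sketch below is illustrative only; the 80 percent confidence level and the 1200-event, 20-minute history window are assumed values you would tune for your own stream:

SELECT
    sensorid,
    System.Timestamp AS time,
    temperature AS temp,
    AnomalyDetection_ChangePoint(temperature, 80, 1200)
        OVER(PARTITION BY sensorid LIMIT DURATION(minute, 20)) AS ChangePointScores
FROM input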

Get started today

We’re excited for you to try out anomaly detection functions in Azure Stream Analytics. To try this new feature, please refer to the feature documentation, "Anomaly Detection in Azure Stream Analytics."


.NET Framework February 2019 Security and Quality Rollup


Yesterday, we released the February 2019 Security and Quality Rollup.

Security

CVE-2019-0613 – Remote Code Execution Vulnerability

This security update resolves a vulnerability in .NET Framework software if the software does not check the source markup of a file. An attacker who successfully exploits the vulnerability could run arbitrary code in the context of the current user. If the current user is logged on by using administrative user rights, an attacker could take control of the affected system. An attacker could then install programs; view, change, or delete data; or create new accounts that have full user rights. Users whose accounts are configured to have fewer user rights on the system could be less affected than users who have administrative user rights.

CVE-2019-0657 – Domain Spoofing Vulnerability

This security update resolves a vulnerability in certain .NET Framework APIs that parse URLs. An attacker who successfully exploits this vulnerability could use it to bypass security logic that’s intended to make sure that a user-provided URL belongs to a specific host name or a subdomain of that host name. This could be used to cause privileged communication to be made to an untrusted service as if it were a trusted service.


Getting the Update

The Security and Quality Rollup is available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework updates are part of the Windows 10 Monthly Rollup.

The following table is for Windows 10 and Windows Server 2016+ versions.

Product Version | Security and Quality Rollup KB
Windows 10 1809 (October 2018 Update) / Windows Server 2019 | Catalog 4483452
.NET Framework 3.5, 4.7.2 | 4483452
Windows 10 1803 (April 2018 Update) | Catalog 4487017
.NET Framework 3.5, 4.7.2 | 4487017
Windows 10 1709 (Fall Creators Update) | Catalog 4486996
.NET Framework 3.5, 4.7.1, 4.7.2 | 4486996
Windows 10 1703 (Creators Update) | Catalog 4487020
.NET Framework 3.5, 4.7, 4.7.1, 4.7.2 | 4487020
Windows 10 1607 (Anniversary Update) / Windows Server 2016 | Catalog 4487026
.NET Framework 3.5, 4.6.2, 4.7, 4.7.1, 4.7.2 | 4487026
Windows 10 1507 | Catalog 4487018
.NET Framework 3.5, 4.6, 4.6.1, 4.6.2 | 4487018

The following table is for earlier Windows and Windows Server versions.

Product Version | Security and Quality Rollup KB | Security Only Update KB
Windows 8.1 / Windows RT 8.1 / Windows Server 2012 R2 | Catalog 4487080 | Catalog 4487124
.NET Framework 3.5 | Catalog 4483459 | Catalog 4483484
.NET Framework 4.5.2 | Catalog 4483453 | Catalog 4483472
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 | Catalog 4483450 | Catalog 4483469
Windows Server 2012 | Catalog 4487079 | Catalog 4487122
.NET Framework 3.5 | Catalog 4483456 | Catalog 4483481
.NET Framework 4.5.2 | Catalog 4483454 | Catalog 4483473
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 | Catalog 4483449 | Catalog 4483468
Windows 7 SP1 / Windows Server 2008 R2 SP1 | Catalog 4487078 | Catalog 4487121
.NET Framework 3.5.1 | Catalog 4483458 | Catalog 4483483
.NET Framework 4.5.2 | Catalog 4483455 | Catalog 4483474
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 | Catalog 4483451 | Catalog 4483470
Windows Server 2008 | Catalog 4487081 | Catalog 4487124
.NET Framework 2.0, 3.0 | Catalog 4483457 | Catalog 4483482
.NET Framework 4.5.2 | Catalog 4483455 | Catalog 4483474
.NET Framework 4.6 | Catalog 4483451 | Catalog 4483470

Docker Images

We are updating the following .NET Framework Docker images for today’s release:

Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

Note: Significant changes have been made with Docker images recently. Please look at .NET Docker Announcements for more information.

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

Help us make the .NET Architecture guides better for you!


Over the last couple of years, we worked with experts to create some incredible architecture guides & reference samples for .NET developers. We focused on Microservices Architecture, Modernizing existing .NET apps, DevOps best practices, ASP.NET web apps, Azure cloud apps, Mobile apps with Xamarin and more.

We’re running a survey to find out how you’re using them and what we could improve. It should take 10 mins or less, and your answers will help us make the guides and reference samples better.


Take the survey now! 

Thank You!
.NET Team

Learn, Explore, Connect with the Bing Maps team at Microsoft Ignite the Tour

Microsoft’s Developer Blogs are Getting an Update


In the coming days, we’ll be moving our developer blogs to a new platform with a modern, clean design and powerful features that will make it easy for you to discover and share great content. This week, you’ll see the Visual Studio, IoTDev, and Premier Developer blogs move to a new URL, while additional developer blogs will transition over the coming weeks. You’ll continue to receive all the information and news from our teams, including access to all past posts. Your RSS feeds will continue to seamlessly deliver posts, and any of your bookmarked links will redirect to the new URL location.

We’d love your feedback

Our most inspirational ideas have come from this community and we want to make sure our developer blogs are exactly what you need. Share your thoughts about the current blog design by taking our Microsoft Developer Blogs survey and tell us what you think! We would also love your suggestions for future topics that our authors could write about. Please use the survey to share what kinds of information, tutorials, or features you would like to see covered in future posts.

Frequently Asked Questions

For the most-asked questions, see the FAQ below. If you have any additional questions, ideas, or concerns that we have not addressed, please let us know by submitting feedback through the Developer Blogs FAQ page.

Will Saved/Bookmarked URLs Redirect?

Yes. All existing URLs will auto-redirect to the new site. If you discover any broken links, please report them through the FAQ feedback page.

Will My Feed Reader Still Send New Posts?

Yes, your RSS feed will continue to work through your current set-up. If you encounter any issues with your RSS feed, please report it with details about your feed reader.

Which Blogs Are Moving?

This migration involves the majority of our Microsoft developer blogs so expect to see changes to our Visual Studio, IoTDev, and Premier Developer blogs this week. The rest will be migrated in waves over the next few weeks.

We appreciate all of the feedback so far and look forward to showing you what we’ve been working on! For additional information about this blog migration, please refer to our Developer Blogs FAQ.


Announcing Windows Community Toolkit v5.1


It’s with great pleasure today that we announce the next update to the Windows Community Toolkit, version 5.1, made possible with help and contributions from our developer community. This update brings high-quality animation support with the inclusion of Lottie-Windows in the toolkit.  In addition, it provides a control for choosing Remote Devices, a new Image Cropping control, and many control accessibility fixes.

See more details on these features below.

Animations with Lottie-Windows

This new library provides high quality animation support on Windows 10 (1809) by utilizing the Windows.UI.Composition APIs and allows for the consumption of Bodymovin JSON files or optimized code-generated classes for playback in your Windows apps.  It is the successor to the LottieUWP community project.

Lottie Viewer app

Try installing the new Lottie Viewer app from the Microsoft Store to test out animations and generate optimized code for your Windows apps.

Documentation for Lottie-Windows.

Remote Device Picker

The Remote Device Picker allows a user to select a device, proximally or cloud accessible, from a dialog.  The app can then use this information to perform various scenarios like opening and communicating with an app on the chosen device.  Read more about what’s possible after selecting a device like opening an app or creating an app service.

This is possible with Project Rome, which allows building people-centric experiences across devices and platforms (Android, Windows, iOS). For example, having your app or website (using MS Graph APIs) continue a task on another device, or create companion experiences that can communicate with each other across devices, like a remote control.

Documentation for RemoteDevicePicker.  Read more about Project Rome capabilities here.

Image Cropper

The new Image Cropper control allows you to provide effortless image cropping functionality to your app.  Perfect for user selected profile pictures and photo editing tools.

Documentation for ImageCropper.

Accessibility Improvements

We’ve been striving to increase the accessibility of our controls in the last release and this one.  This update provides more fixes to keyboard navigation, high contrast, and narrator readability to the toolkit controls.

.NET Core 3.0 Preview for WPF and WinForms Packages

With the recent release of .NET Core 3.0 preview 2, we’ve shipped a toolkit 6.0 preview package for our WPF/WinForms library Microsoft.Toolkit.Win32 to support it.  Please download it here for WPF or WinForms to give it a try in your .NET Core applications.  Report any issues here.

Get started today

There are a lot more updates than we can cover here, be sure to read the release notes for more details on all the fixes provided in this update.

As a reminder, you can get started by following our docs.microsoft.com tutorial, or preview the latest features by installing the Windows Community Toolkit Sample App from the Microsoft Store. If you would like to contribute, please join us on GitHub! To join the conversation on Twitter, use the #WindowsToolkit hashtag.

Happy coding!

The post Announcing Windows Community Toolkit v5.1 appeared first on Windows Developer Blog.

Protect Azure Virtual Machines using storage spaces direct with Azure Site Recovery


Storage spaces direct (S2D) lets you host a guest cluster on Microsoft Azure, which is especially useful in scenarios where virtual machines (VMs) are hosting a critical application like SQL, Scale-Out File Server, or SAP ASCS. You can learn more about clustering by reading the article, “Deploying IaaS VM Guest Clusters in Microsoft Azure.” I am also happy to share that with the latest Azure Site Recovery (ASR) update, you can now protect these business-critical applications. The ASR support of storage spaces direct allows you to take your highly available application and make it more resilient by providing protection against region-level failure.

We continue to deliver on our promise of simplicity, and you can protect your storage spaces direct cluster in three simple steps:

1. Inside the recovery services vault, select +replicate.

2. Select a replication policy with application consistency off. Please note that only crash consistency support is available.

3. Select all the nodes in the cluster and make them part of a Multi-VM consistency group. To learn more about Multi-VM consistency, please visit our documentation, “Common questions: Azure-to-Azure replication.”

Configure replication settings screenshot

Lastly, select OK to enable the replication.

Next steps

Begin protecting virtual machines using storage spaces direct. To get started visit our documentation, “Replicate Azure Virtual Machines using storage spaces direct to another Azure region.”

Disaster recovery between Azure regions is available in all Azure regions where ASR is available. Please note, this feature is only available for Azure Virtual Machines’ disaster recovery.

Related links and additional content


Monitor at scale in Azure Monitor with multi-resource metric alerts


Our customers rely on Azure to run large-scale applications and services critical to their business. To run services at scale, you need to set up alerts to proactively detect, notify, and remediate issues before they affect your customers. However, configuring alerts can be hard when you have a complex, dynamic environment with lots of moving parts.

Today, we are excited to release multi-resource support for metric alerts in Azure Monitor to help you set up critical alerts at scale. Metric alerts in Azure Monitor work on a host of multi-dimensional platform and custom metrics, and notify you when the metric breaches a threshold that was either defined by you or detected automatically.

With this new feature, you will be able to set up a single metric alert rule that monitors:

  • A list of virtual machines in one Azure region
  • All virtual machines in one or more resource groups in one Azure region
  • All virtual machines in a subscription in one Azure region

Benefits of using multi-resource metric alerts

  • Get alerting coverage faster: With a small number of rules, you can monitor all the virtual machines in your subscription. Multi-resource rules set at subscription or resource group level can automatically monitor new virtual machines deployed to the same resource group/subscription (in the same Azure region). Once you have such a rule created, you can deploy hundreds of virtual machines all monitored from day one without any additional effort.
  • Much smaller number of rules to manage: You no longer need to have a metric alert for every resource that you want to monitor.
  • You still get resource level notifications: You still get granular notifications per impacted resource, so you always have the information you need to diagnose issues.
  • Even simpler at scale experience: Using Dynamic Thresholds along with multi-resource metric alerts, you can monitor each virtual machine without the need to manually identify and set thresholds that fit all the selected resources. The Dynamic condition type applies tailored thresholds based on advanced machine learning (ML) capabilities that learn metrics' historical behavior and identify patterns and anomalies.

Setting up a multi-resource metric alert rule

When you set up a new metric alert rule in the alert rule creation experience, use the checkboxes to select all the virtual machines you want the rule to be applied to. Please note that all the resources must be in the same Azure region.

Setting up a multi-resource metric alert rule

You can select one or more resource groups, or select a whole subscription to apply the rule to all virtual machines in the subscription.

Selecting subscription to create rule in Azure Monitor

If you select all virtual machines in your subscription, or one or more resource groups, you get the option to auto-grow your selection. Selecting this option means the alert rule will automatically monitor any new virtual machines that are deployed to this subscription or resource group. With this option selected, you don’t need to create a new rule or edit an existing rule whenever a new virtual machine is deployed.

Example of how to auto-grow a selection

You can also use Azure Resource Manager templates to deploy multi-resource metric alerts. Learn more in our documentation, “Understand how metric alerts work in Azure Monitor.”

Pricing

The pricing for metric alert rules is based on the number of metric time series monitored by an alert rule. The same pricing applies to multi-resource metric alert rules.

Wrapping up

We are excited about this new capability that makes configuring and managing metric alert rules at scale easier. This functionality is currently only supported for virtual machines, with support for other resource types coming soon. We would love to hear what you think about it and what improvements we should make. Contact us at azurealertsfeedback@microsoft.com.

Under the hood: Performance, scale, security for cloud analytics with ADLS Gen2


On February 7, 2019 we announced the general availability of Azure Data Lake Storage (ADLS) Gen2. Azure is now the only cloud provider to offer a no-compromise cloud storage solution that is fast, secure, massively scalable, cost-effective, and fully capable of running the most demanding production workloads. In this blog post we’ll take a closer look at the technical foundation of ADLS that will power the end-to-end analytics scenarios our customers demand.

ADLS is the only cloud storage service that is purpose-built for big data analytics. It is designed to integrate with a broad range of analytics frameworks to enable a true enterprise data lake, maximizes performance via true filesystem semantics, scales to meet the needs of the most demanding analytics workloads, is priced at cloud object storage rates, and is flexible enough to support a broad range of workloads so that you are not required to create silos for your data.

A foundational part of the platform

The Azure Analytics Platform not only features a great data lake for storing your data with ADLS, but is rich with additional services and a vibrant ecosystem that allows you to succeed with your end to end analytics pipelines.

Azure features services such as HDInsight and Azure Databricks for processing data, Azure Data Factory to ingress and orchestrate, Azure SQL Data Warehouse, Azure Analysis Services, and Power BI to consume your data in a pattern known as the Modern Data Warehouse, allowing you to maximize the benefit of your enterprise data lake.

End to end analytics graph

Additionally, an ecosystem of popular analytics tools and frameworks integrate with ADLS so that you can build the solution that meets your needs.

“Data management and data governance is top of mind for customers implementing cloud analytics solutions. The Azure Data Lake Storage Gen2 team have been fantastic partners ensuring tight integration to provide a best-in-class customer experience as our customers adopt ADLS Gen2.”

– Ronen Schwartz, Senior Vice president & General Manager of Data Integration and Cloud Integration, Informatica

"WANDisco’s Fusion data replication technology combined with Azure Data Lake Storage Gen2 provides our customers a compelling LiveData solution for hybrid analytics by enabling easy access to Azure Data Services without imposing any downtime or disruption to on premise operations.”

– David Richards, Co-Founder and CEO, WANdisco

“Microsoft continues to innovate in providing scalable, secure infrastructure which go hand in hand with Cloudera’s mission of delivering on the Enterprise Data Cloud. We are very pleased to see Azure Data Lake Storage Gen2 roll out globally. Our mutual customers can take advantage of the simplicity of administration this storage option provides when combined with our analytics platform.”

– Vikram Makhija, General Manager for Cloud, Cloudera

Performance

Performance is the number one driver of value for big data analytics workloads. The reason for this is simple: the more performant the storage layer, the less compute (the expensive part!) is required to extract the value from your data. Therefore, not only do you gain a competitive advantage by achieving insights sooner, you do so at a significantly reduced cost.

“We saw a 40 percent performance improvement and a significant reduction of our storage footprint after testing one of our market risk analytics workflows at Zurich’s Investment Management on Azure Data Lake Storage Gen2.”

– Valerio Bürker, Program Manager Investment Information Solutions, Zurich Insurance

Let’s look at how ADLS achieves overwhelming performance. The most notable feature is the Hierarchical Namespace (HNS), which allows this massively scalable storage service to arrange your data like a filesystem with a hierarchy of directories. All analytics frameworks (e.g., Spark, Hive) are built with an implicit assumption that the underlying storage service is a hierarchical filesystem. This is most obvious when data is written to temporary directories which are renamed at the completion of the job. For traditional cloud-based object stores, this is an O(n) operation (n copies and deletes) that dramatically impacts performance. In ADLS this rename is a single atomic metadata operation.

Azure Data Lake Storage diagram

The other contributor to performance is the Azure Blob Filesystem (ABFS) driver. This driver takes advantage of the fact that the ADLS endpoint is optimized for big data analytics workloads. These workloads are most sensitive to maximizing throughput via large IO operations, as distinct from other general purpose cloud stores that must optimize for a much larger range of IO operations. This level of optimization leads to significant IO performance improvements that directly benefits the performance and cost aspects of running big data analytics workloads on Azure. The ABFS driver is contributed as part of Apache Hadoop® and is available in HDInsight and Azure Databricks, as well as other commercial Hadoop distributions.

Scalable

Scalability for big data analytics is also critically important. There’s no point having a solution that works great for a few TBs of data but collapses as the data size inevitably grows. The rate of growth of big data analytics projects tends to be non-linear as a consequence of more diverse and accessible sources of data. Most projects benefit from the principle that the more data you have, the better the insights. However, this leads to design challenges such that the system must scale at the same rate as the growth of the data. One of the great design pivots of big data analytics frameworks, such as Hadoop and Spark, is that they scale horizontally. What this means is that as the data and/or processing grows, you can just add more nodes to your cluster and the processing continues unabated. This, however, relies on the storage layer scaling linearly as well.

This is where the value of building ADLS on top of the existing Azure Blob service shines. The EB scale of this service now applies to ADLS ensuring that no limits exist on the amount of data to be stored or accessed. In practical terms, customers can store 100s of PB of data which can be accessed with throughput to satisfy the most demanding workloads.

ADLS Gen2 Architecture diagram

Secure

For customers wanting to build a data lake to serve the entire enterprise, security is no lightweight consideration, and there are multiple aspects to providing end-to-end security for your data lake.

Tight integration with analytics frameworks results in an end-to-end secure pipeline. The HDInsight Enterprise Security Package makes end-user authentication flow through the cluster and to the data in the data lake.

Get started today!

We’re excited for you to try Azure Data Lake Storage! Get started today and let us know your feedback.

Learn how to build with Azure IoT: Upcoming IoT Deep Dive events


Microsoft IoT Show, the place to go to hear about the latest announcements, tech talks, and technical demos, is starting a new interactive, live-streaming event and technical video series called IoT Deep Dive!

Each IoT Deep Dive will bring in a set of IoT experts, like Joseph Biron, PTC CTO of IoT, and Chafia Aouissi, Azure IoT Senior Program Manager, who will join the first IoT Deep Dive, "Building End to End industrial Solutions with PTC ThingWorx and Azure IoT.” Join us on February 20, 2019 from 9:00 AM - 9:45 AM Pacific Standard Time to walk through end-to-end IoT solutions, technical demos, and best practices.

IoT Deep Dive event information

Come learn and ask questions about how to build IoT solutions and deep dive into intelligent edge, tooling, DevOps, security, asset tracking, and other top requested technical deep dives. Perfect for developers, architects, or anyone who is ready to accelerate going from proof of concept to production, or needs best practices tips while building their solutions.

Upcoming events

IoT Deep Dive Live: Building End to End industrial Solutions with PTC ThingWorx and Azure

PTC ThingWorx and Microsoft Azure IoT are proven industrial innovation solutions with a market-leading IoT cloud infrastructure. Sitting on top of Azure IoT, ThingWorx delivers robust and rapid creation of IoT applications and solutions that make the most of Azure services such as IoT Hub. Join the event to learn how to build an E2E industrial solution. You can set up a reminder to join the live event.

  • When: February 20, 2019 at 9:00 AM - 9:45 AM Pacific Standard Time | Level 300
  • Learn about: ThingWorx, Vuforia Studio, Azure IoT, and Dynamics 365
  • Special guests:
    • Joseph Biron, Chief Technology Officer of IoT, PTC
    • Neal Hagermoser, Global ThingWorx COE Lead, PTC
    • Chafia Aouissi, Senior Program Manager, Azure IoT
    • Host: Pamela Cortez, Program Manager, Azure IoT
  • Industries and use cases: Smart connected product manufacturers in verticals including automotive, industrial equipment, aerospace, electronics, and high tech.

Location Intelligence for Transportation with Azure Maps 

Come learn how to use Azure Maps to provide location intelligence in different areas of transportation such as fleet management, asset tracking, and logistics.

  • When: March 6, 2019 9:00 AM - 9:45 AM Pacific Standard Time | Level 300
  • Learn about: Azure Maps, Azure IoT Hub, Azure IoT Central, and Azure Event Grid
  • Guest speakers:
    • Ricky Brundritt, Senior Program Manager, Azure IoT
    • Pamela Cortez, Program Manager, Azure IoT
  • Industries and use cases: Fleet management, logistics, asset management, and IoT

Submit questions before the events on the Microsoft IoT tech community or during the IoT Deep Dive live event itself! All videos will be hosted on Microsoft IoT Show after the live event.

AI, Machine Learning and Data Science Roundup: February 2019


A monthly roundup of news about Artificial Intelligence, Machine Learning and Data Science. This is an eclectic collection of interesting blog posts, software announcements and data applications from Microsoft and elsewhere that I've noted over the past month or so.

Open Source AI, ML & Data Science News

ONNX, the open interchange format for AI models, updates to version 1.4.0 with support for models larger than 2 GB.

PT-BERT, a PyTorch implementation of Google's BERT language representation model, adds new pre-trained language models: GPT and Transformer-XL.

Stanford University has released StanfordNLP, a natural language analysis package for Python with pre-trained models for 53 languages.

The Python Foundation releases Python 3.7 on the Windows 10 App Store.

Industry News

Amazon offers recommendations to policymakers on the use of facial recognition technology and calls for regulation of its use.

AWS open-sources the Neo-AI project, a machine learning compiler and runtime that tunes Tensorflow, PyTorch, ONNX, MXNet and XGBoost models for performance on edge devices. 

Google introduces FEAST, an open-source "feature store" for managing and discovering features in machine learning models.

Google releases Natural Questions, a corpus of 300,000 questions and human-annotated answers, for question-answering research.

Uber open-sources Ludwig, a platform for training and testing deep learning models without programming, simply by specifying input and output variables in data.

Microsoft News

Microsoft has joined the SciKit-learn Consortium as a Platinum member.

Azure Data Explorer, a high-performance service for real-time analysis of streaming data, is now generally available.

HDInsight Tools for VSCode, providing a lightweight code editor for HDInsight PySpark and Hive batch jobs, is now available.

Azure Cognitive Services can now be used with a unified API key, and in more regions and with new certifications including HIPAA BAA.

An in-depth InfoWorld review of Azure ML services.

Learning resources

Papers with Code, a repository of machine learning papers with associated implementation code and evaluation metrics.

Practical Deep Learning for Coders, the popular free course from fast.ai, has been updated with all-new content. Microsoft Azure and GCP provide pre-configured environments with the required tools, data and materials.

Recommenders Hub: an open source GitHub repository containing Microsoft’s best practices and state-of-the-art algorithms for building recommendation systems.

Presentation videos and workshop materials from the recent RStudio conference are available.

Tutorial: running trained deep learning models on Apple devices, by converting CoreML exported from PyTorch into ONNX.

Tutorials from the MIT Deep Learning courses: Deep learning basics, Driving scene segmentation, DeepTraffic deep reinforcement learning.

Reference Architecture: Batch scoring of Spark models on Azure Databricks.

Applications

Facebook develops a predictive model for global electricity availability, publishes the output and some code.

The British Broadcasting Corporation releases an R package and a guide for data journalists on creating data visualizations in the BBC News style.

TWIMLAI's podcast series on Microsoft's AI for Earth, AI for Accessibility and AI for Humanitarian Action programs.

A profile of Heather Dowdy, the head of Microsoft's AI for Accessibility program.

Find previous editions of the monthly AI roundup here.

Update 19.02 for Azure Sphere public preview now available


The Azure Sphere 19.02 release is available today. In our second quarterly release after public preview, our focus is on broader enablement of device capabilities, reducing your time to market with new reference solutions, and continuing to prioritize features based on feedback from organizations building with Azure Sphere.

Today Azure Sphere’s hardware offerings are centered around our first Azure Sphere certified MCU, the MediaTek MT3620. Expect to see additional silicon announcements in the near future, as we work to expand our silicon and hardware ecosystems to enable additional technical scenarios and ultimately deliver more choice to manufacturers.

Our 19.02 release focuses on broadening what you can accomplish with MT3620 solutions. With this release, organizations will be able to use new peripheral classes (I2C, SPI) from the A7 core. We continue to build on the private Ethernet functionality by adding new platform support for critical networking services (DHCP and SNTP) that enable a set of brownfield deployment scenarios. Additionally, by leveraging our new reference solutions and hardware modules, device builders can now bring the security of Azure Sphere to products even faster than before.

To build applications that leverage this new functionality, you will need to ensure that you have installed the latest Azure Sphere SDK Preview for Visual Studio. All Wi-Fi connected devices will automatically receive an updated Azure Sphere OS.

  • New connectivity options – This release supports DHCP and SNTP servers in private LAN configurations. You can optionally enable these services when connecting a MT3620 to a private Ethernet connection.
  • Broader device enablement – Beta APIs now enable hardware support for both I2C and SPI peripherals. Additionally, we have enabled broader configurability options for UART.
  • More space for applications – The MT3620 now supports 1 MB of space dedicated for your production application binaries.
  • Reducing time to market of MT3620-enabled products – To reduce complexity in getting started with the many aspects of Azure Sphere we have added several samples and reference solutions to our GitHub samples repo:
    • Private Ethernet – Demonstrates how to wire the supported microchip part and provides the software to begin developing a private Ethernet-based solution.
    • Real-time clock – Demonstrates how to set, manage, and integrate the MT3620 real time clock with your applications.
    • Bluetooth command and control – Demonstrates how to enable command and control scenarios by extending the Bluetooth Wi-Fi pairing solution released in 18.11.
    • Better security options for BLE – Extends the Bluetooth reference solution to support a PIN between the paired device and Azure Sphere.
    • Azure IoT – Demonstrates how to use Azure Sphere with either Azure IoT Central or an Azure IoT Hub.
    • CMake preview – Provides an early preview of CMake as an alternative for building Azure Sphere applications both inside and outside Visual Studio. This limited preview lets customers begin testing the use of existing assets in Azure Sphere development.
  • OS update protection – The Azure Sphere OS now protects against a set of update scenarios that would cause the device to fail to boot. The OS detects and recovers from these scenarios by automatically and atomically rolling back the device OS to its last known good configuration.
  • Latest Azure IoT SDK – The Azure Sphere OS has updated its Azure IoT SDK to the LTS Oct 2018 version.

All Wi-Fi connected devices that were previously updated to the 18.11 release will automatically receive the 19.02 Azure Sphere OS release. As a reminder, if your device is still running a release older than 18.11, it will be unable to authenticate to an Azure IoT Hub via DPS or receive OTA updates. See the Release Notes for how to proceed in that case.

As always, continued thanks to our preview customers for your comments and suggestions. Microsoft engineers and Azure Sphere community experts will respond to product-related questions on our MSDN forum and development questions on Stack Overflow. We also welcome product feedback and new feature requests.

Visit the Azure Sphere website for documentation and more information on how to get started with your Azure Sphere development kit. You can also email us at nextinfo@microsoft.com to kick off an Azure Sphere engagement with your Microsoft representative.

How Azure Security Center helps you protect your environment from new vulnerabilities


Recently the disclosure of a vulnerability (CVE-2019-5736) was announced in the open-source software (OSS) container runtime, runc. This vulnerability can allow an attacker to gain root-level code execution on the host. runc is the underlying container runtime beneath many popular containers.

Azure Security Center can help you detect vulnerable resources in your environment within Microsoft Azure, on-premises, or other clouds. Azure Security Center can also detect that an exploitation has occurred and alert you.

Azure Security Center offers several methods that can be applied to mitigate or detect malicious behavior:

  • Strengthen security posture – Azure Security Center periodically analyzes the security state of your resources. When it identifies potential security vulnerabilities it creates recommendations. The recommendations guide you through the process of configuring the necessary controls. We have plans to add recommendations when unpatched resources are detected. You can find more information about strengthening security posture by visiting our documentation, “Managing security recommendations in Azure Security Center.”
  • File Integrity Monitoring (FIM) – This method examines files and registry keys of operating systems, application software, and more, for changes that might indicate an attack. By enabling FIM, Azure Security Center will be able to detect changes in the directory which can indicate malicious activity. Guidance on how to enable FIM and add file tracking on Linux machines can be found in our documentation, “File Integrity Monitoring in Azure Security Center.”
  • Security alerts – Azure Security Center detects suspicious activities on Linux machines using auditd framework. Collected records flow into a threat detection pipeline and surface as an alert when malicious activity is detected. Security alerts coverage will soon include new analytics to identify compromised machines by runc vulnerability. You can find more information about security alerts by visiting our documentation, “Azure Security Center detection capabilities.”

To apply the best security hygiene practices, it is recommended to configure your environment so that it has the latest updates from your distribution provider. System updates can be performed through Azure Security Center; for more guidance, visit our documentation, “Apply system updates in Azure Security Center.”

Visual Studio 2019 Preview 2 Blog Rollup


Visual Studio 2019 Preview 2 was a huge release for us, so we’ve written a host of articles to explore the changes in more detail. For the short version, see the Visual Studio 2019 Preview 2 Release Notes.

We’d love for you to download Visual Studio 2019 Preview, give it a try, and let us know how it’s working for you in the comments below or via email (visualcpp@microsoft.com). If you encounter problems or have a suggestion, please let us know through Help > Send Feedback > Report A Problem / Provide a Suggestion or via Visual Studio Developer Community. You can also find us on Twitter @VisualC.


Advisory on February 2019 Security update for Windows 10 update 1809


The February 2019 Security and Quality Rollup for Windows 10 update 1809 was released earlier this week. We have received multiple customer reports of issues when a customer has installed .NET Framework 4.8 Preview and then installed the February security update, 4483452. These reports are specific only to the scenario in which a customer has installed both .NET Framework 4.8 Preview and the February security update.

We are actively working on fixing this issue and re-shipping a corrected update. If you installed the February 2019 security update and have not yet seen any negative behavior, we recommend that you leave your system as-is but closely monitor it and ensure that you apply upcoming .NET Framework updates.

As a team, we regret that this release was shipped with this flaw. This release was tested using our regular and extensive testing process. We are working to improve our testing infrastructure to prevent these types of issues in the future. Again, we are sorry for any inconvenience that this product flaw has caused.

We will continue to update this post as we have new information.

Guidance

We are working on guidance and will update this post as we have new information.

Workaround

There are no known workarounds at this time.

Symptoms

After installing .NET Framework 4.8 Preview and then the February 2019 security update, 4483452, Visual Studio will crash with this error message:

“Exception has been thrown by the target of an invocation.”

Join us April 2nd for the Launch of Visual Studio 2019!


At Connect(); a little over two months ago, we released the first Visual Studio 2019 Preview. Based on your inputs, we’ve made several improvements to Visual Studio 2019 in our endeavor to make this the best Visual Studio yet. On behalf of our entire team, I’m excited to announce the upcoming release of Visual Studio 2019 on April 2, 2019 at the Visual Studio 2019 Launch Event.

Visual Studio 2019 Launch Event banner

Join us at our virtual event

Join us online on April 2 starting at 9 AM Pacific Time for demos and conversations centered around development with Visual Studio 2019, Azure DevOps, and GitHub. We’ll have something for everyone, whether you’re a developer who uses C#, C++, or Python, or targets the web, desktop, or cloud. And, of course, you’ll be one of the first to try out the Visual Studio 2019 release.

Scott Hanselman will kick off the day with an online keynote highlighting the newest innovations in Visual Studio. Be prepared to be delighted and ready to return to your coding even more productively than before! The keynote will be followed by several live-streamed sessions, hosted by experts with online Q&A, that dive deeper into various features and programming languages. We’ll close out the day with a virtual attendee party sponsored by our Visual Studio ecosystem partners, so make sure you stick around!

We’re also looking to team up with community organizers around the globe to organize local, in-person launch events between April 2 and the end of June. Check out the local events page for more details on how to host one of these events or to join local events around the globe. We’ll have supporting content available on GitHub and we’ll support your local event with swag if you let us know about it by March 1.

Visual Studio 2019 Preview 3

As we keep marching towards our launch, we are continuing our Visual Studio 2019 Preview releases. Yesterday we released Visual Studio 2019 Preview 3, which includes several improvements and fixes based on the feedback you provide through the Developer Community. Be sure to check out the release notes for all the details on this release.

Save the date

You can head over to https://launch.visualstudio.com/ today to get a glimpse of what we’ll be sharing and save the date using the calendar invite or sign up for updates via e-mail. We’ll update the site regularly with more information on the full agenda as well as additional speakers. Tell us about your Visual Studio journey so far and let us know what you’re excited to see using the #VS2019 on Facebook and Twitter. We look forward to seeing you online!

The post Join us April 2nd for the Launch of Visual Studio 2019! appeared first on The Visual Studio Blog.

New Basic Process Available in Azure DevOps


Azure DevOps has three processes out of the box – Agile, Scrum, and CMMI. Each of these processes adheres to a particular methodology, with Agile being the most universal and widely used. However, the Agile process still brings a set of concepts and behaviors that are not obvious to our new users, and therefore some of those users have a hard time understanding Azure Boards.

For example, the 4-level backlog hierarchy or the many state transition rules. These add complexities that new users don’t care about. New users come from tools such as GitHub with very simple work tracking, and they want Azure Boards to be just as easy.

To meet these needs, we have introduced the new “Basic” process.

Keeping it Simple

In order for the “Basic” process to work for new users, we needed to reduce the scope of the process and get down to the basics. To start, we reduced the number of work item types down to three: Epic, Issue, and Task.

By default, users can start right away by adding Issues to their board. You will notice that the board contains three columns: To Do, Doing, and Done. This simplified state model is used for all three work item types.

Each of the work item types in Agile, Scrum, and CMMI contains many extra fields that are not needed for someone starting out simple. Our research shows that many users get confused by all of the extra fields and their purpose. In the Basic process, we kept only the core fields and removed the rest. Only fields that are required to support other functionality survived. For example, the sprint burndown requires the Task to have the Remaining Work field, and capacity planning requires the Activity field. Here is an example of the Issue work item type. It contains all the core fields plus Effort for forecasting your velocity on the backlog.

Hierarchy is the last area we simplified in the Basic process. Instead of four levels, Basic starts with just two. Users will start with Issues, and those Issues can be broken down into Tasks. For more advanced scenarios, issues may not be enough. Some users may want a way to group their issues into specific deliverables. For these users we are providing the Epic work item type.

See our documentation to learn more about the Basic Process!

Same Great Azure DevOps Functionality

The good news with the Basic process is you still get all the great, rich functionality that comes with any process in Azure DevOps. Boards, Queries, Sprints, Dashboards, and more all work great with Basic. As noted above, we kept the important fields on the work item types so that nothing has been compromised.

Finally, for those users for whom Basic is not quite enough and more is needed, you can customize your process. You can add new work item types, fields, rules, etc. Check out the documentation on customizing an inherited process for more information.

Looking for Feedback

We would love to get some feedback from you about our new Basic Process. We want to know what you like and don’t like. Is it enough? Do you need more? Please email me directly at dahellem@microsoft.com or contact me on Twitter at @danhellem

The post New Basic Process Available in Azure DevOps appeared first on Azure DevOps Blog.

Top Stories from the Microsoft DevOps Community – 2019.02.15


I hope you had another great week – I certainly did! I had the good fortune to spend time building out some fun container-based Azure Pipelines builds for open source projects. Expect a blog post on that soon! In the meantime, here are some other great blog posts that I found this week.

Using Azure DevOps Pipelines to Deploy Azure Functions written in Java
Ten years ago, I came to Microsoft as a former Unix sysadmin who built Java tools on a Mac. That was strange then, but a decade later, we’re all about any language on any platform. Mani Bindra has a great post about using Azure Pipelines to deploy Azure Functions that are written in Java.

Using Environment Variables in Azure DevOps Pipelines
You can define variables within your Azure Pipelines descriptions, but how can you use them from your build scripts? It may not be obvious, but they get translated to environment variables that you can use in bash or PowerShell. K. Scott Allen explains.

BASTA! TFS & Azure DevOps Day
This looks like a great conference coming to Frankfurt at the end of this month. This is a full day of content around software development, delivery and DevOps. I’m especially interested in the “Azure DevOps meets GitHub” talk but – regrettably – I can’t make it to Germany. I hope you’re able to take my spot.

End-to-end Pull Request on Azure DevOps
When Olivier Léger said that this is an “end-to-end” guide to pull requests, I was skeptical, but it’s totally true. He explains how to create and review a pull request, but he doesn’t stop there. He also explains how to use Azure Pipelines to validate that pull request and even use triggers to deploy it automatically.

As always, if you’ve written an article about Azure DevOps or find some great content about DevOps on Azure then let me know! I’m @ethomson on Twitter.

The post Top Stories from the Microsoft DevOps Community – 2019.02.15 appeared first on Azure DevOps Blog.

NASCAR racing team Hendrick Motorsports finds competitive edge with Microsoft Teams
