
The top 6 ways to boost ratings, reviews, and feedback for your app


A frequent question we get from developers is, “How can I encourage customers to review and give feedback on my app?”

User feedback isn’t just useful for product development—it can significantly affect whether a new customer decides to download your product. So, aside from checking your reviews report and feedback report, what can you do to boost your rating and get more reviews and feedback?

We’ve compiled 6 quick tips to help you out.

Ask for reviews

Be direct! It’s the best way to let customers know you want to hear what they think.

1) Ask in-app

Prompt your customers in-app with a rating and review dialog by implementing just a few lines of code. Or you can directly launch the rating and review page for your app in the Microsoft Store. You can also let customers launch Feedback Hub directly from your app.
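
The trickiest part is usually deciding when to ask. As a rough sketch (the thresholds, state keys, and function name are our own illustration, not any Store API), gating logic for the prompt might look like:

```python
# Illustrative sketch: only prompt engaged users, and never prompt twice.
# The state keys and thresholds here are assumptions for the example; the
# actual dialog is shown via the platform's own rating-and-review call.

def should_prompt_for_review(state):
    """Decide whether this launch is a good moment to ask for a review."""
    return (
        not state.get("has_prompted", False)           # never ask twice
        and state.get("launch_count", 0) >= 5          # user is engaged
        and state.get("crashed_last_session", False) is False  # good mood
    )

state = {"launch_count": 7, "has_prompted": False, "crashed_last_session": False}
print(should_prompt_for_review(state))  # True: engaged and not yet prompted
```

Tuning these conditions matters: asking too early, or right after a crash, tends to attract the very reviews you don't want.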

2) Ask with push notifications

Send a toast or tile notification asking for ratings and reviews. To target only customers who haven’t yet rated your app, first create a customer segment with the definition Has rated == false, then use push notifications to engage them.
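
Conceptually, the Has rated == false segment is just a filter over your customer list; the notification then goes only to the matching segment. A minimal sketch (field names are assumptions for illustration):

```python
# Illustrative sketch of a "Has rated == false" segment: keep only the
# customers who haven't rated, then target the push notification at them.

customers = [
    {"id": "a", "has_rated": True},
    {"id": "b", "has_rated": False},
    {"id": "c", "has_rated": False},
]

segment = [c for c in customers if not c["has_rated"]]
print([c["id"] for c in segment])  # ['b', 'c']
```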

Defining inclusion conditions for push notifications.

3) Ask on Microsoft Store

Add a request for reviews and feedback in the What’s new in this version section or the description of your Microsoft Store listing. After you list any updates, let your customers know you’re looking for feedback on what you’ve updated by saying something like “Please help us improve your experience by reviewing our app.”

"What's new in this version" app summary.

Screenshot from MEGA Privacy Microsoft Store listing page.

Respond to your customers

Thoughtful responses tell your customers you care about their experiences.

4) Respond directly to reviews

Let customers know their opinions matter. So long as they haven’t opted out of receiving responses, you can respond to customer reviews via API or in the dashboard. Read more on how to do that here.

Example of responding directly to customer reviews.

Screenshot from Sketchable reviews page on their Microsoft Store listing

5) Respond directly to feedback

Want to address a particular comment? Use Feedback Hub in your app. You can respond publicly or privately and post updates on the status of any user issue you’re working to resolve.

Keep your responses shorter than 1,000 words, polite, and to the point. Read more on responding to feedback here.

When responding to reviews and feedback, keep your responses polite and respectful, because customers can report inappropriate or abusive responses, and we take those reports seriously.

6) Fix the pain points

If several reviewers are suggesting the same new feature or fixes, consider releasing a new version with those updates—and when you do, make a note in the app description or “What’s new in this version” section so customers know what issues have been fixed. You might even mention that the issues were identified based on customer feedback and ask them to continue letting you know what they think.
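
As a rough illustration of spotting those recurring themes (the review text and keyword list below are invented for the example), a simple count over reviews can surface the most-requested fixes:

```python
# Sketch: count how often each theme appears across reviews so the
# most-requested features and fixes rise to the top.

from collections import Counter

reviews = [
    "please add dark mode",
    "crashes on startup",
    "dark mode would be great",
    "needs offline support",
    "still crashes on startup",
]

themes = ["dark mode", "crashes", "offline"]
counts = Counter(t for r in reviews for t in themes if t in r)
print(counts.most_common())  # [('dark mode', 2), ('crashes', 2), ('offline', 1)]
```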

We hope these tips will help you let your customers know you want their input—and then let them know they’ve been heard.

The post The top 6 ways to boost ratings, reviews, and feedback for your app appeared first on Windows Developer Blog.


AI, Machine Learning and Data Science Roundup: September 2018


A monthly roundup of news about Artificial Intelligence, Machine Learning and Data Science. This is an eclectic collection of interesting blog posts, software announcements and data applications from Microsoft and elsewhere that I've noted over the past month or so.

Open Source AI, ML & Data Science News

ONNX 1.3 released. The standard for representing predictive models adds a cross-platform API for executing graphs on optimized backends and hardware.

A new foundation supporting the development of scikit-learn, the Python machine learning library.

Google open sources "dopamine", a Tensorflow-based framework for reinforcement learning research.

Google adds a "What-If Tool" to Tensorboard, to interactively explore the robustness and algorithmic fairness of machine learning models.

Industry News

Google introduces Dataset Search, an index of public datasets from the environmental and social sciences, government, and journalism.

The 2018.3 update to the Alteryx analytics platform brings interactive graphics, and Spark and Jupyter Notebook integration.

AWS Deep Learning AMIs now provide Tensorflow 1.10 and PyTorch with CUDA 9.2 support.

Rigetti Computing launches the Quantum Cloud Service with a $1M prize for the first application demonstrating "quantum advantage".

Thomas Dinsmore's commentary on Forrester Wave rankings for "Multimodal Predictive Analytics and Machine Learning Platforms" and "Notebook-Based Predictive Analytics and Machine Learning Solutions".

Microsoft News

Microsoft acquires Lobe, a Bay Area startup that produced a drag-and-drop interface for building machine learning models.

NVIDIA GPU Cloud now provides ready-to-run containers with GPU-enabled deep learning frameworks for use on Azure.

Microsoft introduces Azure CycleCloud, a tool for creating, managing, operating, and optimizing burst, hybrid, and cloud-only HPC clusters. Schedulers including Slurm, Grid Engine and Condor are supported.

New features in Azure HDInsight: Spark 2.3.0 and Kafka 1.1 support; ML Services 9.3 integration, with updated R engine and new statistical and machine learning algorithms; Apache Phoenix, for SQL-like queries for data in HBase; Apache Zeppelin, web-based notebooks for querying Phoenix tables; and more.

Learning resources

A review of the algorithms behind AutoML systems for model selection and hyperparameter optimization, from the H2O blog.

Joel Grus's criticisms of Jupyter Notebooks as a platform for reproducible and production-ready computing. Yihui Xie offers an alternative: RMarkdown.

A comprehensive introduction to mixed-effects models, and fitting them in Python.

A collection of videos, presentations and essays by Brandon Rohrer with approachable explanations of the inner workings of deep learning and machine learning algorithms.

A tutorial on using Azure Batch AI to parallelize forecasts of energy demand.

A meticulously-researched accounting of the resources (natural, technological, and human) that enable an Alexa voice query, presented as a highly-detailed map.

A review of the book SQL Server 2017 Machine Learning Services with R.

Applications

Data scientists use text similarity analyses in R to try and identify the author of that anonymous NYT op-ed.

Sketch2Code, an application to translate hand drawings into HTML forms.

Measuring building footprints from satellite images, with semantic segmentation.

Near real-time fraud detection for mobile banking, with a classification model implemented in Azure Machine Learning.

Credit card fraud detection with an Autoencoder neural network, using the Azure Data Science VM.

Shell deploys machine learning and AI systems to avert equipment failures, autonomously direct drill-bits underground, and improve safety.

Find previous editions of the monthly AI roundup here

Deep dive into Azure Artifacts


Azure Artifacts manages the dependencies used in your codebase and provides easy tools to ensure the immutability and performance of those components. Released as one of the new services available for developers in Azure DevOps, the current features in Artifacts will help you and your users produce and consume artifacts. For teams that use or produce binary packages, Azure Artifacts provides a secure, highly performant store and easy feed.

Getting started with Artifacts: Package feeds

Azure Artifacts groups packages into feeds, which are containers that help you consume and publish packages.

Package feeds

We’ve optimized default settings to be most useful to feed users, such as making your feed visible to everyone in your account, so it's easy to share a single source of packages across your entire team. However, if you’d like to customize your settings, simply access the settings tab to update your preferences.

New feature: Universal Packages

Azure Artifacts is a universal store for all the artifacts you use as part of development and deployment. In addition to NuGet, npm, and Maven packages, feeds now support Universal Packages, which can store any file or set of files. You create and consume Universal Packages via the Visual Studio Team Services (VSTS) CLI. Consider using them to store deployment inputs like installers, large datasets or binary files that you need during development, or as a versioned container for your pipeline outputs. To try them out, look for the Universal Packages toggle in your preview features panel by clicking your profile image in the upper right, followed by clicking on “Preview features”. You can also learn more in the Universal Packages announcement post.

Next up, enabling Views

The views in Azure Artifacts enable you to share subsets of the NuGet and npm package-versions in your feed with consumers. A common use for views is to share package-versions that have been tested, validated, or deployed but hold back packages still under development and not ready for public consumption.
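
The mechanics can be pictured with a small sketch: each package version is promoted to a quality level, and a view exposes only the versions at its level. The data model below is an illustrative stand-in, not the actual feed schema:

```python
# Sketch of the views idea: consumers pointed at the "Release" view see
# only promoted versions, while in-development versions stay hidden.

versions = [
    {"pkg": "mylib", "version": "1.0.0", "view": "Release"},
    {"pkg": "mylib", "version": "1.1.0-beta", "view": "Prerelease"},
    {"pkg": "mylib", "version": "1.1.0", "view": "Release"},
]

def visible_in(view, versions):
    """Return the version strings a consumer of this view can see."""
    return [v["version"] for v in versions if v["view"] == view]

print(visible_in("Release", versions))  # ['1.0.0', '1.1.0']
```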

Views

Views and upstream sources are designed to work together to make it easy to produce and consume packages at enterprise scale.

Control your dependencies with Upstream Sources

Upstream sources enable you to use a single feed to store both the packages you produce and the packages you consume from "remote feeds". This includes both public feeds, such as npmjs.com and nuget.org, and authenticated feeds, such as other Azure DevOps feeds in your organization. Once you've enabled an upstream source, any user connected to your feed can install a package from the remote feed, and your feed will save a copy. 

Note: For each component served from the upstream, a copy will always be available to consume, even if the original source is down or, for TFS users, your internet connection isn’t available.

Upstream sources

In short, enabling upstream sources to public sources makes it easy to use your favorite or most used dependencies, and can also give you additional protection against outages and corrupted or compromised packages.
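
The caching behavior described above can be sketched in a few lines (class, package names, and versions are illustrative):

```python
# Minimal sketch of an upstream source: the first install fetches from the
# remote feed and saves a copy; later installs are served from that copy,
# even if the remote is unavailable.

class Feed:
    def __init__(self, upstream):
        self.upstream = upstream      # dict simulating a remote feed
        self.saved = {}               # local copies of upstream packages

    def install(self, name):
        if name in self.saved:        # cached: works even during an outage
            return self.saved[name]
        pkg = self.upstream[name]     # first request goes to the remote
        self.saved[name] = pkg        # the feed keeps a copy
        return pkg

feed = Feed(upstream={"left-pad": "1.3.0"})
feed.install("left-pad")              # fetched from upstream, copy saved
feed.upstream.clear()                 # simulate the remote going down
print(feed.install("left-pad"))       # 1.3.0, served from the saved copy
```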

Easy to use Symbols and the Symbol Server

To debug compiled executables, especially executables compiled from native code languages like C++, you need symbol files that contain debugging information. Artifacts makes Symbol support and publishing quick and simple.

The updated “Index Sources and Publish Symbols” task now publishes symbols to the Azure DevOps Symbol Server with a single checkbox. No advanced configuration or file share setup is required.

Symbols

We also have made it simple to consume symbols from Visual Studio:

  1. With VS2017 Update 4.1 (version 15.4.1) or later, type “debugging symbols” in Quick Launch and press Enter.
  2. Click the “New Azure DevOps Symbol Server Location…” button (marked in red below). In the dialog that appears, select your Azure DevOps account and click “Connect”.

When you are done, it should look like this:

Symbols from Visual Studio

If you prefer debugging with the new UWP version of WinDbg, these docs will help you configure your Azure DevOps account on the WinDbg sympath.

Credential Provider authentication for NuGet in Azure Artifacts

Azure Artifacts secures all the artifacts you publish. However, historically it’s been a challenge to get through security to use your NuGet packages, especially on Mac and Linux. Today, that changes with the new Azure Artifacts Credential Provider. We’ve automated the acquisition of credentials needed to restore NuGet packages as part of your .NET development workflow, whether you’re using MSBuild, dotnet, or NuGet(.exe) on Windows, Mac, or Linux. Any time you want to use packages from an Azure Artifacts feed, the Credential Provider will automatically acquire and store a token on behalf of the NuGet client you're using. To learn more and get the new Credential Provider, see the readme on GitHub.

Supported protocols versions and compatibility

Some package management services are only compatible with specific versions of TFS. The table below provides the information needed to understand version compatibility.

Feature | Azure DevOps Services | TFS
NuGet | Yes | TFS 2017
npm | Yes | TFS 2017 Update 1 and newer
NuGet.org upstream source | Yes | TFS 2018 Update 2 and newer
Maven | Yes | TFS 2018

Further info

Want to learn more? See our documented best practices, videos, and other learning materials for Azure Artifacts.

We also maintain a list of requested features through our UserVoice, and love to complete requests for our passionate users! Always feel free to message us on Twitter (@had_msft or @VSTS) with questions or issues!

Azure.Source – Volume 50


Now in preview

Jenkins Azure ACR Build plugin now in public preview - Azure Container Registry (ACR) Build is a suite of features within ACR. It provides cloud-based container image building for Linux, Windows, and ARM, and can automate OS and framework patching for your Docker containers. Now you can use Azure ACR Plugin in Jenkins to build your Docker image in Azure Container Registry based on git commits or from a local directory. One of the best things about ACR build is you only pay for the compute you use to build your images.

Also in preview

The Azure Podcast

The Azure Podcast | Episode 247 - Partner Spotlight - Snowflake - Cynthia, Evan and Cale talk to Leo Giakoumakis, head of Snowflake's Seattle Development Center, about their Data warehouse platform built for the cloud (and now available on Azure).

Now generally available

Immutable storage for Azure Storage Blobs now generally available - Many industries are required to retain business-related communications in a Write-Once-Read-Many (WORM) or immutable state that ensures they are non-erasable and non-modifiable for a specific retention interval. Immutable storage for Azure Storage Blobs addresses this requirement and is available in all Azure public regions. Through configurable policies, users can keep Azure Blob storage data in an immutable state where Blobs can be created and read, but not modified or deleted.

Also generally available

The IoT Show

The IoT Show | Enhanced IoT Edge developer experience in VS Code - Joe Binder, PM on the Visual Studio team, shows us the latest and greatest additions to the Azure IoT Edge extension for Visual Studio Code, such as solution templates and debugging features.

The IoT Show | Azure IoT DevKit OTA Firmware update - Arthur Ma, developer lead on the Visual Studio team, joins Olivier on the IoT Show to demonstrate how to do a firmware update of a microcontroller over the air with Azure IoT Hub.

News and updates

Remote Monitoring Solution allows for root cause analysis with Azure Time Series Insights - Azure Time Series Insights is now integrated into the Azure IoT Remote Monitoring solution accelerator. With Time Series Insights, you can gain deeper insights into your time-series sensor data by spotting trends, anomalies, and correlations across real-time and historical data in all your locations. All message data from your IoT devices will be stored in Time Series Insights, but your alarms, rules, and configuration settings will remain in Azure Cosmos DB.

Screenshot of telemetry data in the Time Series Insights explorer

HDInsight tools for Visual Studio Code: simplifying cluster and Spark job configuration management - HDInsight Tools for Visual Studio Code now uses Visual Studio Code's built-in user settings and workspace settings to manage HDInsight clusters in Azure regions worldwide and Spark job submissions. With this feature, you can manage your linked clusters and set your preferred Azure environment with Visual Studio Code user settings. You can also set your default cluster and manage your job submission configurations via Visual Studio Code workspace settings.

The Azure DevOps Podcast

The Azure DevOps Podcast | Donovan Brown on How to Use Azure DevOps Services - Episode 002 - Jeffrey Palermo and Donovan Brown talk about the whirlwind it’s been since the launch of the new Azure DevOps, key information new developers might want to know when beginning to use or incorporate Azure DevOps, some of the changes to their services, what’s available for packages in DevOps, the free build capabilities Microsoft is giving to open source projects, some of the new capabilities around GitHub integration, and more!

Technical content and training

Getting AI/ML and DevOps working better together - The difficult part about integrating AI or ML into an application is getting the model deployed into a production environment and keeping it operational and supportable. AI/ML projects need to incorporate some of the operational and deployment practices that make DevOps effective, and DevOps projects need to accommodate the AI/ML development process to automate the deployment and release process for AI/ML models. See this post for suggestions for bridging the gap between DevOps and AI/ML projects, based on lessons learned from several Microsoft projects including the Mobile Bank Fraud Solution.

Deep dive into Azure Repos - Azure Repos (a service in Azure DevOps) has unlimited free private repositories with collaborative code reviews, advanced file management, code search, and branch policies to ensure high quality code. Learn how Azure Repos is great for small projects as well as large organizations that need native AAD support and advanced policies.

Deep dive into Azure Test Plans - Azure Test Plans (a service in Azure DevOps) provides a browser-based test management solution for exploratory, planned manual, and user acceptance testing. Azure Test Plans also provides a browser extension for exploratory testing and gathering feedback from stakeholders.

Deep dive into Azure Artifacts - Azure Artifacts (a service in Azure DevOps) manages the dependencies used in your codebase and provides easy tools to ensure the immutability and performance of those components. For teams that use or produce binary packages, Azure Artifacts provides a secure, highly performant store and easy feed.

Programmatically onboard and manage your subscriptions in Azure Security Center - To streamline the security aspects of the DevOps lifecycle, ASC has recently released its official PowerShell module. This enables organizations to programmatically automate onboarding and management of their Azure resources in Azure Security Center and add the necessary security controls. This blog post focuses on how to use this PowerShell module to onboard Azure Security Center. You can now use the official PowerShell cmdlets with automation scripts to programmatically iterate on multiple subscriptions/resources, reducing the overhead caused by manually performing these actions, as well as reducing the potential risk of human error resulting from manual actions.

See also

Azure Friday

Azure Friday | Batch and matrix routing with Azure Maps - Julie Kohler joins Scott Hanselman and shows you how to execute batch routing calls using Azure Maps, as well as how to do matrix routing with a given set of origins and destinations.

Azure Friday | Batch geocoding and polygons for administrative areas with Azure Maps - Julie Kohler joins Scott Hanselman and shows you how to execute batch geocoding calls using Azure Maps as well as how to get the polygon for an administrative area on a map.

Azure Friday | Mapping IP to location with Azure Maps - Julie Kohler joins Scott Hanselman and shows you how to get the ISO country code for a provided IP address to tailor your application to your customers' needs based on geographic location.

Azure Friday | Calculating isochrones in Azure Maps - Julie Kohler joins Scott Hanselman and shows you how you can generate an isochrone using Azure Maps. Isochrones represent the reachable range from a given point using a set of constraints such as fuel, energy or time.

Events

Microsoft Ignite 2018 - If you're not able to join us next week for this premier event, be sure to watch the live stream online from Orlando.

Come check out Azure Stack at Ignite 2018 - If you're attending Microsoft Ignite in Orlando next week, the Azure Stack team has put together a list of sessions along with a pre-day event to ensure that you will enhance your skills on Microsoft’s hybrid cloud solution and get the most out of this year’s conference. The agenda is tailored for developers who use Azure Stack to develop innovative hybrid solutions using services on Azure Stack and Azure, as well as operators who are responsible for the operations, security, and resiliency of Azure Stack itself.

See also

Azure tips & tricks

How to work with extensions in Azure App Service thumbnail

How to work with extensions in Azure App Service - Learn how to add extensions to web applications in Azure App Service. Watch to see the list of the various extensions from Microsoft and external parties that you can choose from and how to create your own.

How to test web applications in production thumbnail

How to test web applications in production - Learn how to test web applications in production using Azure App Service. This testing-in-production feature makes it easier for you to carry out A/B testing with different versions of your application.
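
Under the hood, the idea is traffic splitting between deployment slots. A sketch of the concept (the hashing scheme, slot names, and percentage are our own illustration, not App Service's actual routing implementation):

```python
# Sketch of weighted traffic routing for testing in production: a fixed
# percentage of requests goes to a staging slot. Hashing the client ID
# keeps each user pinned to the same slot across requests.

import hashlib

def pick_slot(client_id, staging_percent=10):
    """Deterministically map a client to 'staging' or 'production'."""
    bucket = int(hashlib.sha256(client_id.encode()).hexdigest(), 16) % 100
    return "staging" if bucket < staging_percent else "production"

slots = [pick_slot(f"user-{i}") for i in range(1000)]
print(slots.count("staging"))  # roughly 100 of 1000 with a 10% split
```

Pinning users to a slot (rather than flipping a coin per request) matters for A/B testing, since each user sees a consistent version of the app.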

Customers and partners

Simplify modern data warehousing with Azure SQL Data Warehouse and Fivetran - Fivetran has certified their zero-maintenance, zero-configuration data pipelines product for Azure SQL Data Warehouse. Fivetran is a simple-to-use system that enables customers to load data from applications, file stores, databases, and more into Azure SQL Data Warehouse. In addition, Azure SQL Data Warehouse is included in Fivetran’s Cloud Data Warehouse Benchmark, which helps compare TPC-DS 1TB performance across cloud providers.

Tuesdays with Corey

Tuesdays with Corey | What up with Azure File Sync - Corey Sanders, Corporate VP of the Microsoft Azure Compute team, sat down with Will Gries, Senior PM on the Azure Storage team, to talk about Azure File Sync!

Industries

Improve patient engagement and efficiency with AI powered chatbots - Patients are more likely to give a higher rating for patient engagement if they have some way to communicate to an organization with as little friction as possible. Improving the patient engagement experience can also improve patient outcomes — the hope is to do it affordably. AI-powered health agents, also known as “chatbots,” can engage patients 24/7, using different languages, and at very low cost. Read this post to get an overview of powerful cloud-based tools you can use to rapidly create and deploy your AI-powered health agents in Microsoft Azure.

A Cloud Guru's Azure This Week

Azure This Week for 21 September 2018 thumbnail

Azure This Week | 21 September 2018 - Dean looks at Video Indexer, which is now GA, how you can save money and increase performance with GPUs vs CPUs for deep learning, as well as updating you on a brand new Azure course on A Cloud Guru.

Applications of R presented at EARL London 2018


During the EARL (Enterprise Applications of the R Language) conference in London last week, the organizers asked me how I thought the conference had changed over the years. (This is the conference's fifth year, and I'd been to each one.) My response was that it reflected the increasing maturity of R in the enterprise. The early years featured many presentations that were about using R in research and the challenges (both technical and procedural) for integrating that research into the day-to-day processes of the business. This year though, just about every presentation was about R in production, as a mainstream part of the operational infrastructure for analytics. 

That theme began in earnest with Garrett Grolemund's keynote presentation on the consequences of scientific research that can't be replicated independently. (This slide, based on this 2016 JAMA paper, was an eye-opener for me.) The R language has been at the forefront of providing the necessary tools and infrastructure to remove barriers to reproducible research in science, and the RMarkdown package, with its streamlined integration in RStudio, is particularly helpful. Garrett's keynote was recorded but the video isn't yet available; when it's published I highly recommend taking the time to watch this excellent talk.

The rest of the program was jam-packed with a variety of applications of R in industry as well. I couldn't see them all (it was a three-track conference), but every one I attended demonstrated mature, production-scale applications of R to solve difficult business problems with data. Here are just a few examples:

  • Rainmakers, a market research firm, uses R to estimate the potential market for new products. They make use of the officer package to automate the generation of PowerPoint reports.
  • Geolytix uses R to help companies choose locations for new stores that maximize profitability while reducing the risk of cannibalizing sales from other nearby locations. They use SQL Server ML Services to deploy these models to their clients.
  • Dyson uses R (and the prophet package) to forecast the expected sales of new models of vacuum cleaners, hand dryers, and other products, so that the manufacturing plants can ramp up (or down) production as needed.
  • Google uses R to design and analyze the results of customer surveys, to make the best decisions of which product features to invest in next.
  • N Brown Group, a fashion retailer, uses R to analyze online product reviews from customers.
  • Marks and Spencer uses R to increase revenues by optimizing the products shown and featured in the online store.  
  • PartnerRe, the reinsurance firm, has built an analytics team around R, and uses Shiny to deploy R-based applications throughout the company.
  • Amazon Web Services uses containerized applications with R to identify customers who need additional onboarding assistance or who may be dissatisfied, and to detect fraud.
  • Microsoft uses R and Spark (via sparklyr in HDInsight) to support marketing efforts for Xbox, Windows and Surface with propensity modeling, to identify who is most likely to respond to an offer. 

You can see many more examples in the list of speakers linked below. (I blogged about my own talk at EARL earlier this week.) Click through to see detailed summaries and (in most cases) a link to download slides.

EARL London 2018: Speakers 

Because it’s Friday: Fly Strong


I was about the same age as student pilot Maggie Taraska when I had my first solo flight. Unlike Maggie, I didn't have to deal with a busy airspace, or air traffic control, or engines (I was in a glider), or — most significantly — one of my landing wheels falling off during the flight. But Maggie handled the entire situation much more coolly than I remember handling my own short, uneventful first solo. Just listen to the radio chatter below. If you're in a rush, you can skip the period from 2:00 – 7:30 while she circles the Beverly (MA) airport waiting for a couple of other planes to land first, but the whole thing is worth a listen to hear how she successfully lands with only one wheel on the left side.

 

That's all from us at the blog for this week. Have a great weekend, and we'll be back next week.

CppCon 2018 Sweepstakes Official Rules


MICROSOFT CLOUD AND ENTERPRISE CPPCON EVENT SWEEPSTAKES

OFFICIAL RULES

NO PURCHASE NECESSARY. 

PLEASE NOTE: It is your sole responsibility to comply with your employer’s gift policies.  If your participation violates your employer’s policies, you may be disqualified. Microsoft disclaims any and all liability and will not be party to any disputes or actions related to this matter.

GOVERNMENT EMPLOYEES: Microsoft is committed to complying with government gift and ethics rules and therefore government employees are not eligible.

WHAT ARE THE START AND END DATES?

This Sweepstakes starts on September 22, 2018, and ends on September 27, 2018 (“Entry Period”) and will be conducted during regular event hours. Winners will be selected Thursday morning, September 27, before the 10:30am session.

Entries must be received within the Entry Period to be eligible.

CAN I ENTER?

You are eligible to enter if you meet the following requirements at the time of entry:

·       You are a registered attendee of CppCon and you are 18 years of age or older; and

·       If you are 18 years of age or older, but are considered a minor in your place of residence, you should ask your parent’s or legal guardian’s permission prior to submitting an entry into this Sweepstakes; and

·       You are NOT a resident of any of the following countries: Cuba, Iran, North Korea, Sudan, and Syria.

·       PLEASE NOTE: U.S. export regulations prohibit the export of goods and services to Cuba, Iran, North Korea, Sudan and Syria. Therefore residents of these countries / regions are not eligible to participate.

·       You are NOT event exhibitor support personnel; and

·       You are NOT an employee of Microsoft Corporation, or an employee of a Microsoft subsidiary; and

·       You are NOT an immediate family member (parent, sibling, spouse, child) or household member of a Microsoft employee, an employee of a Microsoft subsidiary, or any person involved in any part of the administration and execution of this Sweepstakes.

This Sweepstakes is void where prohibited by law.

HOW DO I ENTER?

You can enter by completing a short survey at http://aka.ms/cppcon.  All required survey questions must be answered to receive an entry.

We will only accept one (1) entry per entrant. We are not responsible for entries that we do not receive for any reason, or for entries that we receive but are not decipherable for any reason.

We will automatically disqualify:

·       Any incomplete or illegible entry; and

·       Any entries that we receive from you that are in excess of the entry limit described above.

WINNER SELECTION AND PRIZES

At the close of the event, we will randomly select winners of the prizes designated below.

One (1) Grand Prize.  A Microsoft Xbox One S – Starter Bundle – game console – 1 TB HDD – white.  Approximate Retail Value (ARV) $299.00.

The ARV of electronic prizes is subject to price fluctuations in the consumer marketplace based on, among other things, any gap in time between the date the ARV is estimated for purposes of these Official Rules and the date the prize is awarded or redeemed.  We will determine the value of the prize to be the fair market value at the time of prize award.

The total Approximate Retail Value (ARV) of all prizes: $299

We will only award one (1) prize per person during the Entry Period.

WINNER NOTIFICATION

Winners will be selected randomly and must be present at time of drawing to win.

GENERAL CONDITIONS

Taxes on the prize, if any, are the sole responsibility of the winner.  All federal, state, and local laws and regulations apply.  No substitution, transfer, or assignment of prize permitted, except that Microsoft reserves the right to substitute a prize of equal or greater value in the event the offered prize is unavailable. Prize winners may be required to sign and return an Affidavit of Eligibility and Liability Release and W-9 tax form or W-8 BEN tax form within 10 days of notification. If you complete a tax form, you will be issued an IRS 1099 the following January, for the actual value of the prize. You are advised to seek independent counsel regarding the tax implications of accepting a prize. If a selected winner cannot be contacted, is ineligible, fails to claim a prize, or fails to return the Affidavit of Eligibility and Liability Release or W-8 BEN form, the selected winner will forfeit their prize and an alternate winner will be selected. Your odds of winning depend on the number of eligible entries we receive. In the event of a dispute all decisions of Microsoft are final.

By entering this Sweepstakes you agree:

·       To abide by these Official Rules; and

·       To release and hold harmless Microsoft and its respective parents, subsidiaries, affiliates, employees and agents from any and all liability or any injury, loss or damage of any kind arising from or in connection with this Sweepstakes or any prize won; and

·       That by accepting a prize, you permit Microsoft to use your proper name and state of residence online and in print, or in any other media, in connection with this Sweepstakes, without payment or compensation to you, except where prohibited by law.

WHAT IF SOMETHING UNEXPECTED HAPPENS AND THE SWEEPSTAKES CAN’T RUN AS PLANNED?

If someone cheats, or a virus, bug, bot, catastrophic event, or any other unforeseen or unexpected event that cannot be reasonably anticipated or controlled, (also referred to as force majeure) affects the fairness and / or integrity of this Sweepstakes, we reserve the right to cancel, change or suspend this Sweepstakes.  This right is reserved whether the event is due to human or technical error. If a solution cannot be found to restore the integrity of the Sweepstakes, we reserve the right to select winners from among all eligible entries received before we had to cancel, change or suspend the Sweepstakes.

If you attempt or we have strong reason to believe that you have compromised the integrity or the legitimate operation of this Sweepstakes by cheating, hacking, creating a bot or other automated program, or by committing fraud in ANY way, we may seek damages from you to the fullest extent permitted by law.  Further, we may disqualify you, and ban you from participating in any of our future Sweepstakes, so please play fairly.

WINNERS LIST AND SPONSOR

If you send an email to visualcpp@microsoft.com within 30 days of winner selection, we will provide you with a list of winners that receive a prize worth $25.00 or more.

This Sweepstakes is sponsored by Microsoft Corporation, One Microsoft Way, Redmond, WA 98052.

PRIVACY STATEMENT

At Microsoft, we are committed to protecting your privacy.  Microsoft uses the information you provide to notify prize winners, and to send you information about other Microsoft products and services, if requested by you.  Microsoft will not share the information you provide with third parties without your permission except where necessary to complete the services or transactions you have requested, or as required by law.  Microsoft is committed to protecting the security of your personal information.  We use a variety of security technologies and procedures to help protect your personal information from unauthorized access, use, or disclosure. Your personal information is never shared outside the company without your permission, except under conditions explained above.

If you believe that Microsoft has not adhered to this statement, please notify us by sending email to visualcpp@microsoft.com or postal mail to Microsoft Privacy, Microsoft Corporation, One Microsoft Way, Redmond, WA 98052 USA, and we will use commercially reasonable efforts to remedy the situation.

Announcing automated ML capability in Azure Machine Learning


This post is co-authored by Sharon Gillett, Technical Advisor and Principal Program Manager, Research.

Intelligent experiences powered by machine learning can seem like magic to users. Developing them, however, can be anything but. Consider this “simple” tutorial chart from the scikit-learn machine learning library:


Source: scikit-learn machine learning library

Data scientists and developers face a series of sequential and interconnected decisions along the way to achieving "magic" machine learning solutions. For example, should they transform the input data, and if so, how – by removing nulls, rescaling, or something else entirely? What machine learning algorithm would be best - a support vector machine (SVM), logistic regression, or a tree-based classifier? What parameter values should they use for the chosen classifier – including decisions such as what the max depth and min split count should be for a tree-based classifier? And many more.

Ultimately, all these decisions will determine the accuracy of the machine learning pipeline - the combination of data pre-processing steps, learning algorithms, and hyperparameter settings that go into each machine learning solution.


Unfortunately, the problem of finding the best machine learning pipeline for a given dataset scales faster than the time available for data science projects. Aiming for higher accuracy in one project may delay or preclude pursuing other projects. Or data scientists may have to settle for less-than-optimal accuracy in some projects, so they can make time for others.  And when machine learning solutions need to be maintained as data evolves – for example, models that detect spam or fraud, or models that depend on weather data – there’s always a tradeoff between keeping existing solutions up to date and devoting scarce data scientist time to new problems.
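To make that scaling problem concrete, here is a hedged back-of-the-envelope sketch; the preprocessor names and hyperparameter-grid sizes below are invented purely for illustration:

```python
# Illustrative (made-up) search space: every preprocessor can be paired with
# every algorithm, and each algorithm has its own hyperparameter grid.
preprocessors = ["none", "standard_scale", "min_max", "impute_then_scale"]
hyperparameter_grid_sizes = {"svm": 24, "logistic_regression": 12, "tree": 60}

# Number of distinct pipelines an exhaustive search would have to evaluate.
total_pipelines = sum(len(preprocessors) * n
                      for n in hyperparameter_grid_sizes.values())
print(total_pipelines)  # 384 candidate pipelines from just 3 algorithms
```

Add a few more preprocessors or a finer hyperparameter grid and the count multiplies, which is exactly why exhaustive search quickly outgrows the time available.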
 
But what if a developer or data scientist could access an automated service that identifies the best machine learning pipelines for their labelled data? Automated machine learning (which we abbreviate as automated ML in the rest of this post) is a new capability that does exactly that. Automated ML is now in preview, accessible through the Azure Machine Learning service. Automated ML empowers customers, with or without data science expertise, to identify an end-to-end machine learning pipeline for any problem, achieving higher accuracy while spending far less of their time. And it enables a significantly larger number of experiments to be run, resulting in faster iteration towards production-ready intelligent experiences.


Microsoft is committed to democratizing AI through our products. By making automated ML available through the Azure Machine Learning service, we're empowering data scientists with a powerful productivity tool. We're working on making automated ML accessible through PowerBI, so that business analysts and BI professionals can also take advantage of machine learning. And stay tuned as we continue to incorporate it into other product channels to bring the power of automated ML to everyone.

Now, that's magic! But how does automated ML really work?

Automated ML is based on a breakthrough from our Microsoft Research division. The approach combines ideas from collaborative filtering and Bayesian optimization to search an enormous space of possible machine learning pipelines intelligently and efficiently. It's essentially a recommender system for machine learning pipelines. Similar to how streaming services recommend movies for users, automated ML recommends machine learning pipelines for data sets.

Streaming service (numbers represent user ratings for movies)


Automated ML (numbers represent accuracy of pipelines evaluated on datasets)


As indicated by the distributions shown on the right side of the figures above, automated ML also takes uncertainty into account, incorporating a probabilistic model to determine the best pipeline to try next. This approach allows automated ML to explore the most promising possibilities without exhaustive search, and to converge on the best pipelines for the user’s data faster than competing “brute force” approaches.
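The service's actual model is far more sophisticated, but the flavor of "probabilistic choice of the next pipeline" can be sketched with a simple Thompson-sampling stand-in. Everything below (the pipeline names, observed scores, and Beta posteriors) is our illustrative assumption, not the service's algorithm:

```python
import random

def choose_next_pipeline(history, pipelines, rng):
    """Sample a plausible accuracy for each pipeline from a Beta posterior
    over its observed scores, then pick the pipeline with the highest sample."""
    best, best_sample = None, -1.0
    for p in pipelines:
        scores = history.get(p, [])
        # Beta(1 + "successes", 1 + "failures"), treating accuracy as a rate;
        # a never-tried pipeline gets Beta(1, 1), i.e. maximum uncertainty.
        alpha = 1.0 + sum(scores)
        beta = 1.0 + sum(1.0 - s for s in scores)
        sample = rng.betavariate(alpha, beta)
        if sample > best_sample:
            best, best_sample = p, sample
    return best

rng = random.Random(42)
history = {"logreg": [0.71, 0.69], "lightgbm": [0.93, 0.95], "knn": []}
picks = [choose_next_pipeline(history, list(history), rng) for _ in range(200)]
```

Pipelines with strong observed scores get picked most often, while the never-tried pipeline still gets explored because its posterior is wide, capturing the explore/exploit balance described above.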

Just as important, automated ML accomplishes all this without having to see the customer’s data, preserving privacy. Automated ML is designed to not look at the customer’s data. Customer data and execution of the machine learning pipeline both live in the customer’s cloud subscription (or their local machine), which they have complete control of. Only the results of each pipeline run are sent back to the automated ML service, which then makes an intelligent, probabilistic choice of which pipelines should be tried next.

No need to “see” the data


We trained automated ML’s probabilistic model by running hundreds of millions of experiments, each involving evaluation of a pipeline on a data set. This training now allows the automated ML service to find good solutions quickly for your new problems. And the model continues to learn and improve as it runs on new ML problems – even though, as mentioned above, it does not see your data.

Automated ML is available to try in the preview of Azure Machine Learning. We currently support classification and regression ML model recommendation on numeric and text data, with support for automatic feature generation (including missing value imputation, encoding, normalization, and heuristics-based features), feature transformations and selection. Data scientists can use automated ML through the Azure Machine Learning Python SDK and Jupyter notebook experience. Training can be performed on a local machine or by leveraging the scale and performance of Azure by running it on Azure Machine Learning managed compute. Customers have the flexibility to pick a pipeline from automated ML and customize it before deployment. Model explainability, ensemble models, full support for Azure Databricks and improvements to automated feature engineering will be coming soon.
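To give a flavor of what one of those automatic featurization steps does under the hood, simple mean imputation of missing values can be sketched in plain Python (an illustration only; the service's own implementation is not shown here):

```python
def impute_mean(column):
    """Replace missing entries (None) with the mean of the observed values."""
    observed = [v for v in column if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in column]

print(impute_mean([1.0, None, 3.0]))  # [1.0, 2.0, 3.0]
```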

Get started by visiting our documentation and let us know what you think - we look forward to making automated ML better for you!


Azure AI – Making AI real for business


AI, data and cloud are ushering in the next wave of transformative innovations across industries. With Azure AI, our goal is to empower organizations to apply AI across the spectrum of their business to engage customers, empower employees, optimize operations and transform products. We see customers using Azure AI to derive tangible benefits across three key solution areas.

  • First, using machine learning to build predictive models that optimize business processes.
  • Second, building AI powered apps and agents to deliver natural user experience by integrating vision, speech and language capabilities into web and mobile apps. 
  • Third, applying knowledge mining to uncover latent insights from documents.  

Today, at Microsoft Ignite, we are excited to announce a range of innovations across these areas to make Azure the best place for AI. Let me walk you through them.

Machine Learning 

From pre-trained models to powerful services to help you build your own models, Azure provides the most comprehensive machine learning platform.

To simplify development of speech, vision, and language machine learning solutions, we provide a powerful set of pre-trained models as part of Azure Cognitive Services.  When it comes to building your own deep learning models, in addition to supporting popular frameworks such as PyTorch and TensorFlow, we have worked with Facebook to co-develop ONNX (Open Neural Network Exchange) that enables model interoperability across frameworks. We offer a choice of Machine Learning services with Azure Databricks, Azure Machine Learning and Machine Learning VMs so you can apply machine learning at any scale. To help you build and train models faster, we provide distributed deep learning capabilities over massive GPU clusters. You can also access powerful hardware (FPGA) accelerated models for very high-speed image classification and recognition scenarios at low cost. Once you have built your model, you can deploy on-premises, in the cloud or on the edge, including offline environments.


ML on Azure

Today, we are announcing several updates to Azure Machine Learning, available in preview.

  • Automated machine learning and hyper-parameter tuning to help you accelerate the rate of model development by identifying suitable algorithms, features and ML pipelines faster.
  • Distributed deep learning to enable you to develop deep learning solutions faster with massive, managed GPU clusters. 
  • Hardware accelerated inferencing for high speed image classification and recognition scenarios using powerful FPGAs. Supported models include ResNet 50, ResNet 152, VGG-16, SSD-VGG, and DenseNet-121 that can be trained using your data.   
  • A new Python SDK to enable you to access Azure Machine Learning from your favorite Python development environment, including Visual Studio Code, Visual Studio, PyCharm, Azure Databricks notebooks, or Jupyter notebooks.
  • Model management capabilities to help you manage Dockerized models using a model and image registry that integrates into your continuous integration and delivery (CI/CD) pipeline.

“By using Azure Machine Learning, we can meet a variety of customer needs, learn from all the analytics, and put that learning into new products and services to improve our offerings”.

– Rosemary Yeilding Radich, Director of Data Science, AccuWeather

AI apps and agents

Azure provides a number of services specifically designed to help you build AI powered apps & agents. As outlined in the previous section, Azure Cognitive Services allow you to infuse your apps with powerful pre-trained models that can help with vision, speech, language and web search. We have over 1.2 million developers using Cognitive Services to deliver AI led experiences to their users. We continue to build on that momentum, and today we are excited to announce the general availability of the Speech service, which combines the following capabilities into one service:

  • Speech to Text
  • Text to Speech
  • Custom Speech
  • Speech Translation
  • Custom Voice

Additionally, Content Moderator, Computer Vision, Face, Text Analytics, Translator Text, and Language Understanding APIs will be generally available in US Government regions in the next few weeks. 

Microsoft has reached another AI milestone in text-to-speech synthesis with a system that uses deep neural networks to make the voices of computers nearly indistinguishable from recordings of people. This capability is available in preview through Azure Cognitive Speech Service and you can sign up for access.

Azure Bot Service helps you build, connect, deploy, and manage intelligent agents that interact naturally with your users in apps or on the web. Native integration with Cognitive Services enables you to easily leverage cognitive capabilities to deliver an engaging experience to your users. For example, you can use Language Understanding to better interact with your users by understanding context and continuing a conversation in a very natural manner. The Bot Service leverages the Microsoft Bot Framework, a set of open source SDKs for building conversational experiences. Bot Framework also includes an emulator and a set of CLI tools to streamline the creation and management of bots and their language understanding services. There are over 340K users of the Bot Framework. Today, we are making Bot Framework v4 generally available, offering broader language support (C# and JavaScript generally available, Python and Java in preview) and better extensibility to benefit from a rich ecosystem of pluggable components like dialog management and machine translation.

We have a number of customers such as Progressive, Dixons Carphone, UPS, and Adobe building AI powered apps & agents.

Knowledge Mining

Organizations have valuable information lying latent in historical records across multiple data formats that are difficult to explore, such as forms, PDFs and images. Unfortunately, this information is hidden, not searchable, and hence not useful. Azure Cognitive Search (in preview) adds Cognitive Services on top of Azure Search, helping you glean knowledge from vast amounts of data. With Cognitive Search, you can easily add object recognition, computer vision, entity extraction and other cognitive skills to explore various data types and then bring all the output together to uncover opportunities. Since announcing the preview of Cognitive Search at Microsoft Build earlier this year, we have seen rapid adoption, with tens of millions of documents getting processed every month.


“Azure Cognitive Search has helped us deliver an intelligent, easy-to-use platform to our customers that surfaces actionable insights from tons of unstructured data – using entity recognition, language translation and other cognitive capabilities”.

  – Monish Darda, CTO and co-founder, Icertis

As we continue to make Azure the best place for AI, we are most excited to see what customers can do with AI. The opportunities are limitless, and we are looking forward to seeing what you create with Azure AI. Get started today.

Additional resources

What’s new in Azure Machine Learning service


Today we are very happy to release the new capabilities for the Azure Machine Learning service. Since our initial public preview launch in September 2017, we have received an incredible amount of valuable and constructive feedback. Over the last 12 months, the team has been very busy enhancing the product, addressing feedback, and adding new capabilities. It is extremely exciting to share these new improvements with you. We are confident that they will dramatically boost productivity for data scientists and machine learning practitioners in building and deploying machine learning solutions at cloud scale.

In this post, I want to highlight some of the core capabilities of the release with a bit more technical details.

Python SDK & Machine Learning workspace

Most of the recent machine learning innovations are happening in the Python language space, which is why we chose to expose the core features of the service through a Python SDK. You can install it with a simple pip install command, preferably in an isolated conda virtual environment.

# install just the base package
$ pip install azureml-sdk

# or install additional capabilities such as automated machine learning
$ pip install azureml-sdk[automl]

You will need access to an Azure subscription in order to fully leverage the SDK. The first thing you need to do is create an Azure Machine Learning workspace. You can do that with the Python SDK or through the Azure portal. A workspace is the logical container of all your assets, and also the security and sharing boundary.

workspace = Workspace.create(name='my-workspace',
                             subscription_id='<azure-subscription-id>',
                             resource_group='my-resource-group')

Once you create a workspace, you can access a rich set of Azure services through the SDK and start rocking data science. For more details on how to get started, please visit the Get started article.

Data preparation

Data preparation is an important part of a machine learning workflow. Your models will be more accurate and efficient if they have access to clean data in a format that is easy to consume. You can use the SDK to load data of various formats, transform it to be more usable, and write that data to a location for your models to access. Here is a cool example of transforming a column into two new columns by supplying a few examples:

df2 = df1.derive_column_by_example(source_columns='start_time',
                                   new_column_name='date',
                                   example_data=[
                                   ('2017-12-31 16:57:39.6540', '2017-12-31'),
                                   ('2017-12-31 16:57:39', '2017-12-31')])
df3 = df2.derive_column_by_example(source_columns='start_time',
                                   new_column_name='wday',
                                   example_data=[('2017-12-31 16:57:39.6540', 'Sunday')])

For more details, please visit this article.

Experiment tracking

Model training is an iterative process. The Azure Machine Learning service can track all your experiment runs in the cloud. You can run an experiment locally or remotely, and use the logging APIs to record any metric or upload any file.

exp = Experiment(workspace, "fraud detection")
with exp.start_logging() as run:
    # your code to train the model omitted
    ...

    run.log("overall accuracy", acc)  # log a single value
    run.log_list("errors", error_list)  # log a list of values
    run.log_row("boundaries", xmin=0, xmax=1, ymin=-1, ymax=1)  # log a row of key/value pairs
    run.log_image("AUC plot", plt)  # log a matplotlib plot
    run.upload_file("model", "./model.pkl")  # upload a file

Once the run is complete (or even while the run is being executed), you can see the tracked information in the Azure portal.


You can also query the experiment runs to find the one that recorded the best metric as defined by you, such as highest accuracy, lowest mean squared error, and more. You can then register the model produced by that run in the model registry under the workspace. You can also take that model and deploy it.

# run_metrics: dict mapping run ids to their logged metrics (e.g., built via run.get_metrics())
# Find the run that has the highest accuracy metric recorded.
best_run_id = max(run_metrics, key=lambda k: run_metrics[k]['accuracy'])

# reconstruct the run object based on run id
best_run = Run(experiment, best_run_id)

# register the model produced by that run
best_run.register_model('best_model', 'outputs/model.pkl')

Find more detailed examples of how to use experiment logging APIs.

Scale your training with GPU clusters

It is OK to tinker with small data sets locally on a laptop, but training a sophisticated model might require large-scale compute resources in the cloud. With the Python SDK, you can simply attach existing Azure Linux VMs, or attach an Azure HDInsight Spark cluster, to execute your training jobs. You can also easily create a managed compute cluster, also known as an Azure Batch AI cluster, to run your scripts.

pc = BatchAiCompute.provisioning_configuration(vm_size="STANDARD_NC6",
                                               autoscale_enabled=True,
                                               cluster_min_nodes=0,
                                               cluster_max_nodes=4)
cluster = ComputeTarget.create(workspace, "my-gpu-cluster", pc)

The code above creates a managed compute cluster equipped with GPUs (the “STANDARD_NC6” Azure VM size). It can automatically scale up to 4 nodes when jobs are submitted to it, and scale back down to 0 nodes when jobs finish to save cost. It is perfect for running many jobs in parallel, supporting features like intelligent hyperparameter tuning or batch scoring, or for training large deep learning models in a distributed fashion.

Find a more detailed example of how to create a managed compute cluster.

Flexible execution environments

Azure Machine Learning service supports executing your scripts in various compute targets, including your local computer, the aforementioned remote VM, a Spark cluster, or a managed compute cluster. For each compute target, it also supports various execution environments through a flexible run configuration object. You can ask the system to simply run your script in a Python environment you have already configured in the compute target, or ask the system to build a new conda environment based on specified dependencies to run your job. You can even ask the system to download a Docker image to run your job. We supply a few base Docker images you can choose from, but you can also bring your own Docker image if you’d like.

Here is an example of a run configuration object that specifies a Docker image with system managed conda environment:

# Create run configuration object
rc = RunConfiguration()
rc.target = "my-vm-target"
rc.environment.docker.enabled = True
rc.environment.python.user_managed_dependencies = False
rc.environment.docker.base_image = azureml.core.runconfig.DEFAULT_CPU_IMAGE

# Specify conda dependencies with scikit-learn
cd = CondaDependencies.create(conda_packages=['scikit-learn'])
rc.environment.python.conda_dependencies = cd

# Submit experiment
src = ScriptRunConfig(source_directory="./", script='train.py', run_config=rc)
run = experiment.submit(config=src)

The SDK also includes a high-level estimator pattern that wraps some of these configurations for TensorFlow and PyTorch-based execution, making it even easier to define the environments. These environment configurations allow maximum flexibility, so you can strike a balance between reproducibility and control. For more details, check out how to configure local execution, remote VM execution, and Spark job execution in HDInsight.

Datastore

With the multitude of compute targets and execution environments support, it is critical to have a consistent way to access data files from your scripts. Every workspace comes with a default datastore which is based on Azure blob storage account. You can use it to store and retrieve data files. And you can also configure additional datastores under the workspace. Here is a simple example of using datastore:

# upload files from local computer into default datastore
ds = workspace.get_default_datastore()
ds.upload(src_dir='./data', target_path='mnist', overwrite=True)

# pass in datastore's mounting point as an argument when submitting an experiment.
src = ScriptRunConfig(source_directory="./",
                      script="train.py",
                      run_config=rc,
                      arguments=['--data_folder', ds.as_mount()])
run = experiment.submit(src)

Meanwhile, in the training script executed on the compute cluster, the datastore is automatically mounted. All you have to do is get hold of the mounting path.

import argparse, os
parser = argparse.ArgumentParser()
parser.add_argument('--data_folder', type=str)
args = parser.parse_args()
# access data from the datastore mounting point
training_data = os.path.join(args.data_folder, 'mnist', 'train-images.gz')

For more detailed information, see the datastore documentation.

Automated machine learning

Given a training data set, choosing the right data preprocessing mechanisms and the right algorithms can be a challenging task, even for experts. Azure Machine Learning service includes advanced capabilities to automatically recommend a machine learning pipeline composed of the best featurization steps, the best algorithm, and the best hyperparameters, based on the target metric you set. Here is an example of automatically finding the pipeline that produces the highest AUC_weighted value for classifying the given training data set:

# automatically find the best pipeline that gives the highest AUC_weighted value.
cfg = AutoMLConfig(task='classification',
                   primary_metric="AUC_weighted",
                   X = X_train,
                   y = y_train,
                   max_time_sec=3600,
                   iterations=10,
                   n_cross_validations=5)
run = experiment.submit(cfg)

# return the best model
best_run, fitted_model = run.get_output()

Here is a printout of the run. Notice iteration #6 represents the best pipeline which includes a scikit-learn StandardScaler and a LightGBM classifier.

ITERATION PIPELINE DURATION METRIC BEST
0 SparseNormalizer LogisticRegression 0:00:46.451353 0.998 0.998
1 StandardScalerWrapper KNeighborsClassi 0:00:31.184009 0.998 0.998
2 MaxAbsScaler LightGBMClassifier 0:00:16.193463 0.998 0.998
3 MaxAbsScaler DecisionTreeClassifier 0:00:12.379544 0.828 0.998
4 SparseNormalizer LightGBMClassifier 0:00:21.779849 0.998 0.998
5 StandardScalerWrapper KNeighborsClassi 0:00:11.910200 0.998 0.998
6 StandardScalerWrapper LightGBMClassifi 0:00:33.010702 0.999 0.999
7 StandardScalerWrapper SGDClassifierWra 0:00:18.195307 0.994 0.999
8 MaxAbsScaler LightGBMClassifier 0:00:16.271614 0.997 0.999
9 StandardScalerWrapper KNeighborsClassi 0:00:15.860538 0.999 0.999
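Note that the BEST column in this printout is simply a running maximum over the METRIC column, which is easy to verify:

```python
# METRIC values from the ten iterations above
metrics = [0.998, 0.998, 0.998, 0.828, 0.998, 0.998, 0.999, 0.994, 0.997, 0.999]

best_so_far, current = [], float("-inf")
for m in metrics:
    current = max(current, m)
    best_so_far.append(current)

print(best_so_far)  # the dip at iteration 3 (0.828) never lowers BEST
```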

Automated machine learning supports both classification and regression, and it includes features such as handling missing values, early termination by a stopping metric, blacklisting algorithms you don’t want to explore, and many more. To learn more, explore the automated machine learning article.

Intelligent hyperparameter tuning

Once you identify the best algorithm, you would typically search through the various hyperparameter combinations to find the one that gives the best performance. For an iteratively improving algorithm, hyperparameter tuning (aka parameter sweep) can be computationally expensive and time-consuming. The Azure Machine Learning service delivers intelligent hyperparameter tuning capabilities by automatically scheduling parameter search jobs on the managed compute clusters, and through user-defined early termination policies to accelerate the process of finding the best parameter combination. Here is an example of using hyperparameter tuning with an estimator object.

# parameter grid to search
parameter_samples = RandomParameterSampling({
       "--learning_rate": loguniform(-10, -3),
       "--batch_size": uniform(50, 300),
       "--first_layer_neurons": choice(100, 300, 500),
       "--second_layer_neurons": choice(10, 20, 50, 100)
})

# early termination policy
policy = BanditPolicy(slack_factor=0.1, evaluation_interval=2)

# hyperparameter tuning configuration
htc = HyperDriveRunConfig(estimator=my_estimator,
                          hyperparameter_sampling=parameter_samples,
                          policy=policy,
                          primary_metric_name='accuracy',  
                          primary_metric_goal=PrimaryMetricGoal.MAXIMIZE,
                          max_total_runs=200,
                          max_concurrent_runs=10,
)
run = experiment.submit(htc)

The above code will randomly search the given parameter space, launching up to 200 jobs on the compute target configured on the estimator object, and look for the one that returns the highest accuracy. The BanditPolicy checks the accuracy reported by each job every two iterations, and terminates the job if its “accuracy” value is not within the 10 percent slack of the highest “accuracy” reported so far by other runs. Find a more detailed description of intelligent hyperparameter tuning.
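That termination rule can be written down explicitly; the function below is our reading of the policy for a maximized metric, not the SDK's implementation:

```python
def bandit_should_terminate(run_metric, best_metric, slack_factor=0.1):
    """Terminate a run whose metric falls below best_metric / (1 + slack_factor)."""
    return run_metric < best_metric / (1 + slack_factor)

# With slack_factor=0.1 and a best accuracy of 0.95, the cutoff is ~0.8636:
print(bandit_should_terminate(0.85, 0.95))  # True: below the cutoff, stop the run
print(bandit_should_terminate(0.90, 0.95))  # False: within slack, keep running
```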

Distributed training

When training a deep neural network, it can be more efficient to parallelize the computation over a cluster of GPU-equipped computers. Configuring such a cluster and parallelizing your training script can be a tedious and error-prone task. Azure Machine Learning service provisions managed compute clusters with distributed training capabilities already enabled. Whether you are using the native parameter server option that ships with TensorFlow, an MPI-based approach using the Horovod framework combined with TensorFlow or PyTorch, or CNTK with MPI, you can train the network in parallel with ease.

Here is an example of configuring a TensorFlow training run with MPI support, with the distributed_backend flag set to mpi. In the word2vec.py file you can leverage Horovod, which is automatically installed for you.

tf_estimator = TensorFlow(source_directory="./",
                          compute_target=my_cluster,
                          entry_script='word2vec.py',
                          script_params=script_params,
                          node_count=4,
                          process_count_per_node=1,
                          distributed_backend="mpi",
                          use_gpu=True)

Find more details on distributed training in Azure Machine Learning service.

Pipeline

Azure Machine Learning pipelines enable data scientists to create and manage multiple simple and complex workflows concurrently. A typical pipeline has multiple tasks to prepare data, train, deploy, and evaluate models. Individual steps can make use of diverse compute options (e.g., CPU for data preparation and GPU for training) and languages. Users can also use published pipelines for scenarios like batch scoring and retraining.

Here is a simple example showing a sequential pipeline for data preparation, training and batch scoring:

# Uses default values for PythonScriptStep construct.
s1 = PythonScriptStep(script_name="prep.py", target='my-spark-cluster', source_directory="./")
s2 = PythonScriptStep(script_name="train.py", target='my-gpu-cluster', source_directory="./")
s3 = PythonScriptStep(script_name="batch_score.py", target='my-cpu-cluster', source_directory="./")

# Run the steps as a pipeline
pipeline = Pipeline(workspace=ws, steps=[s1, s2, s3])
pipeline.validate()
pipeline_run = experiment.submit(pipeline)

Using a pipeline can drastically reduce complexity by organizing multi-step workflows, improve manageability by tracking the entire workflow as a single experiment run, and increase usability by recording all intermediary tasks and data. For more details, please visit this documentation.

Model management

A training run from an Azure Machine Learning experiment can produce a model, which you can register under your workspace with a simple API. You can also bring a model generated outside of Azure Machine Learning and register it.

# register a model that's generated from a run and stored in the experiment history
model = best_run.register_model(model_name='best_model', model_path='outputs/model.pkl')

# or, register a model from local file
model = Model.register(model_name="best_model", model_path="./model.pkl", workspace=workspace)

Once registered, you can tag it, version it, search for it, and of course deploy it. For more details on model management capabilities, review this notebook.

Containerized deployment

Once you have a registered model, you can easily create a Docker image using the model management APIs from the SDK. Simply supply a model file from your local computer or a registered model in your workspace, and add a scoring script and a package dependencies file. The system uploads everything into the cloud, creates a Docker image, and registers it with your workspace.

# assumes: from azureml.core.image import ContainerImage, Image
img_conf = ContainerImage.image_configuration(runtime="python",
                                              execution_script="score.py",
                                              conda_file="dependencies.yml")
# create a Docker image with the model and scoring file
image = Image.create(name="my-image",
                     models=[model_obj],
                     image_config=img_conf,
                     workspace=workspace)

You can choose to deploy the image to the Azure Container Instances (ACI) service, a computing fabric that runs a Docker container for dev/test scenarios, or to the Azure Kubernetes Service (AKS) for a scale-out, secure production environment. Find more deployment options.

Using notebooks

The Azure Machine Learning Python SDK can be used in any Python development environment that you choose. In addition, we have enabled tighter integration with the following notebook surface areas.

Jupyter notebook

You may have noticed that nearly all the samples I am citing are in the form of Jupyter notebooks published on GitHub. Jupyter has become one of the most popular tools for data science due to its interactivity and self-documenting nature. To help Jupyter users interact with experiment run history, we have created a run history widget to monitor any run object. Below is an example of a hyperparameter tuning run.

[Image: run history widget]

To see the run history widget in action, follow this notebook.

Azure Notebooks

Azure Notebooks is a free service that can be used to develop and run code in a browser using Jupyter. We have preinstalled the SDK in the Python 3.6 kernel of the Azure Notebooks container, and made it very easy to clone all the sample notebooks into your own library. In addition, you can click on the Get Started in Azure Notebooks button from a workspace in Azure portal. The workspace configuration is automatically copied into your cloned library as well so you can access it from the SDK right away.


Integration with Azure Databricks

Azure Databricks is an Apache Spark–based analytics service to perform big data analytics. You can easily install the SDK in the Azure Databricks clusters and use it for logging training run metrics, as well as containerize Spark ML models and deploy them into ACI or AKS, just like any other models. To get started on Azure Databricks with Azure Machine Learning Python SDK, please review this notebook.

Visual Studio Code Tools for AI

Visual Studio Code is a very popular code editing tool, and its Python extension is widely adopted among Python developers. Visual Studio Code Tools for AI seamlessly integrates with Azure Machine Learning for robust experimentation capabilities. With this extension, you can access all the cool features, from experiment run submission to run tracking, and from compute target provisioning to model management and deployment, all within a friendly user interface. Below is a screenshot of the Visual Studio Code Tools for AI extension in action.

[Screenshot: Visual Studio Code Tools for AI extension]

Please download the Visual Studio Code Tools for AI extension and give it a try.

Get started, now!

This has been a very long post, so thank you for your patience in reading through it. However, I have barely scratched the surface of Azure Machine Learning service. There are so many other exciting features that I was not able to get to. You will have to explore them on your own:

And many more to come. Please visit the Getting started guide to start the exciting journey with us!

Premium Files pushes Azure Files limits by 100x!


Yes, you heard it right!

Today, we are excited to introduce the limited public preview of Azure Premium Files, the high-performance file service for enterprise applications in the cloud. Azure Premium Files provides fully managed file services, optimized to deliver consistent performance at a 100x improvement over the existing Azure Files. It is designed for IO-intensive enterprise workloads that require high throughput and single-digit millisecond latency. With the introduction of Premium Files, Azure now offers a choice of two types of shared file storage to fit your workload needs: Premium and Standard. Premium Files, our new offering, stores data on the latest solid-state drives (SSDs), which makes it suitable for a wide variety of workloads such as file services, databases, web infrastructure, content and collaboration repositories, analytics, home directories, and highly variable and batch workloads, among many others. Standard Files, our existing offering, is designed to provide reliable performance for general purpose file storage, development/test, and application workloads that are less sensitive to performance variability.

Azure Premium Files in a nutshell

[Image: Azure Premium Files in a nutshell]

Performance - on your terms!

You define, and get, the performance characteristics your workload desires! Premium Files lets you customize file shares so that IOPS and throughput scale with the provisioned capacity to fit your workload's performance characteristics without compromising latency. Compared to current file shares, premium shares can scale IOPS by a factor of 100x and target throughput of up to 5 GiB/s, an 85x improvement.

Premium Files offers the best out-of-the-box experience. All shares, regardless of provisioned capacity, get a baseline throughput of 100 MiB/s. For each GiB provisioned, a share gets 1 baseline IOPS and 0.1 MiB/s of throughput, up to the maximum allowed limits. Premium Files also offers the ability to operate in burst mode, which suits highly variable and batch workloads requiring short periods of intense IOPS.

Let’s look at how burst mode works. Any unused baseline IOs accrue in a burst credit bucket. Shares can burst up to 3x their baseline IOPS as long as they have accrued enough IO credits. On a best-effort basis, all shares can burst up to 3 IOPS per provisioned GiB for up to 60 minutes, and shares larger than 50 TiB can burst for longer than 60 minutes.
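
The burst-credit mechanics described above can be sketched as a toy simulation (a simplified model for illustration; the names, the per-second accounting, and the uncapped credit bucket are assumptions, not the service's actual bookkeeping):

```python
BURST_FACTOR = 3  # shares can burst up to 3x their baseline IOPS

def simulate_burst(baseline_iops, demand_per_sec):
    """Toy model of the burst credit bucket described above:
    unused baseline IOs accrue as credits, and demand above the
    baseline spends them, capped at 3x baseline IOPS."""
    credits = 0
    served = []
    for demand in demand_per_sec:
        allowed = min(demand, BURST_FACTOR * baseline_iops,
                      baseline_iops + credits)
        credits = max(credits + baseline_iops - allowed, 0)
        served.append(allowed)
    return served

# Two idle seconds accrue credits; the spike then bursts to 3x
# the baseline until the accrued credits run out.
print(simulate_burst(100, [0, 0, 300, 300]))  # [0, 0, 300, 100]
```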

Example: for a 5 TiB provisioned share*,
Throughput: 100 MiB/s + 0.1 × 5,120 = 612 MiB/s
Baseline IOPS: 5,120
Burst IOPS limit: 15,360

*Premium Files limited preview: IOPS and throughput numbers can vary based on access patterns. The baseline rates, burst credits, and burst duration are subject to change at the time of general availability. The maximum scale limits are based on shares scaling to a large capacity of up to 100 TiB.
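
Assuming the preview numbers quoted above (100 MiB/s baseline throughput plus 0.1 MiB/s and 1 IOPS per provisioned GiB, and a 3x burst factor), the example works out as a quick back-of-the-envelope calculation (the function name is illustrative, and scale caps are ignored):

```python
def premium_share_perf(provisioned_gib):
    """Back-of-the-envelope Premium Files performance targets,
    using the preview numbers quoted above (scale caps ignored)."""
    throughput_mibps = 100 + 0.1 * provisioned_gib
    baseline_iops = provisioned_gib
    burst_iops = 3 * baseline_iops
    return throughput_mibps, baseline_iops, burst_iops

# A 5 TiB share is 5 * 1024 = 5,120 GiB
print(premium_share_perf(5 * 1024))  # (612.0, 5120, 15360)
```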

Capacity – flexible

Share sizes are mutable. You can change the capacity of a share, and it will shrink or grow to fit your demand. The current limited public preview of Premium Files supports shares of up to 5 TiB. With the preview of larger, higher-scale Azure file shares, the Premium Files share capacity limit will also increase from 5 TiB to 100 TiB. Complete the larger and higher scale file shares preview request form to join the preview waitlist. Due to the overwhelming interest in this feature, we will contact you once the preview is available for you to try.

Cost – simple and predictable

You just pay a per-GiB cost. No transaction costs. No additional cost for throughput and IOPS bursting. For IO-intensive workloads, you can save money by moving to the unlimited-transactions model. With built-in bursting capability, Premium Files removes the need to over-provision capacity to accommodate workloads requiring short periods of high IOPS. Refer to the pricing page for additional details.

How can I get started?

Currently, Azure Premium Files is available for limited public preview. To sign up for the preview, complete the Premium Preview sign-up survey and then submit a request to enroll your subscription for the preview by running the following PowerShell:

Register-AzureRmProviderFeature -FeatureName AllowPremiumFiles -ProviderNamespace Microsoft.Storage

It may take up to 5 business days to receive an approval. Depending on the demand and scenario fit, we may not be able to accommodate every request. To verify successful registration approval, run the following command:

Get-AzureRmProviderFeature -FeatureName AllowPremiumFiles -ProviderNamespace Microsoft.Storage

Using your approved subscription, create a new Premium Files storage account - select Premium as the Performance, FileStorage (preview) as the Account kind, and Locally-redundant storage (LRS) as the Replication.

[Screenshot: creating a Premium Files storage account in the Azure portal]

Currently, Premium Files storage preview is available in the following regions and we will be gradually expanding the region coverage:

  • Central US
  • East US 2
  • West US
  • North Europe (coming soon)
  • East US (coming soon)

All shares created under premium file storage account will be premium file shares. To learn more, visit Introduction to Azure Files.

Pricing

For a limited period during the preview, Azure Premium Files storage will be free. Refer to the pricing page for further details.

What’s next?

Give Premium Files a try and provide us with feedback to help us make this product better for you! You can share your feedback either via comments on this blog or on Azure Storage forum or just send us an email at PFSFeedback@microsoft.com. You can also post your ideas and suggestions about Azure Storage on Azure Storage feedback forum.

And, we’re on to the next big thing! Till then.

Happy sharing!

A new era for Azure Files: Bigger, faster, better!


Azure Files is the only public cloud file storage that delivers secure, Server Message Block (SMB) based, fully managed cloud file shares that can also be cached on-premises for performance and compatibility. Today, we are excited to announce that Azure Files just got bigger, faster, and better than ever before.

That’s right … starting today we are launching the previews of larger, low latency file shares with more IOPS, higher throughput, and Azure Active Directory integration. But wait, there’s more! We are also excited to announce the forthcoming new Azure File Sync agent with significant improvements in the sync performance, tiering, and reporting.

Let’s take a sneak peek into the new Azure Files ...

Files scale and performance


[Image: Azure Files scale and performance]

Premium Files, new and now in limited public preview, is optimized to deliver consistent performance for IO-intensive enterprise workloads that require high throughput with single digit millisecond latency. Premium Files is suitable for a wide variety of workloads such as databases, home directories, analytics, content, and collaboration. Its performance, whether measured by IOPS or throughput, scales with the provisioned capacity to fit your workload’s needs. With premium file shares, IOPS can scale up to 100x and capacity can scale up to 20x the current Azure Files scale limits. To accommodate workload IO spikes for a short duration, premium file shares can also operate in burst mode. Learn more about the Premium Files limited public preview announcement and how to try the preview.

Standard Files, our existing offering, is designed to provide reliable performance for IO workloads that are less sensitive to performance variability. It is well suited for general purpose file services and dev/test environments. We have also made significant investments in Standard Files to increase capacity limits by 20x and IOPS limits by up to 10x. The preview of 100 TiB shares with higher scale limits will be launching soon. You can join the waitlist for the preview of 100 TiB shares by completing this request form.

Files security and access control

Azure Active Directory (AD) integration is a critical capability for Azure Files to offer a seamless lift-and-shift experience from on-premises file services to fully managed cloud services. Today, we are pleased to announce the preview of Azure AD authentication for Azure Files SMB access. This feature allows the native preservation of Windows access control lists (ACLs) on Azure file shares. It also enables end users to access Azure file shares through an Azure AD Domain Services joined machine with Azure AD credentials. Learn more about the Azure AD integration preview program.

Hybrid flexibility

Azure File Sync recently became generally available, concluding a phenomenal preview period with great adoption and feedback received from you, our customers. In this month’s update release we are adding Windows Server 2019 support, a date-based cloud tiering policy, a lot of performance improvements, new metrics charts in the Azure portal for sync, cloud tiering, and server online states - and thanks to Azure Metrics (preview), you can even set up alerts! Furthermore, we have been collaborating with DataON, whose Storage Spaces Direct appliance dubbed Kepler-47 is well-suited for Azure File Sync. Simply create a VM on this cluster and leverage all the benefits of Azure File Sync on top of DataON’s innovative footprint.

Come and learn at Ignite; and go try for yourself!

If you are attending Ignite this year, come to the Azure Files session and learn more about these exciting announcements:

[Table: Azure Files sessions at Ignite]

    You can share your feedback via the Azure Storage forum or just send us an email at AzureFiles@microsoft.com.

    Advanced Threat Protection for Azure Database for PostgreSQL in preview


    This blog post was co-authored by Ron Matchoro, Principal Program Manager, Azure SQL Database.

    Advanced Threat Protection detects anomalous database activities indicating potential security threats to Azure Database for PostgreSQL.

    Advanced Threat Protection provides a new layer of security, which enables customers to detect and respond to potential threats as they occur by providing security alerts on anomalous database activities. Advanced Threat Protection makes it simple to address potential threats to the Azure Database for PostgreSQL server and integrates its alerts with Azure Security Center, without needing to be a security expert.

    For a full investigative experience, it is recommended to use Server Logs in Azure Database for PostgreSQL, which generates database query and error logs.


    The benefits of Advanced Threat Protection include:

    • Simple configuration of Advanced Threat Protection policy via the Azure portal.
    • Clear email alerts upon detection of suspicious database activities.
    • Ability to explore the activity log around the time of the event using the Azure portal.
    • No need to modify database procedures or application code.

    Set up Advanced Threat Protection for your Azure Database for PostgreSQL server in the Azure portal

    • Launch the Azure portal.
    • Navigate to the configuration page of the Azure Database for PostgreSQL server you want to protect. In the Settings page, select Advanced Threat Protection.
    • In the Advanced Threat Protection configuration blade:
      • Turn ON Threat detection.
      • Configure the list of emails that will receive security alerts upon detection of anomalous database activities.
    • Click Save in the Advanced Threat Protection configuration blade to save the new or updated threat detection policy.

    [Screenshot: setting up Advanced Threat Protection]

    Explore anomalous PostgreSQL server activities upon detection of a suspicious event

    You will receive an email notification upon detection of anomalous database activities. The email will provide information about the suspicious security event including the nature of the anomalous activities, database name, server name, and the event time. In addition, it will provide information on possible causes and provide recommended actions to investigate and mitigate the potential threat to the Azure Database for PostgreSQL server.

    [Screenshot: anomalous activity email report]

    Clicking on the view recent alerts link in the email will launch the Azure portal and show the Azure Security Center alerts blade, which provides an overview of active SQL threats detected on the Azure Database for PostgreSQL server.

    [Screenshot: active threats in Azure Security Center]

    Clicking on a specific alert provides additional details and actions for investigating this threat and remediating future threats.

    [Screenshot: specific alert details]

    Azure Database for PostgreSQL server Advanced Threat Protection alerts

    Advanced Threat Detection for Azure Database for PostgreSQL server detects anomalous activities indicating unusual and potentially harmful attempts to access or exploit databases. It can trigger the following alerts:

    • Access from unusual locations: This alert is triggered when there is a change in the access pattern to an Azure Database for PostgreSQL server, where someone has logged on to the server from an unusual geographical location. In some cases, the alert detects a legitimate action, such as a new application or a developer’s maintenance operation. In other cases, the alert detects a malicious action, such as a former employee or an external attacker.
    • Access from unusual Azure data center: This alert is triggered when there is a change in the access pattern to an Azure Database for PostgreSQL server, where someone has logged on to the server from an Azure data center that has not been seen accessing this server in the recent period. In some cases, the alert detects a legitimate action, such as your new application in Azure, Power BI, or the Azure SQL Query Editor. In other cases, the alert detects a malicious action from an Azure resource or service, such as a former employee or an external attacker.
    • Access from unfamiliar principal: This alert is triggered when there is a change in the access pattern to an Azure Database for PostgreSQL server, where someone has logged on to the server using an unusual principal. In some cases, the alert detects a legitimate action, such as a new application or a developer’s maintenance operation. In other cases, the alert detects a malicious action, such as a former employee or an external attacker.
    • Access from a potentially harmful application: This alert is triggered when a potentially harmful application is used to access the database. In some cases, the alert detects penetration testing in action. In other cases, it detects an attack using common attack tools.
    • Brute force login credentials: This alert is triggered when there is an abnormally high number of failed logins with different credentials. In some cases, the alert detects penetration testing in action. In other cases, it detects a brute force attack.

    Next steps

    Advanced Threat Protection for Azure Database for MySQL in preview


    This blog post was co-authored by Ron Matchoro, Principal Program Manager, Azure SQL Database.

    Advanced Threat Protection detects anomalous database activities indicating potential security threats to Azure Database for MySQL.

    Advanced Threat Protection provides a new layer of security, which enables customers to detect and respond to potential threats as they occur by providing security alerts on anomalous database activities. Advanced Threat Protection makes it simple to address potential threats to the Azure Database for MySQL server and integrates its alerts with Azure Security Center, without needing to be a security expert.

    For a full investigation experience, it is recommended to use Server Logs in Azure Database for MySQL, which generates database query and error logs.


    The benefits of Advanced Threat Protection include:

    • Simple configuration of Advanced Threat Protection policy via the Azure portal.
    • Clear email alerts upon detection of suspicious database activities.
    • Ability to explore the activity log around the time of the event using the Azure portal.
    • No need to modify database procedures or application code.

    Set up Advanced Threat Protection for your Azure Database for MySQL server in the Azure portal

    • Launch the Azure portal.
    • Navigate to the configuration page of the Azure Database for MySQL server you want to protect. In the Settings page, select Advanced Threat Protection.
    • In the Advanced Threat Protection configuration blade:
      • Turn ON Threat detection.
      • Configure the list of emails that will receive security alerts upon detection of anomalous database activities.
    • Click Save in the Advanced Threat Protection configuration blade to save the new or updated threat detection policy.

    [Screenshot: setting up Advanced Threat Protection]

    Explore anomalous MySQL server activities upon detection of a suspicious event

    You will receive an email notification upon detection of anomalous database activities. The email will provide information about the suspicious security event including the nature of the anomalous activities, database name, server name and the event time. In addition, it will provide information on possible causes and recommended actions to investigate and mitigate the potential threat to the Azure Database for MySQL server.

    [Screenshot: anomalous activity email report]

    Clicking on the view recent alerts link in the email will launch the Azure portal and show the Azure Security Center alerts blade which provides an overview of active SQL threats detected on the Azure Database for MySQL server.

    [Screenshot: active threats in Azure Security Center]

    Clicking on a specific alert provides additional details and actions for investigating this threat and remediating future threats.

    [Screenshot: specific alert details]

    Azure Database for MySQL server Advanced Threat Protection alerts

    Advanced Threat Detection for Azure Database for MySQL server detects anomalous activities indicating unusual and potentially harmful attempts to access or exploit databases. It can trigger the following alerts:

    • Access from unusual location: This alert is triggered when there is a change in the access pattern to an Azure Database for MySQL server, where someone has logged on to the server from an unusual geographical location. In some cases, the alert detects a legitimate action, such as a new application or a developer’s maintenance operation. In other cases, the alert detects a malicious action, such as a former employee or an external attacker.
    • Access from unusual Azure data center: This alert is triggered when there is a change in the access pattern to an Azure Database for MySQL server, where someone has logged on to the server from an Azure data center that has not been seen accessing this server in the recent period. In some cases, the alert detects a legitimate action, such as your new application in Azure, Power BI, or the Azure SQL Query Editor. In other cases, the alert detects a malicious action from an Azure resource or service, such as a former employee or an external attacker.
    • Access from unfamiliar principal: This alert is triggered when there is a change in the access pattern to an Azure Database for MySQL server, where someone has logged on to the server using an unusual principal. In some cases, the alert detects a legitimate action, such as a new application or a developer’s maintenance operation. In other cases, the alert detects a malicious action, such as a former employee or an external attacker.
    • Access from a potentially harmful application: This alert is triggered when a potentially harmful application is used to access the database. In some cases, the alert detects penetration testing in action. In other cases, it detects an attack using common attack tools.
    • Brute force login credentials: This alert is triggered when there is an abnormally high number of failed logins with different credentials. In some cases, the alert detects penetration testing in action. In other cases, it detects a brute force attack.

    Next steps

    Building with blockchain on Azure


    Today my team is in Orlando, meeting with technical and business decision makers interested in how blockchain can transform the way they do business and wondering just how to get started with Azure.

    As Satya mentioned in his keynote this morning, we’ve seen tremendous progress from businesses building with blockchain on Azure. The feedback from early adopters has been consistent on two fronts: blockchain has the potential to extend digital transformation beyond their four walls and into the processes they share with suppliers, partners, and customers; but building these new shared applications is a challenge, and they’re looking for trusted partners to help. Businesses across industries are increasingly turning to Microsoft and its partners as they begin this journey.

    Over the past few years, the power of Azure has enabled companies to break out of data silos. Scalable data platforms like Azure Databricks and analytics tools like Power BI let companies collaborate on heterogeneous data sets across departments and extract insights through machine learning and AI. Blockchain enables the next step: not only a single authentic data set, but also shared business logic across multiple companies or individuals.

    The emergence of blockchain networks

    Recently, we’ve seen our customers shift from experimentation to a clear focus on the establishment and growth of shared industry-specific utilities where multiple organizations come together to leverage immutable, shared data and business logic sets to more effectively manage processes that cross company boundaries.

    • In Europe, Maersk and its partners are using blockchain and IoT to trace the movement of valuable cargo across oceans and provide attestable insight into both its real-time risk and the degree to which shippers are adhering to contractual terms. Sharing this information, immutably, on the ledger allows Maersk’s insurers to reduce premiums in light of new information. This frees up capital for Maersk, while also relieving the burdensome post-delivery reconciliations for insurers.
    • In Australia, Webjet introduced Rezchain, a shared platform for its travel partners to coordinate bookings that involve multiple parties, reducing lost reservations and double bookings for customers while eliminating costly and time-consuming dispute resolution for its constellation of partners.
    • And here in Redmond, our Xbox team is working with EY and Ubisoft to simplify the royalty payments process, which spans game developers, publishers, console providers, and advertisers. Using smart contracts to codify the agreements that govern over eight million transactions a day, we not only reduced manual and error-prone reconciliations, but also enabled near real-time visibility into past and future income. While large publishers like Ubisoft will use the platform to more accurately manage cash flows, an immutable accounts receivable from existing games could empower indie developers to secure a loan to develop their next title.

    In each of these examples, multiple companies have come together to build shared, cloud-native blockchain applications on Azure. Building these apps requires a set of capabilities to quickly develop a prototype, translate workflows into code, and integrate with tools outside of blockchain like identity, compute, and data.

    Accelerating development with Workbench

    In the spring we introduced Azure Blockchain Workbench to help businesses take their first steps towards building these new apps. This week we proudly share the stage with three companies that are using Workbench to build new blockchain businesses on Azure.

    This morning, Satya shared some exciting new work from Bühler, whose machines process the grain that makes up over 65 percent of the world’s food supply. Bühler has augmented their physical machines with an AI-enabled platform that guarantees the safety of the grains they process. With blockchain, Bühler is extending that safety guarantee across the custodial chain, tracking crops as they move from granaries and millers to the shippers and food manufacturers, between them and their common retail customers. They’re not only building a new service offering and revenue stream, but also creating a shared food-safety utility for the Ag industry that could guarantee crop safety from farm to fork.

    This afternoon we’ll hear from Arun Ghosh, who leads KPMG’s blockchain business in the US. Like IoT and ERP before it, these new applications and networks are complex to set up, organize, and govern. But once established they have the potential to reshape value chains and create entirely new industries. Partners like KPMG play a crucial role building and managing these consortium networks and guiding customers like Singapore Airlines on their digital transformation.

    Finally, we’ll hear from Eghosa Ojo, the Head of Innovation at Interswitch who is pioneering a Supply Chain Financing platform for the African continent. Nigeria’s financing system has always served banks, corporate borrowers, and large capital projects. But as mobility and other digital technologies transform Africa, a new, independent, small-entrepreneur economy is emerging in Nigeria. To bridge the gap between the existing corporate-based financing infrastructure and these new small enterprises, Interswitch set out to build a new utility on Azure that extends the reach of the banking system to non-traditional players. Supply chain financing involves multiple counterparties, many of whom lack end-to-end visibility, which can create opportunities for misunderstandings or subjectivity regarding contract performance. This lack of transparency can expose lenders to higher risk, delay financing, threaten project timelines and budgets, and even jeopardize the success of emerging businesses. With the Interswitch blockchain-based Supply Chain Financing service, lenders, suppliers, and borrowers of all sizes can manage their supply-chain financing under objective terms and complete transparency.

    For each of these businesses, Workbench dramatically reduced development costs and accelerated time to value from months to weeks, by allowing developers to reduce the time spent on infrastructure scaffolding and integration and instead focus their efforts on their workflows and smart contract logic.

    The best platform for blockchain development

    Whether you’re building a consortium application in financial services or supply chain, a new economy company, or a decentralized product offering, Azure is the best platform for these new applications. We offer a cloud where consortiums can build and operate blockchain networks at scale, using the blockchain that most fits their needs.

    Azure offers not only the network infrastructure, but also the deep integration with the identity, key management, messaging and off-chain data services developers need to build these new utilities and connect them to existing business apps, infrastructure and talent.

    We’re humbled by the bold ideas and projects our customers, partners, and open blockchain communities have started in the space, and we are committed to making this technology more useful to our enterprise customers.

    To get started building your blockchain network on Azure, check out Workbench on the Azure Marketplace today.


    10 new ways for everyone to achieve more in the modern workplace


    It’s been over a year since we introduced Microsoft 365, the complete, intelligent, and secure solution that empowers employees to drive their organizations to future growth. Customers are seeking to transform and support a workforce that is more diverse and mobile than ever before, and they are relying on the latest advancements in technology to do so. Customers such as Goodyear, Eli Lilly, and Fruit of the Loom use Microsoft 365 to empower their employees.

    Microsoft 365 is growing quickly, built on the strength of more than 135 million commercial monthly Office 365 users. Windows 10 has approximately 200 million commercial devices in use, and there is an install base of over 82 million for Enterprise Mobility + Security (EMS). Today, at the Microsoft Ignite Conference in Orlando, Florida, we are introducing new capabilities in Microsoft 365 that make it possible for every person to do their best work.

    1. Microsoft Teams is the fastest growing business app in Microsoft history

    After less than two years in market, more than 329,000 organizations worldwide use Microsoft Teams, including 87 of the Fortune 100 companies. In fact, 54 customers now have more than 10,000 active users of Teams, and Accenture just crossed the 100,000 active-user mark in Teams. Further growth has been spurred by the recently announced free version of Teams.

    We continue to add powerful new capabilities to foster teamwork and collaboration. New artificial intelligence (AI) powered meeting features are now generally available—including background blur and meeting recording. Background blur uses facial detection to blur your background during video meetings, and meeting recording allows you to play back recorded meeting content at any time with captions and a searchable, timecoded transcript.

    General availability of new live event capabilities will begin to roll out worldwide in Microsoft 365 later this year. These new tools allow customers to create and stream live and on-demand events in Teams, Yammer and Microsoft Stream to inform and engage customers and employees, wherever they are. Beginning in October, employees can watch videos on the go with the Stream mobile app for iOS and Android, with support for offline viewing. And we’re working with our ecosystem of device partners to deliver new devices optimized for Teams meetings and calling, including the new Surface Hub 2. Surface Hub 2 is perfect for dynamic teamwork and features a light, sleek, and intelligent design that’s easy to move around and fit in any workspace. The first phase of Surface Hub 2, Surface Hub 2 S, will start shipping in the second quarter of 2019.

    Animated image of a man blurring his background in Teams.

    Blur your background during meetings.

    2. Extend the power of Teams to empower workers in all roles and across industries

    As an example of how Teams can enable secure workflows for regulated industries, we’re delivering a new care coordination solution, now available in private preview, that gives healthcare teams a secure hub for coordinating care across multiple patients. It provides for integration with electronic health records (EHR) systems and enables care providers to communicate about patient care in real-time within Teams’ secure platform. We are also releasing two new secure messaging features with particular relevance in healthcare settings: image annotation, now generally available, and priority notifications, which will roll out by the end of this year to all Teams commercial customers. These capabilities support HIPAA compliance and enable doctors, nurses, and other clinicians to communicate about patients while avoiding the privacy risks that arise when healthcare professionals use consumer chat apps.

    Image of a mobile device creating an event in Microsoft Teams.

    Easily swap shifts, request time off, and see who else is working.

    3. Find what you need faster with Microsoft Search

    Microsoft Search, a new cohesive search capability, makes it easier for you to find what you need without leaving the flow of your work. We’re putting the search box in a consistent, prominent place across Edge, Bing, Windows, and Office apps, so that search is always one click away. We’re also supercharging the search box so you can not only quickly find people and related content, but you can also access commands for apps and navigate to other content wherever you need to get work done—even before you start typing in the search box.

    Recognizing that you work in an ecosystem of information, we’re extending Microsoft Search to connect across your organization’s data, inside and outside of Microsoft 365. Learning from your everyday work patterns and acting as a brain for your organization, the Microsoft Graph personalizes your experiences everywhere. We’re pulling together the power of the Microsoft Graph and AI technology from Bing to deliver future experiences that are more relevant to what you are working on. This will include automatically answering questions such as “Can I bring my wife and kids on a work trip?” by using machine reading comprehension that takes knowledge of the world and pairs it with understanding of your organization’s documents. Preview the Microsoft Search capability as it rolls out to Office.com, Bing.com, and the SharePoint mobile app today, with many more experiences to come in Edge, Windows, and Office.

    Image shows Microsoft Search in Office.com.

    Find what you need faster with Microsoft Search.

    4. Create content that stands out with Microsoft 365

    Three new features in Microsoft 365 use the power of AI to help you create content that shines. Ideas is a new feature that follows along as you create a document and makes intelligent suggestions. In PowerPoint, Ideas recommends designs, layouts, and images. In Excel, Ideas recognizes trends, suggests charts, and identifies outliers in your data. Ideas is generally available in Excel today and will begin rolling out in preview to the other apps starting with PowerPoint Online. Additionally, new data types in Excel turn references to stocks and geographies into rich entities that can be used to build powerful, interactive spreadsheets. The Stocks and Geography data types are generally available today and make it easy to get updated stock prices, company information, population, area, and more. Finally, new image recognition capabilities in Excel take a picture of a hand-drawn or printed data table and turn it into an Excel spreadsheet, making data entry as easy as taking a picture.

    Animated image shows a laptop open and Ideas being used in PowerPoint.

    In PowerPoint, Ideas recommends designs, layouts, and images for your presentation.

    5. Office loves the Mac

    Office empowers everyone to achieve more on any device. And Office loves the Mac. We’re committed to the Mac as a first-class endpoint and have made significant investments in the platform over the past year—including moving the Mac and Windows versions of the apps onto a single code base and releasing new features for the Mac every month. We also tailored new experiences for the Mac, like the new Touch Bar integration.

    Today, we’re announcing OneDrive Files On-Demand for Mac, a way to access all your personal and work files from the cloud in Finder without using local storage space, downloading them only when you need them. Files On-Demand gives the Mac an intelligent connection to the cloud and is just one more example of the power of Office on the Mac platform. Preview it before it rolls out to all Mac users.

    Image shows OneDrive Files On-Demand on an open Mac.

    OneDrive Files On-Demand for Mac displays all your OneDrive files in Finder but only downloads them when you need them.

    6. Work together with your entire network with LinkedIn in Outlook and Office web apps

    We’re announcing two new ways to use the power of the LinkedIn network within your daily workflow. Soon, when you connect your LinkedIn account to Office 365, you’ll be able to coauthor documents with people in your LinkedIn network in Word, Excel, and PowerPoint and send emails to them directly from Outlook. This brings your corporate directory and your LinkedIn network together, so you never lose touch with the contacts who can help you succeed, inside or outside your organization. You’ll also see LinkedIn highlights about the people in your meeting invites, providing you with insights about attendees, so you can prep for important meetings quickly and easily. These features help you focus on what’s important by providing information and connections directly in your flow of work and will be coming soon in a staged rollout.

    7. Deliver a modern desktop with Azure

    For many companies, the specific needs of their business demand a virtualized desktop experience. Today, we are introducing Windows Virtual Desktop, the only cloud-based service that delivers a multi-user Windows 10 experience, which is optimized for Office 365 ProPlus and includes free Windows 7 Extended Security Updates. With Windows Virtual Desktop, you can deploy and scale Windows and Office on Azure in minutes with built-in security and compliance. Sign up to be notified of the preview availability.

    8. Manage your environment with the Microsoft 365 admin center

    Following our recent release of the new Microsoft 365 admin center, we’re announcing new features to help you to monitor and manage applications, services, data, devices, and users across your Microsoft 365 subscriptions, including Office 365, Windows 10, and EMS. The Microsoft 365 admin center has several new capabilities to help you better manage your environment, including insight-based recommendations, a more consistent UI, and customized views for each of your admins. The public preview of these features is rolling out now to targeted release admins and soon to all admins. To get started, visit admin.microsoft.com.

    Image shows the Microsoft 365 admin center on an open laptop.

    Manage your environment more easily with the Microsoft 365 admin center.

    9. Achieve modern compliance easily for the General Data Protection Regulation (GDPR) and more

    In the world of complex regulations and evolving privacy standards, customers consistently tell us they need the built-in, intelligent capabilities of Microsoft 365 to proactively achieve compliance in their organizations. We’ve expanded Compliance Manager to now include 12 assessments across different industries. The unified labeling experience is also now available in the Security & Compliance Center as a single destination where you can create, configure, and automatically apply policies to ensure protection and governance of sensitive data.

    Image shows the Compliance Manager on a tablet.

    Compliance Manager now includes 12 assessments across different industries.

    10. Advancing security for IT professionals

    The work we do in security at Microsoft gives us the broadest perspective on the challenges and a unique ability to help. We focus on three areas: running security operations that work for you, building enterprise-class technology, and driving partnerships for a heterogeneous world. Today, we’re announcing several new enterprise-class capabilities that leverage the Microsoft intelligent cloud and operational learnings to help organizations secure their people, devices, and data.

    New support for passwordless sign-in via the Microsoft Authenticator app is now available for the hundreds of thousands of Azure Active Directory connected apps that businesses use every day. Nearly all data loss starts with compromised passwords. Today, we are declaring an end to the era of passwords. No company lets enterprises eliminate more passwords than Microsoft.

    Microsoft Secure Score is the only enterprise-class dynamic report card for cybersecurity. By using it, organizations get assessments and recommendations that typically reduce their chance of a breach by 30-fold. It guides you to take steps like securing admin accounts with Multi-Factor Authentication (MFA), securing user accounts with MFA, and turning off client-side email forwarding rules. Starting today, we’re expanding Secure Score to cover all of Microsoft 365. We are also introducing Secure Score for your hybrid cloud workloads in the Azure Security Center, so you have full visibility across your estate.

    Finally, we are announcing Microsoft Threat Protection, an integrated experience for detection, investigation, and remediation across endpoints, email, documents, identity, and infrastructure in the Microsoft 365 admin console. This will save analysts thousands of hours as they automate the more mundane security tasks.

    Image shows Microsoft Secure Score in Microsoft 365 Security, on a tablet.


    Microsoft Secure Score is expanding to cover all of Microsoft 365.

    We look forward to bringing you these new ways to achieve more from unlocking creativity to advancing security. You can learn more about our announcements, see all of our Microsoft Ignite sessions live streaming or on-demand, and connect with experts on the Microsoft Tech Community.

    The post 10 new ways for everyone to achieve more in the modern workplace appeared first on Microsoft 365 Blog.

    Bringing AI to Excel—4 new features announced today at Ignite


    Excel’s power comes from its simplicity. At its core, Excel is three things: cells of data laid out in rows and columns, a powerful calculation engine, and a set of tools for working with the data. The result is an incredibly flexible app that hundreds of millions of people use daily in a wide variety of jobs and industries around the world.

    Today, we’re pleased to announce four new artificial intelligence (AI) features that make Excel even more powerful:

    • Ideas
    • New data types
    • Insert Data from Picture
    • Dynamic arrays

    Introducing intelligent suggestions with Ideas

    Ideas is an AI-powered insights service that helps people take advantage of the full power of Office. Proactively surfacing suggestions that are tailored to the task at hand, Ideas helps users create professional documents, presentations, and spreadsheets in less time. In Excel, for instance, Ideas helps identify trends, patterns, and outliers in a data set—helping customers analyze and understand their data in seconds. Ideas is generally available in Excel today and will begin rolling out in preview to other apps starting with PowerPoint Online. Simply click the lightning bolt icon in Excel or PowerPoint Online and Ideas will start making recommendations. Read more about Ideas in this support article.
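    Outlier detection of the kind Ideas performs can be approximated with a simple statistical test. The sketch below is an illustration of the general idea, not Microsoft’s actual implementation: it flags values that sit more than two standard deviations from the mean of a column.

    ```python
    from statistics import mean, stdev

    def find_outliers(values, threshold=2.0):
        """Flag values more than `threshold` standard deviations from the mean."""
        mu = mean(values)
        sigma = stdev(values)
        return [v for v in values if abs(v - mu) > threshold * sigma]

    # A mostly steady sales column with one anomalous entry.
    sales = [102, 98, 105, 99, 101, 340, 97, 103]
    print(find_outliers(sales))  # → [340]
    ```

    A production service would use more robust statistics and pattern mining, but the shape of the problem—surface the rows that deviate from the rest—is the same.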

    An animated image shows Ideas in Excel.

    Making new data types generally available today

    Excel has always been great at helping people make the most of numbers. But now Excel can do even more: It can recognize real-world concepts, starting with Stocks and Geography. This new AI-powered capability turns a single, flat piece of text into an interactive entity containing layers of rich information. For instance, by converting a list of countries in a workbook to “Geography” entities, customers can weave location data into an analysis of their own data. And this new capability—though deceptively simple—opens a whole new world of possibilities. As we add new data types over time, Excel’s rows, columns, cells, logic engine, and tools can be used to organize, analyze, and reason over any combination of numbers and sophisticated entities. The Stocks and Geography data types are generally available today. Read more about data types in this support article.
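    Conceptually, a rich data type promotes a flat string into a structured record whose fields can feed formulas. A minimal Python sketch of that idea—the field names and figures here are illustrative, not Excel’s actual schema:

    ```python
    from dataclasses import dataclass

    @dataclass
    class GeographyEntity:
        """A flat text value promoted to a structured record, as with the Geography data type."""
        name: str
        population: int
        area_km2: float

        @property
        def density(self) -> float:
            # Derived values become available once the text is an entity.
            return self.population / self.area_km2

    # Approximate real-world figures, for illustration only.
    france = GeographyEntity("France", 67_000_000, 643_801)
    print(round(france.density, 1))  # → 104.1 people per km²
    ```

    In Excel the promotion happens automatically when you convert a cell to a data type; the point of the sketch is that one piece of text becomes many queryable fields.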

    Image shows the Geography data type in Excel.

    Saying goodbye to manual data entry

    With Insert Data from Picture, you can take a picture of a hand-drawn or printed data table with your Android device and convert that analog information into an Excel spreadsheet with a single click. New image recognition functionality automatically converts the picture to a fully editable table in Excel, eliminating the need for you to manually enter data. Insert Data from Picture will be available in preview for the Excel Android app soon.

    An animated image shows Insert Data from Picture being used on a mobile device.

    Calculating with ease using dynamic arrays

    With dynamic arrays, we continue to invest in making advanced formulas easier to use. Using dynamic arrays, any formula that returns an array of values will seamlessly “spill” into neighboring unoccupied cells, making it as easy to get an array of values returned as it is to work on a single cell. You can immediately harness the power of dynamic arrays by using one of the new FILTER, UNIQUE, SORT, SORTBY, SEQUENCE, SINGLE, and RANDARRAY functions to build spreadsheets that would previously have been nearly impossible. So now, rather than writing many complex formulas to solve a multi-cell problem, you can write one simple formula and get an array of values returned.
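    The spill behavior can be modeled as functions that return whole arrays rather than single values. A rough Python analogue of composing FILTER, UNIQUE, and SORT (illustrative semantics only, not Excel’s engine):

    ```python
    def FILTER(values, include):
        """Keep only the values whose matching include flag is truthy."""
        return [v for v, keep in zip(values, include) if keep]

    def UNIQUE(values):
        """Remove duplicates, first occurrence wins."""
        return list(dict.fromkeys(values))

    def SORT(values):
        return sorted(values)

    regions = ["East", "West", "East", "North", "West"]
    sales   = [500, 120, 700, 350, 90]

    # One formula-like expression "spills" a whole array of results.
    print(SORT(UNIQUE(FILTER(regions, [s > 100 for s in sales]))))
    # → ['East', 'North', 'West']
    ```

    In Excel, the single formula `=SORT(UNIQUE(FILTER(...)))` would populate as many cells as the result needs—that is the spill.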

    An animated image shows Dynamic Arrays in Excel.

    Faster LOOKUP and MATCH

    We’re not only adding new capabilities to Excel, we’re also continually improving the features customers already know and love. For example, we have invested in significant performance improvements for important lookup functions. We’re pleased to announce that VLOOKUP, HLOOKUP, and MATCH functions operating on large data sets will now execute in seconds instead of minutes. We’ve also improved performance on many key operations like copy/paste, undo, conditional formatting, cell editing, cell selection, filtering, file open, and programmability.
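    The scale of such a speedup is easy to see: a linear scan over n rows costs O(n) per lookup, while a binary search over sorted keys costs O(log n). The sketch below shows the general algorithmic idea—it is not a description of Excel’s internal change:

    ```python
    import bisect

    def linear_lookup(keys, target):
        # O(n): scan every row until the key matches.
        for i, k in enumerate(keys):
            if k == target:
                return i
        return -1

    def binary_lookup(sorted_keys, target):
        # O(log n): repeatedly halve the search range.
        i = bisect.bisect_left(sorted_keys, target)
        if i < len(sorted_keys) and sorted_keys[i] == target:
            return i
        return -1

    keys = list(range(0, 1_000_000, 2))  # 500,000 sorted keys
    # Both find the same row, but binary search touches ~20 keys instead of ~500,000.
    assert linear_lookup(keys, 999_998) == binary_lookup(keys, 999_998) == 499_999
    ```

    For 500,000 rows, log₂(500,000) ≈ 19 comparisons versus up to 500,000—roughly the minutes-to-seconds difference the announcement describes.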

    An animated image shows the LOOKUP function in Excel.

    We’re excited about these new features and hope you are, too. For us, Excel isn’t just a tool—it’s a way of life! And today’s announcements not only deliver incremental improvements in data handling and performance, they also push the app into new territory with new data types and AI-powered analysis services. We look forward to showing you more at Ignite this week and can’t wait to see what you do with it all. As always, we’d love to hear from you, so please send us your thoughts through UserVoice—and keep the conversation going by following Excel on Facebook and Twitter.

    Availability

    • Ideas will be available starting today.
    • New data types are rolling out to users of Excel in Office 365, in the English language only starting today.
    • Performance improvements are rolling out first to Excel in Office 365 starting today.
    • Dynamic arrays is available in preview for users signed up for the Office 365 Insiders Program starting today.
    • The Insert Data from Picture feature will be available in preview for users signed up for the Office 365 Insiders Program on the Android Excel app soon.

    The post Bringing AI to Excel—4 new features announced today at Ignite appeared first on Microsoft 365 Blog.

    Microsoft 365 adds modern desktop on Azure


    Today, we are announcing Windows Virtual Desktop, the best virtualized Windows and Office experience delivered on Azure. Windows Virtual Desktop is the only cloud-based service that delivers a multi-user Windows 10 experience, optimized for Office 365 ProPlus, and includes free Windows 7 Extended Security Updates. With Windows Virtual Desktop, you can deploy and scale Windows and Office on Azure in minutes, with built-in security and compliance.

    For many companies, the specific needs of their business require a virtualized desktop experience. The reasons for virtualization vary. For example, for regulated industries like financial services and healthcare, a virtualized desktop experience ensures compliance regulations are met and access to sensitive data is securely managed. For mobile workforces and Firstline Workers, desktop virtualization makes managing and provisioning access to corporate data and apps easier. It also gives IT options in supporting scenarios such as giving access to specific apps to certain employees.

    Introducing the Windows Virtual Desktop

    While desktop virtualization helps in many business scenarios, it has historically been complex and expensive to deploy and manage. To solve some of these complexities, we have developed Windows Virtual Desktop with both IT and the user in mind.

    Windows Virtual Desktop includes the following benefits:

    • The only service to enable a multi-user Windows 10 experience, including compatibility with Microsoft Store and existing Windows line-of-business apps while delivering cost advantages previously only possible with server-based virtualization.
    • The best service to virtualize Office 365 ProPlus running in multi-user virtual scenarios. Microsoft Office is the most virtualized app used, and we are committed to deliver the best possible virtual experience. In the months ahead, we will have more to share on our investments in the Office 365 virtualized experience for the Windows Virtual Desktop service.
    • The only service to provide Windows 7 virtual desktop with free Extended Security Updates, giving you more options to support legacy apps while you transition to Windows 10.
    • The most scalable service to deploy and manage Windows virtual machines, using Azure for compute, storage, rich diagnostics, advanced networking, connection brokering, and gateway. You no longer need to host, install, configure, and manage these components yourself—so you can deploy and scale in minutes.
    • The most flexible service allowing you to virtualize both desktops and apps, meaning you can choose between providing your users the entire desktop experience or delivering only specific apps. When you deliver virtual apps to a Windows 10 endpoint, they are integrated seamlessly into the user experience.
    • Deeply integrated with the security and management of Microsoft 365. The Microsoft 365 conditional access, data loss prevention, and integrated management are natively built in—providing the most secure and simplest solution for protecting and managing all your apps and data.

    Windows Virtual Desktop ecosystem

    Windows Virtual Desktop is a comprehensive virtualization solution. However, we recognize that there is a need for different capabilities to serve our broad set of customers. To accommodate this need, we have built Windows Virtual Desktop as a platform that can be easily extended and enriched by partners in the following ways:

    • We will have many partners that extend the service through the Azure Marketplace and are already working with leading partners, including Citrix, CloudJumper, FSLogix, Lakeside Software, Liquidware, People Tech Group, and ThinPrint.
    • The extensive network of Microsoft Cloud Solution Providers will be able to offer Windows Virtual Desktop to their customers and offer additional value around the service.
    • We are also working with partners such as Citrix to deeply integrate and build upon the Windows Virtual Desktop capabilities.

    We will have more to share on our ecosystem approach in the near future.

    Access to Windows Virtual Desktop

    We’re excited to offer this service to Windows 10 Enterprise and Windows 10 Education customers. Once you sign up for Windows Virtual Desktop, you only need to set up or use an existing Azure subscription to quickly deploy and manage your virtual desktops and apps. The only additional cost to you is for the storage and compute consumption from the virtual machines themselves, which will live in your Azure subscription. You will be able to take advantage of any of your existing Azure compute commitments, including Azure Virtual Machine Reserved Instances (RI).

    We are working hard to put the finishing touches on a public preview, which will be available later this year. If you would like to participate in the public preview, you can register your interest now. In the meantime, please visit the Windows Virtual Desktop website to learn more, or contact your Microsoft account team or Microsoft partner.

    The post Microsoft 365 adds modern desktop on Azure appeared first on Microsoft 365 Blog.

    Microsoft Search—cohesive search that intelligently helps you find, discover, command, and navigate


    At Ignite last year, we introduced personalized search across Office 365, a way to bring intelligent search and discovery experiences directly to you. Today, we’re delighted to announce that we are expanding that vision to encompass search both inside and outside of Microsoft 365. By applying artificial intelligence (AI) technology from Bing to the deep personalized insights surfaced by the Microsoft Graph, we are able to make search in your organization even more effective.

    With Microsoft Search, we’re introducing new organizational search experiences into the apps you use every day, including Bing.com and Windows, and our vision to connect across your organization’s network of data.

    We’re also evolving the notion of what search means. Getting pages of results with hyperlinks to other information is simply not enough. Faced with ever-decreasing attention spans and an explosion of data, we recognize that the challenge is to find and deliver answers to your questions, suggest insights, and enable you to take action on your tasks. This makes search a powerful capability that stretches across your work to make you more productive and take advantage of the collective knowledge from your organization.

    Our vision is a cohesive and coherent search capability, prominent in every experience, providing the way to search across all your organization’s data—both inside and outside of Microsoft 365.

    Over the next few months, you’ll experience these first steps:

    • Search will move to a prominent, consistent place across the apps you work with every day. From Outlook to PowerPoint, Excel to Sway, OneNote to Microsoft Teams, Office.com to SharePoint, the search bar will be in the same place—across desktop, mobile, and web.
    • Personalized results as soon as you click in the search box, such as people you share with the most and documents you were working on recently. No query is required to get suggestions.
    • The search box itself will also command the application where you are working. For example, begin typing “acc” in Word to get a list of suggested actions such as Accept Revision or Accessibility Checker. You no longer need to hunt through toolbars to look for a command.
    • Search results will include results from across your organization. For example, within Word, you can find not only other Word documents, but also a presentation you were working on. You can navigate straight to that presentation or you can choose to incorporate slides from that file directly into your document.
    • Extending that same organizational search experience wherever you are working—in Bing.com (when signed in with your Office 365 account), Edge or Windows. Search wherever you want to and get the same experience.
    • Unified administration of your organizational search results, including admin-curated results such as bookmarks.
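    The “acc” example above boils down to prefix matching over a command index. A minimal sketch of that idea—the command names come from the example in the text, and the matching logic is purely illustrative:

    ```python
    def suggest_commands(prefix, commands):
        """Return the commands whose names start with the typed prefix, case-insensitively."""
        p = prefix.lower()
        return [c for c in commands if c.lower().startswith(p)]

    # A tiny stand-in for an app's command index.
    WORD_COMMANDS = ["Accept Revision", "Accessibility Checker", "Add a Comment", "Align Left"]

    print(suggest_commands("acc", WORD_COMMANDS))
    # → ['Accept Revision', 'Accessibility Checker']
    ```

    A real implementation would also rank by usage and match synonyms, but prefix matching is the core interaction: suggestions narrow with every keystroke.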

    Experience Microsoft Search today

    Key to delivering the Microsoft Search capability is the ability to have consistent scope of results anywhere you are searching. Even if the interface looks different, the goal is to have the same experience, personalized and contextualized for that specific interaction point.

    Microsoft Search in Bing.com

    Over the last year, 180 companies participated in our private preview, with files, sites, people, locations, and groups based on Microsoft Graph data. Based on the private preview feedback, we’re announcing the ability to search across conversations in both Teams and Yammer simultaneously.

    An image shows Microsoft Search in Bing.com.

    Searching in Bing returns both your organizational results and web results, making it an easy destination for broad searches to get the best of your work world and secure your web searches. Public preview begins rolling out today. Tenant admins must opt in to the experience for their organization. Visit bing.com/business/explore for details.

    Microsoft Search in Office.com

    Get back to your work faster with Office.com, surfacing the same search scope across Microsoft 365. Find documents you were recently working on, as well as recommended documents that your colleagues have mentioned you in, and keep up to date with what has been worked on since you last looked at it. Microsoft Search in Office.com goes into targeted release today.

    An image shows Microsoft Search in Office.com.

    Microsoft Search in SharePoint mobile app

    The new version of the SharePoint mobile app includes search as the default experience when you enter the app. It lists common questions, personalized results, and frequent searches that organizations can curate. The new SharePoint mobile app is available for download today.

    An image shows a mobile device using the SharePoint mobile app.

    Microsoft Search in the Outlook mobile app

    The Outlook mobile app also highlights search as an important element of the user experience, providing access to commands, content, and people. With “zero query search,” simply placing your cursor in the search box will bring up recommendations powered by AI and the Microsoft Graph. The Outlook mobile apps for iOS and Android are available for download today.

    Outlook will also bring zero query, fuzzy search, and top results based on intelligent technology to other endpoints as we drive towards coherence. Enhancements to search in Outlook for Windows, Mac, and on the web started to roll out to targeted release mid-September.

    Connect and curate data across your organization’s systems

    Boundary-less search is only as good as the sources it has access to. We recognize that organizations have a wealth of data in other third-party services and applications. In 2019, we will build native connectors for popular third-party applications that will surface search results inline with Microsoft data into all the search experiences you have, including Office, Windows, Edge, and Bing.com. Administrators will be able to select which connectors they wish to use for their organizations based on which investments they’ve already made in third-party applications. Further extensibility with APIs will also be possible. Organizations will be able to customize the search sources and the display of search results with custom refiners and verticals, and control how sources of information appear in result pages.

    Coming in the first half of 2019

    Microsoft Search in Office

    Our new suite-wide search in the ribbon offers the same consistent experience and results across your favorite Office apps—across desktop, mobile, and web. Find, command, navigate, and discover directly from the same search box.

    An image shows a laptop open to a PowerPoint deck in which the user is using Microsoft Search.

    Microsoft Search in Windows

    Right from your taskbar, perform searches that include both local and organizational results. Whether it’s people, the location of an office, or your files, you can find it all in Windows.

    An image shows Microsoft Search used from the Windows start screen.

    What’s coming next?

    The Microsoft Graph gleans insight from the people, sites, devices, and documents you work with and is the basis for consistent learning across your organization, wherever you and your colleagues work. Microsoft Graph ranks search results relevant to your needs. You can see all the results that satisfy your query, but personalized search prioritizes the results that are most likely to achieve your objective.

By supercharging the Microsoft Graph with advanced AI technology from Bing and its knowledge of the world, we can extend our vision for insightful technology to make it simple to ask natural language questions and get real answers, without manual intervention. For example, answering a question such as "Can my brother work for me at my company?" requires not only syntactic parsing of the question but also semantic understanding. Your organization's HR policy probably specifies "close family relationships," so we use Bing's knowledge of the world to expand and match "brother," and couple that with a search of your organization's intranet to derive the answer.
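To make the idea concrete, here is a toy sketch of semantic query expansion against a knowledge base. This is not Microsoft's implementation; the `CONCEPTS` map and the helper functions are hypothetical stand-ins for the far richer world knowledge Bing brings to bear, but they show how a query term like "brother" can be expanded to match the policy phrase "close family relationship."

```python
# Hypothetical mini knowledge base mapping query terms to broader concepts.
# In the real product, this role is played by Bing's knowledge of the world.
CONCEPTS = {
    "brother": "close family relationship",
    "sister": "close family relationship",
    "spouse": "close family relationship",
    "coworker": "professional relationship",
}


def expand_query(terms):
    """Return the original terms plus any broader concepts they map to."""
    expanded = set(t.lower() for t in terms)
    for term in terms:
        concept = CONCEPTS.get(term.lower())
        if concept:
            expanded.add(concept)
    return expanded


def matches_policy(query_terms, policy_text):
    """True if any expanded query term appears in the policy text."""
    text = policy_text.lower()
    return any(term in text for term in expand_query(query_terms))


policy = ("Employees may not directly supervise anyone with whom "
          "they have a close family relationship.")

# "brother" alone never appears in the policy, but its expansion does.
print(matches_policy(["brother"], policy))  # True
```

A purely syntactic match would miss this policy entirely; the semantic expansion step is what bridges the user's vocabulary ("brother") to the document's vocabulary ("close family relationship").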

    Using this machine reading comprehension technology is just one of the many ways we’ll be continuously improving Microsoft Search in the future.

    The post Microsoft Search—cohesive search that intelligently helps you find, discover, command, and navigate appeared first on Microsoft 365 Blog.

    Goodyear mobilizes with Microsoft 365 to lead in a new mobility ecosystem


    Today’s post was written by Sherry Neubert, chief information officer at The Goodyear Tire & Rubber Company.

    Goodyear has a 120-year history of being at the forefront of an ever-changing mobility industry. From bicycles, carriages, and the first Ford Model Ts off the production line, to blimps, airplanes, and even the first tire on the moon—Goodyear has always delivered innovative solutions that keep the world moving.

    Today, our industry faces even more disruption through shifting consumer behaviors and expectations, digital transformation, and breakthroughs in technology that are accelerating change at speeds unlike anything we’ve ever experienced. This is creating a new mobility ecosystem shaped by shared mobility fleets, autonomous vehicles, connected vehicles, and electric vehicles.

    Harnessing the power of our associates is key to competing—and continuing to lead—in this new world. With 64,000 associates selling products in around 150 countries, we need workplace productivity tools that enable innovation and bring together all of Goodyear’s diverse knowledge and capabilities. In 2014, we began to adopt Office 365 to modernize our workplace. Since then, we’ve empowered our associates worldwide with Microsoft 365 productivity and security tools and reached the halfway point in our global rollout of Windows 10. This cloud-connected modern desktop has given us the right tools to allow our teams to be as collaborative, agile, and productive as possible, while providing a highly secure digital environment.

    Enhancing collaboration is crucial to us for improved decision making and to drive innovation, both in tires and beyond tires. Connecting associates via Microsoft 365 tools such as Outlook, SharePoint Online, OneDrive, Yammer, and Skype for Business Online helps us take better advantage of our cross-functional and geographically dispersed teams. Our multigenerational and multicultural global workforce is now sharing perspectives and ideas more quickly and easily than ever.

    Another focus for Goodyear is driving productivity and greater efficiencies so we can deliver the right products to the right place at the right time—faster than ever. With Microsoft 365, we stay connected and access shared content from anywhere, at any time, on any device, which allows us to innovate with greater speed. We also have an Internet of Things (IoT) proof of concept with Azure hosting big data from sensors in our factories. This allows us to use machine-learning capabilities to improve our predictive maintenance.

    It is critically important for Goodyear to keep our information and systems safe so we can help protect our customers, associates, and the company and keep our focus on the changing needs of the marketplace. We safeguard our people, devices, data, and IP better with Windows 10 and Enterprise Mobility + Security. This has greatly improved our security capabilities, while at the same time continuously hardening our devices by providing updates as a service.

    As our Microsoft 365 environment continues to mature, we’re evaluating the use of other Microsoft technologies. We’ve been using Microsoft HoloLens in exploration mode—it represents exciting possibilities for virtual tire modeling and design as well as remote knowledge transfer. We are also making strides advancing the AI infused in Microsoft 365, including Cortana, Azure-powered chatbots, and Focused Inbox, to amplify skills and democratize knowledge. The next big initiative on our horizon is adopting Microsoft Teams—we expect this to become our associates’ go-to social and collaboration platform.

    The digital world is changing fast, and we must lead with digital innovation. For Goodyear, it’s all about how we can enable everyone in the company to put the customer at the center of everything we do and ensure that we maintain our leadership position in the new mobility ecosystem of tomorrow. When you find and deploy solutions that bring out the best in your people, the possibilities are endless.

    —Sherry Neubert

    The post Goodyear mobilizes with Microsoft 365 to lead in a new mobility ecosystem appeared first on Microsoft 365 Blog.


