
Cross-Platform Container Builds with Azure Pipelines


This is a follow-up to Matt Cooper’s earlier blog post, “Using containerized services in your pipeline”. If you haven’t yet, I encourage you to read that post to understand the new container syntax in the pipeline definition.

As a program manager for Azure DevOps, I spend a lot of time speaking with customers about their DevOps practices. In a recent meeting, a development team was excited about Azure Pipelines and our Linux build agents that we manage in Azure, but they needed to build their application on CentOS instead of Ubuntu.

Like text editors, whitespace and the careful placement of curly braces, Linux distributions can be hotly debated among engineers. But one of the great things about Azure Pipelines is that you don’t need to rely on our choice of Linux distribution. You can just bring your own – using containers. It’s easy to create a Docker image that has the exact distribution that you want to run your builds on. Want to build on an older LTS version of Ubuntu like Trusty? No problem. Want to run the very latest RHEL or CentOS? That’s great, too.

Of course, the choice of distribution isn’t just a personal preference: there’s usually a solid technical reason for wanting a CI build on a particular platform. Often you want to perform your build on a system that’s identical — or nearly so — to the system you’re deploying to. And since Azure Pipelines offers a single hosted Linux platform, Ubuntu 16.04 LTS (Long-Term Support), this might seem like a problem if you want to build on a different distribution, like CentOS.

Thankfully, it’s easy to run your build in a CentOS container. And even better than building in a container with the base distribution, you can provide your own container that has the exact version of the dependencies that you want, so there’s no initial step of running apt-get or yum to install your packages.

I’m a maintainer of the libgit2 project and we recently moved over to Azure Pipelines using container builds. The project decided to adopt containers so that we could build on Ubuntu 14.04 (“Trusty”), which is libgit2’s oldest supported platform.

But what if we wanted to build libgit2 on a different distribution? Let’s walk through how to use Azure Pipelines to build this project on the latest CentOS image instead.

Creating an Image

The first thing we need to do is create an image that has our dependencies installed. That begins, of course, with the creation of the Dockerfile.

This description starts with a base Linux distribution image and adds in our dependencies, in much the same way that we used to do on each CI build.

FROM centos:7
RUN yum install -y git cmake gcc make \
    openssl-devel libssh2-devel openssh-server \
    git-daemon java-1.8.0-openjdk-headless

Once the Dockerfile is created, we can build our image:

docker build -t ethomson/libgit2-centos:latest .

And then push it up to Docker Hub:

docker push ethomson/libgit2-centos:latest

Finally, for maintenance and repeatability, we check these Dockerfiles in to a repository once we’ve created them.

Testing that image

One of my favorite things about using containers in my CI build is that I can also use the same containers in my local builds. That way I can make sure that my container is set up exactly how I want it before I push it up to Docker Hub or start my first build with Azure Pipelines.

This keeps the inner loop very tight when you’re preparing your CI system: since everything’s in a container, you can get things working on your local machine without experimenting on the CI system. So there’s no time spent provisioning a VM and no time spent downloading a git repository; it’s all ready to go locally.

The other great thing is that everything’s installed and running within the container. If you have test applications then they stay isolated. In my example, the libgit2 tests will optionally start up a git server and an ssh server. I’m much happier running those in a container than on my actual development box – and I’m lucky enough to work in a company where I’m actually able to start these on my local machine. For developers working in an environment with stricter controls on machine level changes like that, containers provide a fantastic solution.

And with Docker Desktop, you can do this even if you’re using Mac or Windows on your development box and building in a Linux container.

To run our build locally:

docker run \
    -v $(pwd):/src \
    -v $(pwd)/build:/build \
    -e BUILD_SOURCESDIRECTORY=/src \
    -e BUILD_BINARIESDIRECTORY=/build \
    -w /build \
    ethomson/libgit2-centos:latest \
    /src/ci/build.sh

What we’ve done here is mapped the current directory, our git repository, to /src on the container, and a subdirectory called build to the /build directory on the container.

We’ve also set two environment variables, BUILD_SOURCESDIRECTORY and BUILD_BINARIESDIRECTORY. This isn’t strictly necessary, but it’s useful since these are the variables used by the Azure Pipelines build agent. This means you can share your build scripts between a bare metal Azure Pipelines build agent and a container without any changes.
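
libgit2’s real ci/build.sh lives in the repository; purely as an illustration of that portability (and not libgit2’s actual build configuration), a script in the same spirit might look like this:

#!/bin/bash
# Sketch of a build script that runs unchanged on a bare agent or inside
# a container: it relies only on the two variables that Azure Pipelines
# sets for us (and that our docker run command above sets by hand).
set -e

SOURCE_DIR="${BUILD_SOURCESDIRECTORY:-$(pwd)}"
BUILD_DIR="${BUILD_BINARIESDIRECTORY:-$(pwd)/build}"

mkdir -p "$BUILD_DIR"
cd "$BUILD_DIR"

cmake "$SOURCE_DIR"      # configure against the mapped source directory
cmake --build .          # binaries land in the mapped output directory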

CI/CD in Azure Pipelines

One of the nice features of Azure Pipelines is that you get an actual virtual machine, which means that you can run your own Docker images as part of the CI/CD pipeline.

You can imagine this container as a bit of abstraction over the pipeline. Azure Pipelines will orchestrate multiple container invocations, each picking up where the other left off. Instead of simply invoking the compiler to take your source and build it, you run the compiler inside the container instead. The container is isolated, but since you have your source and binary directories mapped, you capture the output of the build to use in the next stage.

You can make this as coarse- or as fine-grained as you’d like. For libgit2, we have a script that does our build and one that runs our tests. The build script uses cmake within the mapped source directory to discover the container’s environment, configure the build and then run it. The binaries – our library and the tests – will be put in the mapped output directory.

The next step of our build runs the tests, again inside the container. The source and binary directories are mapped just like before, so the test step can pick up where the build step left off. In the example of libgit2, the test script will start some applications that we’ve pre-installed on the container (some network servers that the tests will communicate with) and then run the test applications that we compiled in the build step.
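
If you want to reproduce that second stage locally, the same docker run pattern from earlier works: keep the volume mappings identical and invoke the test script (ci/test.sh, the script the pipeline definition below calls), and it will find the binaries produced by the earlier build invocation:

docker run \
    -v $(pwd):/src \
    -v $(pwd)/build:/build \
    -e BUILD_SOURCESDIRECTORY=/src \
    -e BUILD_BINARIESDIRECTORY=/build \
    -w /build \
    ethomson/libgit2-centos:latest \
    /src/ci/test.sh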

libgit2’s test framework writes a report in JUnit-style XML, a common format that Azure Pipelines supports natively. In the next step of the build process, we simply publish that XML so Azure Pipelines can analyze it and display the test results.

Thus, the libgit2 build configuration looks like this:

resources:
  containers:
  - container: centos
    image: ethomson/libgit2-centos:latest

pool:
  vmImage: 'Ubuntu 16.04'

container: centos

steps:
- script: $(Build.SourcesDirectory)/ci/build.sh
  displayName: Build
  workingDirectory: $(Build.BinariesDirectory)
- script: $(Build.SourcesDirectory)/ci/test.sh
  displayName: Test
  workingDirectory: $(Build.BinariesDirectory)
- task: PublishTestResults@2
  displayName: Publish Test Results
  condition: succeededOrFailed()
  inputs:
    testResultsFiles: 'results_*.xml'
    mergeTestResults: true

I can check that file right in to my repository – if I name it azure-pipelines.yml and put it at the root of my repo, then Azure Pipelines will detect it during setup and streamline my configuration.

This happens when I set up Azure Pipelines for the first time through the GitHub Marketplace, or, if I’m already an Azure DevOps user, when I set up a new pipeline and select my repository.

Success!…?

I was excited to queue my first build inside a CentOS container but just as quickly dismayed: as soon as it finished, I saw that two of my tests had failed.

Tests Failed

But that dismay evaporated quickly and gave way to interest: although nobody wants to see red in their test runs, a failure should be indicative of a problem that needs to be fixed.

Once I started investigating, I realized that these were the SSH tests that were failing. And they were only failing when trying to connect to GitHub. It turns out that the version of the SSH library included in CentOS 7 is – well, it’s a bit old. It’s old enough that it’s only using older ciphers that GitHub has disabled. I’d need to build libgit2 against a newer version of libssh2.

At that point, I updated my Dockerfile to download the newest SSH library, build it and install it:

FROM centos:7
RUN yum install -y git cmake gcc make openssl-devel openssh-server \
    git-daemon java-1.8.0-openjdk-headless
WORKDIR "/tmp"
RUN curl https://www.libssh2.org/download/libssh2-1.8.0.tar.gz \
    -o libssh2-1.8.0.tar.gz
RUN tar xvf libssh2-1.8.0.tar.gz
WORKDIR "/tmp/libssh2-1.8.0"
RUN ./configure
RUN make
RUN make install
ENV PKG_CONFIG_PATH /usr/local/lib/pkgconfig

At this point, I built my new Docker image and pushed it up to Docker Hub. Once it was uploaded, I queued my new build. And now all our tests succeed.

Successful Build

This is a wonderful illustration of why it’s so important to my project to build on a variety of systems: not only do we have confidence that we work correctly on many platforms, we also understand the problems that our users might run into and how they can work around those problems.

Success.

The post Cross-Platform Container Builds with Azure Pipelines appeared first on Azure DevOps Blog.


Reflecting your feedback in Visual Studio 2019


We started sharing Visual Studio 2019 Previews with the goal to deliver the best possible experience through a more open dialog.

Over the last few months, we’ve seen lots of thoughtful and passionate discussion throughout the community: on the blog, in Developer Community, and on a number of social media sites. Through these previews, we were able to share some of the UI changes in this release.

We want to say thank you for sharing this feedback. Everyone working on the team has read through all the comments and we’ve done our best to respond to the different themes discussed in the threads.

Hopefully, by now you’ve had a chance to try the Visual Studio 2019 Preview. If not, we encourage you to try out some of the new experiences. There were a few popular themes in the feedback that we wanted to acknowledge and talk a bit about the changes we’ve already made based on your feedback.

Combining the title and menu bar

With the combined menu and title bar, we took the opportunity to give back valuable vertical space and update the layout of the upper shell controls. This also makes the new, more accurate Visual Studio search more discoverable and accessible. While we strove to do this without impacting your workflow, many users highlighted that the title bar was home to several information elements (the solution name, administrator mode, the experimental instance tag, and the preview badge) that were important for navigating between multiple instances of Visual Studio running at the same time.

When we first combined the title and menu bar, we were careful not to remove the solution name from the app title that appears when you ALT+TAB or hover over the app icon in the start menu. We found in controlled studies that these were the most commonly used elements for establishing context across instances of Visual Studio. As we started to test the changes internally, though, feedback surfaced that users still found the missing solution name jarring, so we started working on some potential solutions.

The community echoed another piece of feedback we’d heard in our studies: users often need more context than just the solution name. We experimented in Preview 1 and 2 with a new home for the solution name in the status bar, next to the existing branch switcher. While this design had the benefit of bringing solution and repo information closer together, it wasn’t very discoverable and didn’t work well with cascaded windows. Preview 3 now brings the solution name back into the combined title and menu bar and updates the surrounding controls to reserve space for it.

The Preview 3 title bar with the solution name as well as instance and preview tags

The title bar included other information in addition to the solution name. We took steps to preserve the other context that was previously available in the title bar, including administrator mode, the experimental instance, and the preview channel, by displaying them in a badge under the window state controls. The “Send feedback” and “Live Share” controls are available on the toolbar row, while the notification icon has a new home in the bottom right of the shell to better align with common notification patterns.

Blue theme changes

Another goal for Visual Studio 2019 was to make it easy to tell the difference when you have different versions of the IDE open, be that 2012, 2017 or 2019. We refreshed the blue theme, which we’d not touched since 2012, to indicate this step into the future.

What we found through the feedback is that some functional elements of the original blue theme were lost with the first iteration of the theme update. While most of you generally liked the new theme, some customers gave us feedback that it removed some of the spatial definition in the UI that helped them navigate around the IDE quickly and intuitively. Comments also pointed out that the blue theme updates had gotten too bright, adding eye strain over longer periods.

The changes in the blue theme from 2017 to 2019 as well as changes to the title bar, which increases the vertical height available.

For Preview 3, we’ve heightened the overall contrast, which in turn helps improve legibility, accessibility, and wayfinding throughout the IDE. We reduced the brightness of the base color and added a set of complementary and analogous accent colors. These changes preserve an at-a-glance difference between versions while tackling the functional feedback.

New Icons

While not the most glamorous part of a new release, the new product icon is an important way to identify our IDE in a busy taskbar, start menu, or desktop. It’s also another way that you can tell the difference between versions of our product. In our original post we explained the approach we’ve taken to simplify the icon for Visual Studio as well as create a system for showing the difference between a Preview and a fully released version.

Visual Studio for Mac 2017 (left) Visual Studio for Mac 2019 Preview (Middle) Visual Studio for Mac 2019 (Right)

We’ve now applied this approach to the Visual Studio for Mac icon which is available to download from our website. One of the challenges with the icon for Visual Studio for Mac was how to create an icon that was closely tied to the Visual Studio Family line of products but that stood apart from the Windows version of Visual Studio so as not to confuse you into thinking they’re the same (just on different platforms). We experimented with many ways of showing a differentiation between the two applications but settled on tying them together as a family, with a background shape that echoes the form of the Infinity Loop, and adds a macOS flavor to the icon.

Let’s continue the conversation

We are truly grateful for all the feedback we’ve received so far and hope to use it to continue to improve the experiences we’re delivering throughout the product. As you find bugs or have suggestions, the Developer Community site is the place to log these items for the team to review and for the benefit of the community. We’re eager to continue the discussion as you continue to use Visual Studio 2019.

The post Reflecting your feedback in Visual Studio 2019 appeared first on The Visual Studio Blog.

Announcing .NET Framework 4.8 Early Access Build 3745


As we get closer to the final version, our efforts are focused on stabilizing the release over the coming weeks. Please keep up the support by trying out our latest preview 3745 and provide any feedback you may have for the build or for .NET 4.8 overall, via the .NET Framework Early Access GitHub repository.

Supported Windows Client versions: Windows 10 version 1809, Windows 10 version 1803, Windows 10 version 1709, Windows 10 version 1703, Windows 10 version 1607, Windows 8.1, Windows 7 SP1

Supported Windows Server versions: Windows Server 2019, Windows Server version 1803, Windows Server version 1709, Windows Server 2016, Windows Server 2012, Windows Server 2012 R2, Windows Server 2008 R2 SP1

This build includes an updated .NET 4.8 runtime as well as the .NET 4.8 Developer Pack (a single package that bundles the .NET Framework 4.8 runtime, the .NET 4.8 Targeting Pack and the .NET Framework 4.8 SDK). Please note: this build is not supported for production use.

Next steps:
To explore the new build, download the .NET 4.8 Developer Pack. Alternatively, if you want to try just the .NET 4.8 runtime, you can download either of these:

You can check out the fixes included in this preview build, or if you would like to see the complete list of improvements in .NET 4.8 so far, please go here.

This preview build is also included in the next update for Windows 10. You can sign up for Windows Insiders to validate that your applications work great on the latest .NET Framework included in the latest Windows 10 releases.

Thanks!

The post Announcing .NET Framework 4.8 Early Access Build 3745 appeared first on .NET Blog.

IoT in Action: New innovations making IoT faster and simpler


As the Internet of Things (IoT) disrupts global business across every industry, opportunities abound. Partners are building on Microsoft IoT innovations and expanding solution accelerators, while customers of every size are reaping the rewards through increased productivity and efficiency, new revenue streams, and broader market share.

Read on to discover how Microsoft and our partners are making IoT faster, easier, and more cost effective through innovations in Windows IoT, Azure IoT, and Azure Sphere. For greater depth and inspiration, as well as some fantastic networking opportunities, be sure to attend the IoT in Action global event in Nuremberg on February 25, 2019.


New innovations in Azure IoT

Greater compatibility and flexibility

Azure IoT is evolving to make it more compatible with a growing list of technologies, offering far greater choice and flexibility. In 2018, Microsoft announced the general availability of Azure IoT Edge. And as of February, Azure IoT Edge is now able to run on virtual machines (VMs) using a supported operating system. Also announced in February: the Azure IoT Hub Java SDK will now support the Android Things platform.

Further integration with artificial intelligence and machine learning

Microsoft partners and customers are driving remarkable innovation through integration of Azure IoT with artificial intelligence (AI) and machine learning. Azure IoT enables customers to build AI-based experiences in a range of applications from cloud to edge. Azure AI Services incorporates machine learning to enable deeper control and customization.

Azure Stream Analytics, now available on IoT Edge, uses machine learning for anomaly detection and substantially reduces the costs and effort required for building and training machine learning models.

Examples of innovations using these technologies range from BMW Group’s upcoming in-car Intelligent Personal Assistant to products like the QuickSet Cloud platform that powers millions of connected devices in homes through customers including Comcast, Sony, LGI, Samsung and others.

The latest in Windows IoT

Long-term support

Windows IoT offers powerful edge-computing capabilities, integrates easily with Azure, and is ideally suited for those who are already familiar with Windows. One of the biggest enhancements to Windows IoT comes in the form of support: Microsoft is now offering long-term, comprehensive support—including silicon and security updates—that provides an extra level of confidence for customers.

Azure Device Agent 2.0

From an innovation standpoint, efficiency, security, simplicity, and repeatability are some key focus areas. Recently released for public preview, Azure Device Agent 2.0 ticks these boxes, providing everything needed to connect a solution to Azure easily, including manageability and device provisioning. Plug and play (PnP) functionality will be coming soon.

WinML

Microsoft is investing heavily in cloud connectivity and helping to accelerate customers’ journeys to the cloud and the intelligent edge. For machine learning on the edge, Microsoft has released WinML. This set of APIs allows developers to use pre-trained machine learning models for intelligent edge devices, allowing AI compute tasks to be done locally. Benefits include improved performance, reduced bandwidth demands and cloud compute costs, and real-time results delivery.

Innovative IoT security solution: Azure Sphere

Recently shortlisted as a finalist for the Embedded World awards, Azure Sphere offers watertight security for connected microcontroller (MCU) devices at the intelligent edge. Azure Sphere includes three pivotal components that work together to lock down device security. The Azure Sphere MCU provides real-time and application processing on Microsoft’s secure silicon architecture. The built-in Azure Sphere OS offers multiple layers of security. And the turnkey security service provides device-to-device and device-to-cloud security, detects threats, and ensures certificate-based authentication.

Seeed Studio, a partner specializing in hardware development, prototyping, and manufacturing, offers an Azure Sphere Development Kit that matches the design of Microsoft’s reference development board (RDB). They also created a hardware ecosystem with access to over 100 Grove sensor modules to support rapid prototyping. Meanwhile, for developers who want to jumpstart IoT solution design and accelerate development, Avnet offers an Azure Sphere Starter Kit. The kit serves as a secure, fundamental building block that makes it very easy to have a secure device.

In addition to its Starter Kit, Avnet has unveiled two fully certified modules to reduce cost and speed your time-to-market with Microsoft’s Azure Sphere. AI-Link, a subsidiary of the ChangHong group, released the first Azure Sphere module that is ready for mass production. AI-Link is a top IoT module developer and manufacturer and shipped more than 90 million units in 2018.

Microsoft partners are also finding ways to innovate and secure their solutions using Azure Sphere. For instance, LEONI has found a way to make their cable systems smarter, more secure and more reliable. Azure Sphere enables LEONI to quickly connect digital solutions and customer applications, achieve improved hardware performance, and ensure increased security around data and equipment.

Then there are qiio and Feldschlösschen Breweries, a subsidiary of the Carlsberg Group, who have an Azure-enabled beer brewer that sends device utilization, health, and performance data to the cloud via qiio’s end-to-end IoT solution. Azure Sphere securely connects the brewer, protecting the device from security breaches and giving Feldschlösschen peace of mind and useful insights around their operations.

Meet these innovative Microsoft partners – LEONI, Avnet, Seeed Studio, qiio, and AI-Link – at Embedded World in Nuremberg to see how they are using Azure Sphere to build transformative solutions.

Register for IoT in Action in Nuremberg on February 25

Register for the IoT in Action global event in Nuremberg on February 25 to find out how these and other Microsoft and partner IoT innovations are driving business transformation in every industry.

For those that want more hands-on learning, join us in Houston on April 16 for a Solution Builder Conference that explores the intelligent edge, IoT cognitive services, hybrid cloud, and IoT solution accelerators. This event is also a fantastic opportunity to talk to experts and build connections in Microsoft’s thriving customer and partner ecosystem.

Visit the Microsoft booth at Embedded World & Mobile World Congress

Explore how Microsoft and its partners’ innovative IoT solutions, using Azure Sphere and Windows IoT, are fueling digital transformation across verticals when you attend one of the following events:

•    Mobile World Congress, February 25-28 (Barcelona, Spain)

•    Embedded World, February 26-28 (Nuremberg, Germany)

Use your favorite extensions with Visual Studio 2019


Do you want to try the preview of Visual Studio 2019 but worry that your favorite extensions aren’t supported yet? A record number of extensions have already added support for Visual Studio 2019. So there is a good chance your favorite extensions are among them. In fact, more than 850 extensions are currently available, and more are being updated every day.

Never have there been as many extensions available with support for a new major version of Visual Studio as there are for Visual Studio 2019 – and it is still in preview!

If you don’t see your favorite extension listed, then send feedback to the owner by leaving a comment on their individual Marketplace or GitHub page, and let them know you want to use it with Visual Studio 2019. Check out this great guide on how to add the support, which extension authors can follow in just a few simple steps.

Partners like DevExpress, Progress/Telerik, Whole Tomato, and many others worked hard to get their extensions ready to support Visual Studio 2019 early on. So extensions such as the very popular CodeRush, Visual Assist, CodeMaid, and Telerik ASP.NET MVC are all there for you to enjoy.

Move your extensions

If you have a lot of extensions installed in previous versions and want to move them over to Visual Studio 2019, then you can now use a handy extension to make that easier. Install Extension Manager 2017 and 2019 respectively in each version of Visual Studio to first export your extensions from 2017 and then import them in 2019.

The Extension Manager 2019 extension showing up in the new Extensions top-level menu in Visual Studio 2019

Extensions get their own top-level menu in Visual Studio 2019. This gives the ecosystem more prominence and declutters the top-level menu when you have a lot of extensions installed.

Try it out

Thanks to the ecosystem of extenders and partners, we now have a very full-featured set of extensions ready for Visual Studio 2019. If you haven’t already, this is a good time to install the preview of Visual Studio 2019 and try out some of the popular extensions that let you customize the IDE and make you more productive.

Remember to tune in to the official launch event of Visual Studio 2019 on April 2nd.

The post Use your favorite extensions with Visual Studio 2019 appeared first on The Visual Studio Blog.

Top Stories from the Microsoft DevOps Community – 2019.02.22


Here in England, we’ve got some lovely spring-like weather coming for the weekend. I can’t wait to put the computer away, get the grill out and enjoy the sunshine. But before I unplug, I wanted to share with you some of the great articles that I found this week around DevOps.

Archie: Easy cross-compilation for busy developers
It’s not much of a secret that I want more builds on more platforms for my software because I want as much validation as I can get for every pull request and every change. I don’t know, but I suspect that Jay Rodgers shares this thinking, because he’s busy making cross-compilation easier for Azure Pipelines.

Using the Basic Process Template in Azure DevOps to make support management easier
The new Basic Process Template is one of my favorite new additions to Azure DevOps – it’s just so simple that it’s perfect for so many simple projects where even Agile or Scrum have too much ceremony. Matteo Emili introduces the new Basic Process and how you can use it for your Lean projects.

Azure DevOps Podcast: DevOps Capabilities in Azure
Jeffrey Palermo is joined by Scott Hunter, the Director of Program Management of .NET at Microsoft. They discuss the differences between .NET Core and .NET Framework, what’s happening in .NET Core 3.0 and how everything fits in to .NET DevOps on Azure.

Managing projects in a modern world
Last week, Mike Neill dropped a blog post with an intriguing screenshot of his project dashboard, and a lot of people took note. He’s back with a series of blog posts that show you how to get visibility into your projects using Azure DevOps and discuss them with your team using Microsoft Teams.

Continuous Delivery and DevOps with Azure DevOps: The Big Picture
Marcel de Vries just launched some great video training on Pluralsight, giving a complete introduction to DevOps and Continuous Delivery practices with Azure DevOps. If you’re looking to implement CD and DevOps within your organization, this is a great piece of training to have in your arsenal.

As always, if you’ve written an article about Azure DevOps or find some great content about DevOps on Azure then let me know! I’m @ethomson on Twitter.

The post Top Stories from the Microsoft DevOps Community – 2019.02.22 appeared first on Azure DevOps Blog.

What’s New with the AI and Machine Learning Tools for Python: February 2019 Update


Across Visual Studio Code and Azure Notebooks, January brought numerous exciting updates to the AI and Machine Learning tooling for Python! This roll-up blog post recaps the latest product updates as well as the upcoming events for AI and Machine Learning:

  • Python Interactive in VS Code now comes with IPython Console support
  • Ability to work with Azure Machine Learning service in VS Code and Azure Notebooks
  • Azure Notebooks Connect() News
  • New resources
  • Upcoming Events

What’s new?

Python Interactive in VS Code

The Python extension for VS Code first introduced an interactive data science experience in last October’s update. With that release, we brought the power of Jupyter Notebooks into VS Code. Many feature additions have been released since, including remote Jupyter support, the ability to export Python code to Jupyter Notebooks, and more.

The most noticeable enhancement in the Jan 2019 update allows code to be typed and executed directly in the Python Interactive window. Now, the window effectively turns into an IPython console that can be used standalone as well as in conjunction with the code editor. As well, the Jan 2019 update brings general Python goodness, such as improved diagnostics for failed tests with pytest and much faster outline view with the Python Language Server. See blog Python in Visual Studio Code – January 2019 Release for the full list.

Work with Azure Machine Learning service in VS Code and Azure Notebooks

To add some simplicity to the complex world of ML pipelines, Microsoft announced the general availability of the Azure Machine Learning service in December. This service eases and accelerates the building, training, and deployment of your machine learning models from the cloud to the edge.

The Azure Machine Learning extension for VS Code provides an easy way to manage the Azure Machine Learning service. This includes controls over experiments, pipelines, compute, models, and services, all from within VS Code. Interested? Check out how to get started with the AML extension.

Additionally, Azure Notebooks offers a seamless way to take advantage of these new APIs within our hosted Jupyter notebook experience. You can join thousands of others who have tried out our Getting Started with the Azure Machine Learning service sample project. You can also connect directly with Azure Notebooks from your AML Workspace.

Azure Notebooks Connect() News

For those who missed our exciting Connect() Announcements, Azure Notebooks released new integrations with the Azure ecosystem as well as a fresh UI. With these improvements, we hope to help data scientists achieve greater productivity on our platform.

For workloads that need a bit more power beyond our free compute pool, Notebooks now allows you to connect to any SKU of Data Science Virtual Machine. Through this, users can take advantage of the full suite of Azure compute capabilities, right from Azure Notebooks. Read more in our documentation.

We encourage users to try out these new compute options as well as the exciting Azure Machine Learning service integration mentioned above to further productivity on the Azure Notebooks platform.

New resources

Beyond feature releases, there are some great new demo resources from the Ignite Tour:

Upcoming Events

Our Python tooling team as well as many of the Microsoft Cloud Developer Advocates have a few exciting events coming up:

  • The Ignite Tour continues globally
  • PyCascades is a new regional Python conference in the Pacific Northwest this weekend from February 23-24

Tell us what you think

As always, we look forward to hearing your feedback! Please leave comments below or find us on GitHub (VS Code Python extension, Azure Notebooks) and Twitter (@pythonvscode, @AzureNotebooks).

The post What’s New with the AI and Machine Learning Tools for Python: February 2019 Update appeared first on Python.

Cloud-based load testing service end of life


When Visual Studio 2019 Preview 1 shipped in early December, we announced our plans to deprecate the load test functionality in Visual Studio. Visual Studio 2019 will be the last version of Visual Studio with web performance and load test features.

To give our customers plenty of notice, I also wanted to let everyone know that we plan on closing down the corresponding Azure DevOps cloud-based load testing service on March 31st, 2020.

Why

Load testing helps you ensure that your apps can scale and do not go down when peak traffic hits. Although we have been shipping our load testing tools and our cloud-based load testing service for many years, unlike our other services their adoption has not been growing. While it is difficult for me to pinpoint one specific reason for this, there are a few contributing factors, including:

  • Load testing is typically initiated for seasonal events such as tax filing season, Black Friday, Christmas, summer sales, etc. The classic example I always like to give is the NORAD Santa Tracker
  • Load testing requires a certain level of expertise to ensure you have confidence in the results. This ranges from understanding the application and deployment architecture and designing load tests, to authoring and executing tests at scale and analyzing the results to identify performance and application bottlenecks.
  • We’ve found that very few organizations rely on in-house expertise for this. Instead, most prefer to engage consultants to help them.

Also, frankly, I feel that our offering has fallen behind. When I look around at other offerings in this space (open source as well as commercial offerings that sometimes include consulting services) I honestly feel they are now better suited to meet the needs of our customers.

Please do not think that we are saying that load testing is not important. On the contrary – load testing, done well, remains relevant and will continue to be very important for some of our customers. But I think this need is now best addressed by making sure we have great partner provided or open source load testing options integrated into Azure DevOps.

Timeline for deprecation

  1. Visual Studio and Test Controller/Test Agent for on-premises load testing: Visual Studio 2019 will be the last version of Visual Studio with the web performance and load test capability. Visual Studio 2019 is also the last release for Test Controller and Test Agent (installed via the ‘Agents for Visual Studio’ SKU) for setting up a load test rig on-premises or in a virtual machine.

    While no new features will be added, load test in VS 2019 will continue to be supported for any issues that may arise during the support lifecycle of the product. As outlined in the product lifecycle and servicing documentation, you can continue to use the on-premises offering and will be fully supported for the next 5 years and an additional 5 years of extended support is also available should you need it. Different Visual Studio versions can be installed side-by-side on the same machine. This means that you can continue to use Visual Studio 2019 to maintain your on-premises load tests, while using a newer Visual Studio version when it becomes available in the future for other development needs.

    Visual Studio web performance tests (.webtest files) are tied to the load test functionality and are also deprecated. While it was never designed for it specifically, I know of a few customers who have used .webtests for other purposes, such as running API tests. I would encourage them to take a look at some of the other API testing alternatives that are available in the market that would better serve their needs. SOAP UI is a free, open source alternative to consider and is also available as a commercial option with additional capabilities.

  2. Shutting down the cloud-based load testing service: Our cloud-based load testing service will continue to run through March 31st, 2020. You can continue to use all the experiences powered by this service without interruption until then. After the service goes offline, you can continue to use the tests as outlined below while you evaluate alternatives.

    The cloud-based load testing service is also used in the following places, which will no longer be available after March 31st, 2020.

    • Running load tests in Azure Pipelines using the load testing tasks (the cloud-based load test task, the Apache JMeter task, and the cloud-based web performance test task): These tasks will stop working after March 31st, 2020 and will be removed.
    • Cloud-based load tests in CI/CD pipelines will run until March 31st, 2020. After that, you will need to self-host them and run them using the Test Controller/Test Agent setup, invoking the tests through the mstest.exe command line (see the sketch below this list).
    • Running performance tests using the Azure portal (performance test option in App Services and Application Insights blades): These options will no longer be available after March 31st, 2020.
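
    As a rough sketch of what that self-hosted invocation can look like once your test rig is configured (the .loadtest and .testsettings file names below are placeholders for your own artifacts, and mstest.exe is run from a Visual Studio developer command prompt on the machine that drives the rig):

    mstest.exe /testcontainer:LoadTest1.loadtest /testsettings:remote.testsettings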

Load testing Alternatives

There are many alternatives, both free and commercial tools that you can consider. For instance, Apache JMeter is a free, popular open source tool with a strong community backing. It supports many different protocols and has rich extensibility that can be leveraged to customize the tool to your needs. Many commercial services such as Blazemeter support running Apache JMeter tests.
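
If you’re evaluating JMeter, a typical non-GUI run from the command line looks something like this (loadtest.jmx is a placeholder for your own test plan):

jmeter -n -t loadtest.jmx -l results.jtl -e -o ./report

Here -n runs JMeter without the GUI, -t points at the test plan, -l records the raw results, and -e/-o generate an HTML report in the given output folder.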

If you use code based tests for load testing and .NET is your platform of choice then tools such as Neoload, Micro Focus Silk Performer and Micro Focus Load Runner are options to consider.

In addition, extensions from several load test vendors such as SOASTA (now Akamai CloudTest), Apica Loadtest and Load Impact are available in the Azure DevOps and Azure marketplace.

More Information

Deprecating features is always hard and therefore not something we ever do lightly. We recognize that this announcement will be disruptive for the people that are using the service today and we will be working with affected customers to ensure they are aware of the changes. Premier support for Enterprise can be engaged for help with migrating tests to alternatives – they can be reached via email on premdevinfo@microsoft.com

If you have any further questions or concerns then please send us an email at vsoloadtest@microsoft.com or contact me directly.

The post Cloud-based load testing service end of life appeared first on Azure DevOps Blog.


Azure Hybrid Benefit for SQL Server


In my previous blog post I talked about how to migrate data from existing on-prem SQL Server instances to Azure SQL Database. If you haven’t heard, SQL Server 2008 end of support is coming this summer, so it’s a good time to evaluate moving to Azure SQL Database.

If you decide to try Azure, chances are you will not be able to immediately move 100% of your on-prem databases and relevant applications in one go. You’ll probably come up with a migration plan that spans weeks, months or even years. During the migration phase you will be spinning up new instances of Azure SQL Database and turning off on-prem SQL Server instances, but for a little bit of time there will be overlap.

To help manage the cost during such transitions we offer the Azure Hybrid Benefit. You can convert 1 core of SQL Enterprise edition to get up to 4 vCores of Azure SQL Database at a reduced rate. For example, if you have 4 core licenses of SQL Enterprise edition, you can receive up to 16 vCores of Azure SQL Database.

If you want to learn more check out the Azure Hybrid Benefit FAQ and don’t forget, if you have any questions around migrating your .NET applications to Azure you can always ask for free assistance. If you have any questions about this post just leave us a comment below.

The post Azure Hybrid Benefit for SQL Server appeared first on ASP.NET Blog.

Microsoft and SAP extend partnership to Internet of Things

$
0
0

The Internet of Things (IoT) is becoming mainstream. Companies are seeing market-making benefits from IoT and deploying at scale – from transforming operations and logistics, remote monitoring, and predictive maintenance at the edge to new consumer experiences powered by connected devices. In all of these solutions, IoT data and AI are producing powerful insights that lead to new opportunities.

Today at MWC 2019, we’re announcing that Microsoft and SAP are extending our partnership to IoT. SAP Leonardo IoT will integrate with Azure IoT services providing our customers with the ability to contextualize and enrich their IoT data with SAP business data within SAP Leonardo IoT to drive new business outcomes.

Microsoft has collaborated with SAP for over two decades to enable enterprise SAP solution deployments which include Azure, Windows Server, and SQL Server. Microsoft and SAP have also collaborated in the Industrial Internet Consortium, the OPC Foundation, and the Plattform Industrie 4.0 for many years, jointly helping to define and build products on open industrial interoperability and security standards. Last fall, we also announced the Open Data Initiative with SAP and Adobe, designed to eliminate data silos and deliver world-class customer experiences.

Now SAP and Microsoft are expanding their partnership to physical devices and assets with a new collaboration in the IoT space. As part of this partnership, SAP Leonardo IoT will leverage services from Azure IoT, including Azure IoT Hub and Azure IoT Edge, to provide access to secure connectivity at scale, powerful device management functionality, and industry-leading edge computing support.

By harnessing the power of Microsoft Azure IoT and SAP Leonardo IoT, we provide our customers with the ability to intelligently combine business data to provide industrial IoT capabilities and services to be consumed by SAP business applications. This provides our joint customers with a complete view on their data from physical assets to business processes to customer relationships and offers a full digital feedback loop.

As part of this partnership, we’ll also enable customers to seamlessly extend their SAP solution-based business processes to the Azure IoT Edge platform. Using SAP Essential Business Functions from SAP Leonardo IoT Edge, customers will have the ability to opt for Microsoft Azure IoT Edge as a runtime environment for SAP Leonardo IoT Edge Essential Business Functions. Deployed on enterprise-ready certified devices, leveraging the highly secure Azure IoT Edge platform, customers can reduce their dependency on latency, bandwidth, or connectivity while creating immersive business experiences that are highly responsive and contextually aware.

To learn more, read today’s announcement from SAP.

MWC 2019: Azure IoT customers, partners accelerate innovation from cloud to edge


The Internet of Things (IoT) has expanded the world of computing far beyond mobile and PC, bringing a new and ever-growing class of cloud-connected devices that is on track to reach 20 billion devices by 2020. This year’s Mobile World Congress (MWC) programming reflects this profound shift, where IoT is transforming industries from agriculture to retail, leveraging emerging technologies including AI, Mixed Reality, edge computing, 5G, and more to not only accelerate business but to also address societal issues like improving our global food supply, reducing energy use, and waste.

IoT unlocks the power of the intelligent cloud and intelligent edge, enabling businesses to take informed actions based on real-time insights from any physical part of their business. Customers including Chevron, Volkswagen, Kohler, CBRE, Thyssenkrupp, and more are embracing IoT as critical to their technology portfolio and using it to optimize business processes, create new connected experiences, and manage digital and physical assets at scale.

Today, Microsoft made a series of announcements for new devices and cloud services that will further increase the strategic value of IoT. Microsoft boasts one of the fastest growing IoT partner ecosystems in the market, with 10,000 IoT partners developing solutions spanning the intelligent edge to the intelligent cloud, and 1,500 IoT solutions built by partners.

Through this partner and solution ecosystem, we can jointly serve customers in their mission to find business value from IoT, no matter what their industry or solution needs are.

Announcing new IoT partnerships for global-scale IoT solutions

This week, we are announcing new partnerships to enable global-scale IoT solutions with SAP, Inmarsat, and myDevices.

SAP Leonardo IoT and Azure IoT integration: Today at MWC, we are announcing new integrations between SAP Leonardo IoT and Azure IoT to deliver a complete solution for customers that simplifies the collection and ingestion of data and streams it into familiar business applications that can act on it, such as SAP S/4HANA. SAP Leonardo IoT will leverage Azure IoT services, including Azure IoT Hub and Azure IoT Edge, to provide access to market-leading secure connectivity, powerful device management functionality, and a global-scale data ingestion engine. This joint solution will enable SAP Leonardo IoT to fully manage physical assets in a secure manner while streaming the data they produce to SAP’s portfolio of fully integrated business applications used by many of the world’s largest companies, ultimately creating improved customer experiences.

We are also adding enterprise-grade capabilities to the edge. Customers can now run SAP Leonardo IoT Edge Business Essential Functions on the highly secure Azure IoT Edge platform deployed on enterprise-ready certified devices. This will enable customers to seamlessly extend their SAP enterprise business processes to the edge, reducing their dependency on latency, bandwidth or connectivity, while creating immersive business experiences that are highly responsive and contextually aware.

Inmarsat and Microsoft collaborate to bring cloud-powered industrial IoT to global supply chain: Inmarsat, a world leader in global, mobile satellite communications services, is collaborating with Microsoft to enable its customers to transfer data collected by their Industrial IoT solutions to the Microsoft Azure IoT Central platform. Azure customers will also be able to access Inmarsat’s global, highly reliable and secure satellite communications network, enabling them to connect their IoT infrastructure to cloud-based applications. The collaboration will initially focus on the delivery of Industrial IoT-based solutions to the agriculture, mining, transportation, and logistics sectors. Customers will gain access to a variety of tools that will help connect anything to anything, bringing together assets in the physical world with applications in the digital world, no matter how remote the location with the power of the intelligent edge.

myDevices to connect LoRaWAN Sensors to Microsoft Azure: myDevices, which helps enterprises quickly design, prototype, and commercialize IoT solutions, today announced a powerful collaboration with Microsoft that empowers users to onboard hundreds of LoRaWAN devices and instantly send data directly to Microsoft Azure. myDevices has amassed one of the most extensive known catalogs of pre-configured LoRaWAN sensors and gateways, consisting of nearly 200 different devices from over 50 hardware manufacturers around the world. The devices range from standard indoor temperature sensors to industrial-strength tank monitoring devices and everything in between. All of the sensors and gateways include a scannable QR code that is used with myDevices’ IoT in a Box mobile application. Scanning the QR code with the app connects the device, decodes the payload, normalizes the data for interoperability, and provides the user with features such as sensor activity logs, time-series visualization charts, sensor maps, customizable alerts, corrective action reports, permission-based user management, white label deployments, and more.

Azure IoT partners and customer solution demos

At MWC we have several partners showcasing Azure IoT solutions in our booth across industries ranging from manufacturing and healthcare to real estate and retail:

  • CBRE, the world’s largest commercial real estate services and investment firm, will be demonstrating CBRE Host. The Host mission is to increase individual well-being, personal productivity and organizational effectiveness through people-led, technology-enabled services. CBRE Host uses Azure Digital Twins to model its workplaces and derive insights about how space is being used.
  • Cradlepoint, the global leader in cloud-delivered LTE and 5G wireless network edge solutions, is partnering with Microsoft to create an integrated solution powered by Azure IoT Central that will make it faster and easier for enterprises to create distributed enterprise IoT solutions. Cradlepoint makes it simple to securely connect, manage, and monitor thousands of IoT devices using a combination of LTE-as-a-WAN and real-time cloud-based management.
  • qiio and Feldschlösschen Breweries, a subsidiary of the Carlsberg Group, will be showcasing an Azure-enabled beer brewer that sends device utilization, health, and performance data to the cloud via the end-to-end IoT solution of qiio. Azure Sphere securely connects the brewer, protecting the device from security breaches, giving Feldschlösschen a peace of mind and useful insights around their operations.
  • Sensoria Health with partner Optima Molliter are solving for health needs with smart aging digital solutions, such as the first smart diabetic footwear product, MOTUS Smart powered by Sensoria, that monitors patient compliance to a clinician’s prescribed mechanical offloading protocol to help reduce the risk of amputations.
  • Toyota Material Handling Europe created new and evolved “lean” processes that leverage AI to help service technicians optimize tasks and lower inefficiencies. Toyota is showcasing an autonomous pallet drone that identifies safety hazards with AI-enabled cameras and processes the data at the edge with Azure IoT Edge to cut down latency and response times.

If you’re at MWC, make sure to stop by our booth 3N30 in Hall 3 to see the demos in person and learn more about the Azure IoT platform.

Azure.Source – Volume 71


Now in preview

Preview: Distributed tracing support for IoT Hub

Announcing distributed tracing support for IoT Hub now in public preview. As with most IoT solutions, including our Azure IoT reference architecture, an IoT message travels from a device through a dozen or more services before it is stored or visualized. It can be very challenging to pinpoint when something has gone wrong in the flow. To completely understand the flow of messages through an IoT Hub, you must trace each message's path using unique identifiers. IoT Hub is a managed service, hosted in the cloud, that acts as a central message hub for bi-directional communication between your IoT application and the devices it manages. You can use Azure IoT Hub to build IoT solutions with reliable and secure communications between millions of IoT devices and a cloud-hosted solution backend. You can connect virtually any device to IoT Hub.

Screenshot showing UI for enabling distributed tracing for Azure IoT Hub

Now generally available

Update to Azure DevOps Projects support for Azure Kubernetes Service

Kubernetes is gaining strength as adoption across the industry continues to grow. However, many customers coming to container orchestration for the first time are also building familiarity with Docker and containers in general. To help with container adoption, we updated our Azure Kubernetes Service (a fully managed Kubernetes container orchestration service) and released Azure DevOps Projects (a simplified experience to help you launch an app on an Azure Service of your choice) to help you deploy multiple apps to a single Azure Kubernetes Service (AKS) cluster. These features are now generally available in the Azure portal.

Animation showing how to use Azure DevOps Projects in the Azure portal to deploy apps to an AKS cluster

More reliable event-driven applications in Azure with an updated Event Grid

Event-driven programming as a core building block for cloud application architecture has been on the rise. Enabling you to build more sophisticated, performant, and stable event-driven applications in Azure is important. Announcing the general availability of features previously in preview: dead lettering, retry policies, Storage Queues as a destination, Hybrid Connections as a destination, and manual validation handshake. To take advantage of these features, use the 2019-01-01 API and SDKs. If you are using the CLI or PowerShell, use versions 2.0.56 or later for the CLI and 1.1.0 for PowerShell.
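
As a hedged sketch with a recent Azure CLI (flag names have shifted slightly across CLI versions, and every name and resource ID below is a placeholder), dead lettering and a retry policy can be configured when the event subscription is created:

az eventgrid event-subscription create \
  --name myEventSubscription \
  --source-resource-id /subscriptions/<subscription-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/myStorage \
  --endpoint https://example.com/api/events \
  --deadletter-endpoint /subscriptions/<subscription-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/myStorage/blobServices/default/containers/deadletter \
  --max-delivery-attempts 10 \
  --event-ttl 120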

Class schedules on Azure Lab Services

Classroom labs in Azure Lab Services makes it easy to set up labs by handling the creation and management of virtual machines, enabling infrastructure to scale. Schedules management is one of the key features requested by classroom labs customers who also need to easily create, edit, and delete schedules. Through continuous enhancements, the latest deployment of Azure Lab Services now includes added support for class schedules.

News and updates

Modernize alerting using Azure Resource Manager storage accounts

Azure Monitor is a unified monitoring service that includes alerting and other monitoring capabilities. Classic alerts in Azure Monitor reach retirement in June 2019. We recommend that you migrate the classic alert rules defined on your storage accounts if you want to retain alerting functionality with the new alerting platform. If you have classic alert rules configured on classic storage accounts, you should upgrade your accounts to Azure Resource Manager (ARM) storage accounts before you migrate the alert rules.

Technical content

Use GraphQL with Hasura and Azure Database for PostgreSQL

Azure Database for PostgreSQL provides a fully managed, enterprise-ready community PostgreSQL database as a service for easily migrating existing apps to the cloud or for developing cloud-native applications using the languages and frameworks you choose. Learn how to take advantage of the Hasura GraphQL Engine that can instantly provide a real-time GraphQL API on a PostgreSQL database.
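
As a minimal sketch (the server name, credentials, and database below are placeholders), the Hasura GraphQL Engine can be pointed at an Azure Database for PostgreSQL server with a single docker run:

docker run -d -p 8080:8080 \
  -e HASURA_GRAPHQL_DATABASE_URL="postgres://myuser%40myserver:mypassword@myserver.postgres.database.azure.com:5432/mydb" \
  hasura/graphql-engine:latest

Note the %40: Azure Database for PostgreSQL usernames take the user@servername form, so the @ has to be URL-encoded in the connection string.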

Introduction to Linux on Azure

An introduction to running a Linux virtual machine on Azure. This workshop has been a collaboration between Researc/hers Code and Microsoft. Researc/hers Code supports women in tech and academia by running skills workshops and podcasting the talent of women in tech and research.

New Reference Architecture: Batch scoring of Spark models on Azure Databricks

Reference architectures provide a consistent approach and best practices for a given solution. Each architecture includes recommended practices, together with considerations for scalability, availability, manageability, security, and more. The full array of reference architectures is now available on the Azure Architecture Center. This reference architecture shows how to build a scalable solution for batch scoring an Apache Spark classification model.

Six tips for securing identity in the cloud

Many customers are turning to cloud services as an asset in fighting evolving cybersecurity threats. In this three-part series on Azure Government security, learn to use best practices for securing your Azure Government resources with essential steps needed to secure identities in the cloud. Also learn specific actions you can take to create more secure identity management within your agency or organization.

Diagram showing the built-in security controls in Azure

2018 Guidance from AzureCAT: SAP on the Microsoft Platform

Technical documentation is vital for getting up to speed and staying up to date with features and industry trends. This past year was a busy one for the Azure Customer Advisory Team. Stay informed with this useful reference list of all the SAP guidance that was published or refreshed in 2018.

Create a CI/CD pipeline for your Azure IoT Edge solution with Azure Pipelines

New CI/CD tools can help developers deliver value faster and more transparently, but the need for customized scripts that address different kinds of edge solutions still presents a challenge for some CI/CD pipelines. Now, with the Azure IoT Edge task in Azure Pipelines, developers have an easier way to build and push the modules in different platforms and deliver to a set of Azure IoT Edge devices continuously in the cloud.

Getting Started with Ansible on Azure

Cloud Advocate Jay Gordon discusses how to get started with Ansible on Azure. You'll learn the easy first steps to use Ansible from Cloud Shell and create a Linux VM!

Thumbnail from Getting Started with Ansible on Azure on YouTube

Cross-Platform Container Builds with Azure Pipelines

Choosing a distribution isn't just a matter of personal preference; there is usually a solid technical reason for wanting a CI build on a particular platform. To aid in developing your CI/CD pipeline, Azure Pipelines can run your build inside your own Docker image, with the exact distribution and dependency versions you want. Now you can have confidence that your build runs correctly on whatever platform you choose.

Keep Calm, and Keep Coding with Azure Cosmos DB and Node.js

In this quick read, John Papa shows you how to get up and running - with links to docs to get started ASAP. In John's case, he wanted a list of heroes from his database ("Just give them to me without making me work so hard!") and he shares how the Azure Cosmos DB SDK delivers with a simple line of code.
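As a rough sketch of that flow (not John's exact code; the account endpoint, key, and database/container names are placeholders), here is how fetching a list of heroes can look with the @azure/cosmos SDK:

// A minimal sketch using the @azure/cosmos SDK to read a list of heroes.
import { CosmosClient } from "@azure/cosmos";

const client = new CosmosClient({
  endpoint: process.env.COSMOS_ENDPOINT!, // e.g. https://myaccount.documents.azure.com
  key: process.env.COSMOS_KEY!,
});

async function getHeroes() {
  const container = client.database("heroes-db").container("heroes");
  // One query, one call to fetch all results.
  const { resources: heroes } = await container.items
    .query("SELECT * FROM heroes h")
    .fetchAll();
  return heroes;
}

getHeroes().then((heroes) => console.log(heroes)).catch(console.error);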

John Papa’s Sketchnote of Cosmos and Node Together

Quick look at the Azure Shared Image Gallery

Shared Image Gallery is a service that helps you build structure and organization around your custom managed VM images. Using a Shared Image Gallery you can share your images to different users, service principals, or AD groups within your organization. Shared images can be replicated to multiple regions, for quicker scaling of your deployments. In this post, Thomas Maurer provides an overview and shows how to get started.

AZ-202 Microsoft Azure Developer Certification Transition Study Guide

Microsoft has published the exam guide for AZ-202 Microsoft Azure Developer Certification. This helpful study guide contains a list of resources you can use to help you study for the exam.

Azure shows

Episode 267 - What the Hack? | The Azure Podcast

Microsoft Cloud Solution Architects Gino Filicetti and Peter Laudati talk to the Azure Podcast team about an innovative approach to getting your team to learn Azure. They have developed a set of challenge-based hacks which allow for better retention of knowledge.

Episode 267 - What the Hack? transcript

Third Party Azure IoT solution accelerators | Internet of Things Show

Several Microsoft partners have developed solutions ranging from edge video analytics, to digital signage, to remote well monitoring for oil and gas. These are published in our partners' GitHub repositories and are free for anyone to use, rebrand, or even resell. Here's how to leverage those partner-built, open-source solution accelerators to expedite your IoT solution development.

Using Azure Boards with GitHub | The DevOps Lab

As your organization and projects grow, it can get challenging to stay focused on what's most important and to organize the various types of work involved to make progress. Now you can integrate Azure Boards with your code repository on GitHub to reduce the integration tax of using multiple systems by simply mentioning work items in your commits or pull requests. See how to integrate Azure Boards with your GitHub project.

An overview of Azure Integration Services | Azure Friday

Azure Integration Services brings together API Management, Logic Apps, Service Bus, and Event Grid as a reliable, scalable platform for integrating on-premises and cloud-based applications, data, and processes across your enterprise.

Blockchain based registries | Block Talk

Registries are used in every industry and in multiple scenarios. Blockchain-based registries that are shared, immutable, and cryptographically secure serve an important need, but it's not always apparent how to write this sort of contract. In this episode we review a blockchain devkit accelerator that can help generate the contracts from simple JSON-based descriptions.

Application Insights integrations and service updates | On .NET

In this episode, Michael Milirud returns to give us updates on some new capabilities that are available in Azure Application Insights. He shows us demos covering Azure DevOps, dependency tracing, Azure Functions integration, and much more.

Inception with Azure DevOps | Visual Studio Toolbox

In this episode, Donovan is joined by Gopinath Chigakkagari from the Azure DevOps team. Gopinath shows how they use Azure DevOps to build Azure DevOps! He also shows how to integrate Azure DevOps to multiple 3rd party tools and deploy to multiple clouds with a single pipeline.

How to use the Azure Virtual Machines Serial Console | Azure Tips and Tricks

In this edition of Azure Tips and Tricks, learn how to use the Azure Virtual Machines Serial Console to easily troubleshoot your virtual machines. The Azure Virtual Machine Serial Console feature is available for Windows and Linux VM images.

Thumbnail from How to use the Azure Virtual Machines Serial Console on YouTube

How to configure a new virtual machine with the Azure Portal | Azure Portal Series

Microsoft Azure provides many virtual machine configuration options for any workload or application. In this video of the Azure Portal "How To" series, learn about some of the configuration options that are available when setting up a virtual machine in the Azure Portal.

Thumbnail from How to configure a new virtual machine with the Azure Portal

Scott Hunter on DevOps Capabilities in Azure - Episode 24 | The Azure DevOps Podcast

Learn the differences between .NET Core and .NET Framework and when and why you should move to .NET Core 3.0 in the future. In this episode of the Azure DevOps Podcast, Scott Hunter joins Jeffrey Palermo to discuss DevOps capabilities in Azure. Hear how .NET Standard bridges the gap between .NET Core and .NET Framework, and where all the different architectures fit into the .NET ecosystem. The two also give an update and overview on WebAssembly and Blazor, as well as a preview of, and their motivation for writing, their upcoming book, .NET DevOps for Azure.

Events

IoT in Action: New innovations making IoT faster and simpler

Several events are scheduled this week where you can learn more about IoT solutions: Mobile World Congress in Barcelona, the IoT in Action global event and Embedded World in Nuremberg, and the Solution Builder Conference in Houston. As the Internet of Things (IoT) disrupts global business across every industry, opportunities abound. Partners are building on Microsoft IoT innovations and expanding solution accelerators, while customers of every size are reaping the rewards through increased productivity and efficiency, new revenue streams, and broader market share. Learn how Microsoft and our partners are making IoT faster, easier, and more cost effective through innovations in Windows IoT, Azure IoT, and Azure Sphere.

Register Now: Free Hybrid Cloud Virtual Event

Join us on March 28, 2019, 8 AM-9:30 AM Pacific Time to be among the first to see new hybrid product announcements. Hear from your peers and technology leaders to gain valuable insights on ways to accelerate your hybrid cloud roadmap. Register now for free.

Thumbnail from Free Hybrid Cloud Virtual Event promo video

Azure webinar series - Migrate Your Web Applications to Azure for Scale and Agility

Thursday, February 28, 2019, 10:00 AM–11:00 AM Pacific Time - Learn the simple steps for modernizing a wide variety of web apps on Azure. Esteemed Microsoft engineer Jay Schmelzer shares implementation stories of how customers scaled with Azure and addressed performance and security considerations across their apps, including .NET, PHP, and Node.js. This series also features Q&A and a learning path for hosting your web apps on Azure.

Live stream analysis using Video Indexer

Video Indexer is an Azure service designed to extract deep insights from video and audio files offline. At the EBU Production Technology Seminar in Geneva last month, Microsoft demonstrated an end-to-end solution that uses Video Indexer in near real time on live feeds. Several live feeds were ingested into Azure using Dejero technology or the WebRTC protocol, and sent to the Make.TV Live Video Cloud to switch inputs. The selected input was sent as a transcoded stream to Azure Media Services for multi-bitrate transcoding and OTT delivery in low-latency mode. The same stream was also processed in near real time with Video Indexer. The full code and a step-by-step guide to deploying the results are available on GitHub.


Azure This Week - 22 February 2019 | A Cloud Guru - Azure This Week

This time on Azure This Week, Lars talks about using machine learning in Stream Analytics to detect evil doings, the new Azure Maps service, and the app the Azure Pipelines team has created for Slack.

Thumbnail from Azure This Week - 22 February 2019 from A Cloud Guru on YouTube

Announcing Azure Spatial Anchors for collaborative, cross-platform mixed reality apps


It’s amazing to look back at everything we’ve learned from our customers since we first released HoloLens. Across manufacturing, education, retail, gaming, and many other industries, developers and businesses are using mixed reality in their daily workflows and giving us feedback on what they’d like to see next. When we look across all the mixed reality solutions that customers have been building over the last few years, two things really stand out: collaboration and spatial awareness. Customers want to easily share their mixed reality experiences and place applications in the context of the real world, and thereby increase their efficiency and achieve greater productivity.

Yesterday at MWC Barcelona, we announced Azure Spatial Anchors, a mixed reality service that enables you to build a new generation of mixed reality applications that are collaborative, cross-platform, and spatially aware. Today, we’re sharing two application patterns gaining momentum across industries, and how Azure Spatial Anchors can help you deliver them with greater ease and speed.

Collaborative mixed reality experiences

Mixed reality enables us, as humans, to do more and to collaborate with those around us in a more natural and intuitive way. Whether it’s architects and site workers reviewing the day’s plans for a new construction project, designers and managers collaborating on next year’s car model, or a team of surgeons planning a procedure before operating, mixed reality has changed the way that humans now design, review, and learn together. Across industries, one of the top asks from our customers is to make it easier to share such experiences in mixed reality.

As one example, Pearson Education enables nursing students and professors to practice diagnosing and treating ill patients in 3D in the real world before the pressure of a real case. Students and professors may be using HoloLens devices, or they may be using mobile phones and tablets. Until today, sharing mixed reality experiences across devices and across platforms required either environmental setup (such as QR codes) or complex coding to handle the different sensors and endpoints. Azure Spatial Anchors provides a common coordinate frame for shared mixed reality experiences across HoloLens, iOS, and Android devices without any environmental setup needed. With Azure Spatial Anchors, everyone can collaborate in mixed reality, whether they are in a heads-up, hands-free experience on HoloLens devices, or they are participating via mobile phones and tablets.

Instructor wearing a HoloLens, holding and visualizing a holographic H2O molecule. Four students visualizing that molecule and a holographic CH4 molecule through three mobile tablets.

Connected devices and places

Sometimes, the best way to work on problems and find solutions isn’t in the traditional conference room; oftentimes, better, faster decisions can be made by seeing insights and data in the real-world context of the problem itself. For example, our customers in manufacturing want to be able to walk along the factory line and easily visualize the status of each machine in order to quickly navigate and focus on the equipment with issues. Until today, precisely mapping a large space and persisting that spatial understanding was not possible for most of our mixed reality customers. Azure Spatial Anchors is designed for this: connecting the right data to the right people in the right places, so people can work like they live—in 3D.

The Internet of Things (IoT) can make your connected solutions even stronger. Most IoT projects today start from a things-centric approach, but with Azure Digital Twins, we’ve flipped that around. We’ve found that customers realize huge benefits by first modeling the physical environment and then connecting existing or new devices (“things”) to that model. With Azure Spatial Anchors and Azure Digital Twins, customers gain new spatial intelligence capabilities and new insights into how spaces and infrastructure are really used. By visualizing IoT data onsite and in-context on HoloLens or mobile devices, people can uncover and respond to operational issues before they impact workstreams.

A man wearing a HoloLens in a factory visualizes and points to holographic data surrounding a factory engine, signaling the machine is “offline.”

Get started today!

Azure Spatial Anchors is in public preview today! We’re so excited for you to start building with us. Here are a few quick tips to get started:

We would love to see what you create, and we hope you share it with us via #Azure and #SpatialAnchors. We can’t wait to see what you build!

Announcing the general availability of Java support in Azure Functions


Azure Functions provides a productive programming model based on triggers and bindings for accelerated development and serverless hosting of event-driven applications. It enables developers to build apps using the programming languages and tools of their choice, with an end-to-end developer experience that spans from building and debugging locally, to deploying and monitoring in the cloud. Today, we're pleased to announce the general availability of Java support in Azure Functions 2.0!

Ever since we first released the preview of Java in Functions, an increasing number of users and organizations have leveraged the capability to build and host their serverless applications in Azure. With the help of input from a great community of preview users, we’ve steadily improved the feature by adding support for easier authoring experiences and a more robust hosting platform.

What's in the release?

With this release, Functions is now ready to support Java workloads in production, backed by our 99.95 percent SLA for both the Consumption Plan and the App Service Plan. You can build your functions based on Java SE 8 LTS and the Functions 2.0 runtime, while being able to use the platform (Windows, Mac, or Linux) and tools of your choice. This enables a wide range of options for you to build and run your Java apps in the 50+ regions offered by Azure around the world.

Powerful programming model

Using the unique programming model of Functions, you can easily connect your functions to cloud-scale data sources such as Azure Storage and Cosmos DB, and to messaging services such as Service Bus, Event Hubs, and Event Grid. Triggers and bindings enable you to invoke your function based on an HTTP request, a schedule, or an event in one of the aforementioned source systems. You can also retrieve information from or write back to these sources as part of your function logic, without having to worry about the underlying Java SDK.

Development and monitoring in Azure Functions

Easier development and monitoring

Using the Azure Functions Maven plugin, you can create, build, and deploy your functions from any Maven-enabled project. The open-source Functions 2.0 runtime enables you to run and debug your functions locally on any platform. For a complete DevOps experience, you can leverage the integration with Azure Pipelines or set up a Jenkins Pipeline to build your Java project and deploy it to Azure.

Setup a Jenkins Pipeline to build your Java project and deploy it to Azure

What is even more exciting is that popular IDEs and editors such as Eclipse, IntelliJ, and Visual Studio Code can be used to develop and debug your Java Functions.

Debug Functions in Visual Studio Code

One of the added benefits of building your serverless applications with Functions is that you automatically get access to rich monitoring experiences thanks to the Azure Application Insights integration for telemetry, querying, and distributed tracing.

Enterprise-grade serverless

Azure Functions also makes it easy to build apps that meet your enterprise requirements. Leverage features like App Service Authentication / Authorization to restrict access to your app, and protect secrets using managed identities and Azure Key Vault. Azure boasts a wide range of compliance certifications, making it a fantastic host for your serverless Java functions.

Next steps

To get started, take a closer look at what the experience of building event-driven Java apps with Azure Functions looks like by following the links below:

With so much being released now and coming soon, we’d sincerely love to hear your feedback. You can reach the team on Twitter and on GitHub. We also actively monitor Stack Overflow and UserVoice, so feel free to ask questions or leave your suggestions. We look forward to hearing from you!

Improving the TypeScript support in Azure Functions


TypeScript is becoming increasingly popular in the JavaScript community. Since Azure Functions runs Node.js, and TypeScript compiles to JavaScript, motivated users already could get TypeScript code up and running in Azure Functions. However, the experience wasn’t seamless, and things like our default folder structure made getting started a bit tricky. Today we’re pleased to announce a set of tooling improvements that improve this situation. Azure Functions users can now easily develop with TypeScript when building their event-driven applications!

For those unfamiliar, TypeScript is a superset of JavaScript which provides optional static typing, classes, and interfaces. These features allow you to catch bugs earlier in the development process, leading to more robust software engineering. TypeScript also indirectly enables you to leverage modern JavaScript syntax, since TypeScript is compatible with ECMAScript 2015.

With this set of changes to the Azure Functions Core Tools and the Azure Functions Extension for Visual Studio Code, Azure Functions now supports TypeScript out of the box! Included with these changes are a set of templates for TypeScript, type definitions, and npm scripts. Read on to learn more details about the new experience.

Templates for TypeScript

In the latest version of the Azure Functions Core Tools and the Azure Functions Extension for VS Code, you're given the option to use TypeScript when creating functions. To be more precise, when you create a new function app, you will now see the option to specify TypeScript during language stack selection. This opts you into default package.json and tsconfig.json files, setting your app up to be TypeScript compatible. After this, when creating a function, you will be able to select from a number of TypeScript-specific function templates. Each template represents one possible trigger, and there is a TypeScript equivalent for each template supported in JavaScript.

Select from a number of TypeScript specific function templates for different triggers

The best part of this new flow is that you don't have to take any actions unique to Functions to transpile and run TypeScript functions. For example, when you hit F5 to start debugging in Visual Studio Code, VS Code will automatically run the required installation tasks, transpile the TypeScript code, and start the Azure Functions host. This local development experience is best in class, and is exactly how you would start debugging any other app in VS Code.

Learn more about how to get your TypeScript functions up and running in our documentation.

Type definitions for Azure Functions

The @azure/functions package on npm contains type definitions for Azure Functions. Have you ever wondered what an Azure Function object is shaped like? Or maybe the context object that is passed into every JavaScript function? This package helps! To get the most out of TypeScript, it should be imported in every .ts function. JavaScript purists can benefit too – including this package in your code gives you a richer IntelliSense experience. Check out the @azure/functions package on npm to learn more!
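As an illustration, here is a minimal sketch of an HTTP-triggered .ts function using those type definitions; it is close in shape to what the HTTP trigger template generates, though the exact template contents may differ.

// A minimal HTTP-triggered function typed with the @azure/functions definitions.
import { AzureFunction, Context, HttpRequest } from "@azure/functions";

const httpTrigger: AzureFunction = async (context: Context, req: HttpRequest): Promise<void> => {
  context.log("HTTP trigger function processed a request.");
  const name = req.query.name || (req.body && req.body.name);

  // context.res describes the HTTP response sent back to the caller.
  context.res = name
    ? { status: 200, body: `Hello, ${name}` }
    : { status: 400, body: "Please pass a name on the query string or in the request body" };
};

export default httpTrigger;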

Npm scripts

Included by default in TypeScript function apps is a package.json file with a few simple npm scripts. These scripts allow Azure Functions to fit directly into your typical development flow by calling specific Azure Functions Core Tools commands. For instance, ‘npm start’ will automatically run ‘func start’, meaning that after creating a function app you don’t have to treat it differently than any other Node.js project.

To see these in action, check out our example repo!

Try it yourself!

With either the Azure Functions Core Tools or the Azure Functions Extension for VS Code, you can try out the improved experience for TypeScript in Azure Functions on your local machine, even if you don’t have an Azure account.

Next steps

As always, feel free to reach out to the team with any feedback on our GitHub or Twitter. Happy coding!


National Bank of Canada reinvents workplace, empowers employees with Microsoft 365

Bing Maps Autosuggest API Now Supports Business Suggestions in 8 New Countries


Bing Maps Autosuggest is part of the Bing Maps API offering, available both as part of the Bing Maps V8 Web Control and as a standalone REST API service. Autosuggest provides suggestions for roads, addresses, intersections, places, and businesses. This is very useful for companies whose customers type in addresses or locations: suggesting and auto-completing a business name or address assists user input and increases accuracy.

Today, we are announcing expanded support for business suggestions in eight new countries: Australia, Brazil, Canada, France, Germany, India, Italy, and Spain.

This functionality is available now in both the V8 Web Control and the REST API. Below, we illustrate these suggestions appearing in the web control. (Note: We limited the suggestions to business suggestions only using the AutosuggestOptions object for this example.)
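If you prefer the standalone REST service, a sketch along the following lines should work in TypeScript. Treat the parameter names and response shape as assumptions to verify against the Autosuggest REST documentation rather than as a definitive sample; it also assumes a runtime with a global fetch.

// Hedged sketch: call the Bing Maps Autosuggest REST endpoint and restrict results to businesses.
const BING_MAPS_KEY = process.env.BING_MAPS_KEY ?? "";

async function suggestBusinesses(query: string): Promise<unknown[]> {
  const params = new URLSearchParams({
    query,
    // Restrict suggestions to businesses (other entity types include Address and Place).
    includeEntityTypes: "Business",
    // Autosuggest weights results toward a user location: "latitude,longitude,radius".
    userLocation: "-33.8688,151.2093,5000",
    key: BING_MAPS_KEY,
  });

  const response = await fetch(`https://dev.virtualearth.net/REST/v1/Autosuggest?${params}`);
  const body: any = await response.json();
  // Suggestions are expected under resourceSets[0].resources[0].value in the REST response.
  return body.resourceSets?.[0]?.resources?.[0]?.value ?? [];
}

suggestBusinesses("pizza").then(console.log).catch(console.error);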

Australia

Bing Maps Autosuggest - Australia

Brazil

Bing Maps Autosuggest - Brazil

Canada

Bing Maps Autosuggest - Canada

France

Bing Maps Autosuggest - France

Germany

Bing Maps Autosuggest - Germany

India

Bing Maps Autosuggest - India

Italy

Bing Maps Autosuggest - Italy

Spain

Bing Maps Autosuggest - Spain


To learn more about Bing Maps Autosuggest API, check out the documentation.

- Bing Maps Team

Announcing Azure Monitor AIOps Alerts with Dynamic Thresholds


We are happy to announce that Metric Alerts with Dynamic Thresholds is now available in public preview. Dynamic Thresholds are a significant enhancement to Azure Monitor Metric Alerts. With Dynamic Thresholds you no longer need to manually identify and set thresholds for alerts. The alert rule leverages advanced machine learning (ML) capabilities to learn metrics' historical behavior, while identifying patterns and anomalies that indicate possible service issues.

Metric Alerts with Dynamic Thresholds are supported through a simple Azure portal experience, and they also support Azure workload operations at scale by allowing users to configure alert rules through the Azure Resource Manager (ARM) API in a fully automated manner.

Why and when should I apply Dynamic Thresholds to my metrics alerts?

Smart metric pattern recognition – A big pain point with setting static thresholds is that you need to identify patterns on your own and create an alert rule for each pattern. With Dynamic Thresholds, we use a unique ML technology to identify the patterns and come up with a single alert rule that has the right thresholds and accounts for seasonality patterns such as hourly, daily, or weekly. Let’s take the example of HTTP request rate. As you can see below, there is definite seasonality here. Instead of setting two or more different alert rules for weekdays and weekends, you can now have Azure Monitor analyze your data and come up with a single alert rule with Dynamic Thresholds that changes between weekdays and weekends.

Server request (Platform) graph

Scalable alerting – Wouldn’t it be great if you could automatically apply an alert rule on CPU usage to any virtual machine (VM) or application that you create? With Dynamic Thresholds, you can create a single alert rule that is then applied automatically to any resource you create. You don’t need to provide thresholds; the alert rule identifies the baseline for each resource and defines the thresholds automatically for you. With Dynamic Thresholds, you now have a scalable approach that will save a significant amount of time on the management and creation of alert rules.

Domain knowledge – Setting a threshold often requires a lot of domain knowledge. Dynamic Thresholds eliminates that need through its ML algorithms. Further, we have optimized the algorithms for common use cases such as CPU usage for a VM or request duration for an application, so you can have full confidence that the alert will capture any anomalies while still reducing noise for you.

Intuitive configuration – Dynamic Thresholds allow you to set up metric alert rules using high-level concepts, alleviating the need for extensive domain knowledge about the metric. Users only need to select the sensitivity for deviations (low, medium, high) and the boundaries (lower, higher, or both thresholds) based on the business impact of the alert, in either the UI or the ARM API.

Screenshot of intuitive configuration with Dynamic Thresholds

Dynamic Thresholds also allow you to configure the minimum number of deviations required within a certain time window before the system raises an alert; the default is four deviations in 20 minutes. You can tune this by changing the number of failing periods and the time window to control what you are alerted on.
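To give a feel for what this looks like when automated through the ARM API, here is a hedged sketch of a request body for a metric alert rule with Dynamic Thresholds. The resource IDs, metric, and property values are illustrative, so verify the exact schema against the metric alerts API documentation before using it.

// Hedged sketch of an ARM request body for a Dynamic Thresholds metric alert rule.
const dynamicThresholdAlertRule = {
  location: "global",
  properties: {
    description: "Alert when CPU usage deviates from its learned baseline",
    severity: 3,
    enabled: true,
    scopes: [
      "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Compute/virtualMachines/<vm>",
    ],
    evaluationFrequency: "PT5M",
    windowSize: "PT5M",
    criteria: {
      "odata.type": "Microsoft.Azure.Monitor.MultipleResourceMultipleMetricCriteria",
      allOf: [
        {
          criterionType: "DynamicThresholdCriterion",
          name: "HighCpuVsBaseline",
          metricNamespace: "Microsoft.Compute/virtualMachines",
          metricName: "Percentage CPU",
          operator: "GreaterThan",
          timeAggregation: "Average",
          alertSensitivity: "Medium", // low / medium / high
          failingPeriods: {
            numberOfEvaluationPeriods: 4, // look at the last four 5-minute periods (20 minutes)
            minFailingPeriodsToAlert: 4,  // require four deviations, the default mentioned above
          },
        },
      ],
    },
    actions: [],
  },
};

// The rule would be created with a PUT against the metric alerts resource, e.g.:
// https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<rg>
//   /providers/microsoft.insights/metricAlerts/<rule-name>?api-version=2018-03-01
console.log(JSON.stringify(dynamicThresholdAlertRule, null, 2));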

Setting number of violations to trigger the alert screenshot

Metric Alerts with Dynamic Thresholds are currently available for free during the public preview. To see the pricing that will take effect at general availability, visit our pricing page. To get started, please refer to the documentation, “Metric Alerts with Dynamic Thresholds in Azure Monitor (Public Preview).” We would love to hear your feedback! If you have any questions or suggestions, please reach out to us at azurealertsfeedback@microsoft.com.

Please note that Dynamic Threshold-based alerts are available for all Azure Monitor metric sources listed in the documentation, “Supported resources for metric alerts in Azure Monitor.”

MWC Barcelona 2019—Lenovo announces new devices geared towards Firstline Workers
