
Try the new System.Text.Json APIs


For .NET Core 3.0, we’re shipping a brand new namespace called System.Text.Json with support for a reader/writer, a document object model (DOM), and a serializer. In this blog post, I’m telling you why we built it, how it works, and how you can try it.

We also have a video:

Getting the new JSON library

  • If you’re targeting .NET Core: install the latest version of the .NET Core 3.0 preview. This gives you the new JSON library and the ASP.NET Core integration.
  • If you’re targeting .NET Standard or .NET Framework: install the System.Text.Json NuGet package. In order to get the integration with ASP.NET Core, you must target .NET Core 3.0.

The future of JSON in .NET Core 3.0

JSON has become an essential part of virtually all modern .NET applications and in many cases even surpassed the usage of XML. However, .NET hasn’t had a (great) built-in way to deal with JSON. Instead, we’ve relied on Json.NET until now, which continues to serve the .NET ecosystem well.

We’ve decided that we needed to build a new JSON library, with the following goals:

  • Provide high-performance JSON APIs. We needed a new set of JSON APIs that are highly tuned for performance by using Span<T> and can process UTF-8 directly without having to transcode to UTF-16 string instances. Both aspects are critical for ASP.NET Core, where throughput is a key requirement. We considered contributing changes to Json.NET, but this was deemed close to impossible without either breaking existing Json.NET customers or compromising on the performance we could achieve. With System.Text.Json, we were able to achieve a 1.3x – 5x speedup, depending on the scenario (see below for more details). And we believe we can still squeeze out more.
  • Remove Json.NET dependency from ASP.NET Core. Today, ASP.NET Core has a dependency on Json.NET. While this provides a tight integration between ASP.NET Core and Json.NET, it also means the version of Json.NET is dictated by the underlying platform. However, Json.NET is frequently updated and application developers often want to — or even have to — use a specific version. Thus, we want to remove the Json.NET dependency from ASP.NET Core 3.0, so that customers can choose which version to use, without fearing they might accidentally break the underlying platform.
  • Provide an ASP.NET Core integration package for Json.NET. Json.NET has basically become the Swiss Army knife of JSON processing in .NET. It provides many options and facilities that allow customers to handle their JSON needs with ease. We don’t want to compromise on the Json.NET support customers are getting today. For example, the ability to configure the JSON serialization in ASP.NET Core via the AddJsonOptions extension method. Thus, we want to provide the Json.NET integration for ASP.NET Core as a NuGet package that developers can optionally install, so they get all the bells and whistles they get from Json.NET today. The other part of this work item is to ensure we have the right extension points so that other parties can provide similar integration packages for their JSON library of choice.

For more details on the motivation and how it relates to Json.NET, take a look at the announcement we made back in October.

Using System.Text.Json directly

For all the samples, make sure you import the following two namespaces:
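
// All of the samples below assume these two namespaces are imported.
using System.Text.Json;
using System.Text.Json.Serialization;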

Using the serializer

The System.Text.Json serializer can read and write JSON asynchronously and is optimized for UTF-8 text, making it ideal for REST API and back-end applications.

By default, we produce minified JSON. If you want to produce something that is human readable, you can pass in an instance of JsonSerializerOptions to the serializer. This is also the way you configure other settings, such as handling of comments, trailing commas, and naming policies.
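
For example, here is a minimal sketch (the WeatherForecast type and its values are just an illustration):

class WeatherForecast
{
    public DateTimeOffset Date { get; set; }
    public int TemperatureC { get; set; }
    public string Summary { get; set; }
}

var forecast = new WeatherForecast
{
    Date = DateTimeOffset.UtcNow,
    TemperatureC = 25,
    Summary = "Hot"
};

// Produce indented (human readable) JSON instead of the minified default.
var options = new JsonSerializerOptions { WriteIndented = true };
string json = JsonSerializer.Serialize(forecast, options);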

Deserialization works similarly:
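
// A minimal sketch, reusing the WeatherForecast type and the JSON text from the sample above.
WeatherForecast roundTripped = JsonSerializer.Deserialize<WeatherForecast>(json);
Console.WriteLine(roundTripped.Summary);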

We also support asynchronous serialization and deserialization:
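
// A rough sketch that streams to and from a file; run this inside an async method.
// The file name is arbitrary, and FileStream lives in System.IO.
using (FileStream stream = File.Create("weather.json"))
{
    await JsonSerializer.SerializeAsync(stream, forecast);
}

using (FileStream stream = File.OpenRead("weather.json"))
{
    forecast = await JsonSerializer.DeserializeAsync<WeatherForecast>(stream);
}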

You can also use custom attributes to control serialization behavior, for example, ignoring properties and specifying the name of the property in the JSON:
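
// A sketch of what that can look like; the JSON property names here are just examples.
class WeatherForecastWithAttributes
{
    [JsonPropertyName("date")]
    public DateTimeOffset Date { get; set; }

    [JsonPropertyName("temp_celsius")]
    public int TemperatureC { get; set; }

    // Skipped entirely during serialization and deserialization.
    [JsonIgnore]
    public string InternalNotes { get; set; }
}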

We currently don’t have support for F#-specific behaviors (such as discriminated unions and record types), but we plan to add this in the future.

Using the DOM

Sometimes you don’t want to deserialize a JSON payload, but you still want structured access to its contents. For example, let’s say we have a collection of temperatures and want to average out the temperatures on Mondays:
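
// A rough sketch, assuming the payload is an array of readings with "date" and "temp" properties.
string temps = "[{\"date\":\"2019-06-17T00:00:00Z\",\"temp\":21.5}," +
               "{\"date\":\"2019-06-18T00:00:00Z\",\"temp\":19.0}]";

double sum = 0;
int count = 0;

using (JsonDocument document = JsonDocument.Parse(temps))
{
    foreach (JsonElement reading in document.RootElement.EnumerateArray())
    {
        // 2019-06-17 was a Monday.
        if (reading.GetProperty("date").GetDateTimeOffset().DayOfWeek == DayOfWeek.Monday)
        {
            sum += reading.GetProperty("temp").GetDouble();
            count++;
        }
    }
}

if (count > 0)
{
    Console.WriteLine($"Average temperature on Mondays: {sum / count}");
}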

The JsonDocument class allows you to access the individual properties and values quite easily.

Using the writer

The writer is straightforward to use:
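
// A minimal sketch that writes into an in-memory buffer.
// ArrayBufferWriter<byte> lives in System.Buffers; Encoding lives in System.Text.
var buffer = new ArrayBufferWriter<byte>();

using (var writer = new Utf8JsonWriter(buffer))
{
    writer.WriteStartObject();
    writer.WriteString("date", DateTimeOffset.UtcNow);
    writer.WriteNumber("temp", 42);
    writer.WriteEndObject();
    writer.Flush();
}

// The writer produced UTF-8 bytes; transcode only for display.
Console.WriteLine(Encoding.UTF8.GetString(buffer.WrittenSpan));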

Using the reader

The reader requires you to switch on the token type:
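
// A minimal sketch over the JSON text produced earlier. The reader works on UTF-8 bytes
// and is a ref struct, so this loop runs synchronously.
byte[] data = Encoding.UTF8.GetBytes(json);
var reader = new Utf8JsonReader(data);

while (reader.Read())
{
    switch (reader.TokenType)
    {
        case JsonTokenType.PropertyName:
        case JsonTokenType.String:
            Console.WriteLine(reader.GetString());
            break;
        case JsonTokenType.Number:
            Console.WriteLine(reader.GetDouble());
            break;
        // Handle other token types (StartObject, StartArray, ...) as needed.
    }
}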

Integration with ASP.NET Core

Most use of JSON inside of ASP.NET Core is provided via the automatic serialization when accepting or returning object payloads, which in turn means that most of your application’s code is agnostic to which JSON library ASP.NET Core is using. That makes it easy to switch from one to another.

You can see the details on how to enable the new JSON library in MVC and SignalR later in this post.

Integration with ASP.NET Core MVC

In Preview 5, ASP.NET Core MVC added support for reading and writing JSON using System.Text.Json. Starting with Preview 6, the new JSON library is used by default for serializing and deserializing JSON payloads.

Options for the serializer can be configured using MvcOptions:
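
// In Startup.ConfigureServices. Note: the option property names shifted during the 3.0 previews;
// this sketch uses the shape that shipped with ASP.NET Core 3.0.
services.AddControllers()
    .AddJsonOptions(options =>
    {
        // For example, produce indented JSON from MVC responses.
        options.JsonSerializerOptions.WriteIndented = true;
    });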

If you’d like to switch back to the previous default of using Newtonsoft.Json, do the following:

  1. Install the Microsoft.AspNetCore.Mvc.NewtonsoftJson NuGet package.
  2. In ConfigureServices() add a call to AddNewtonsoftJson()
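
Putting the two steps together, the ConfigureServices change might look like this:

// Requires the Microsoft.AspNetCore.Mvc.NewtonsoftJson package.
services.AddControllers()
    .AddNewtonsoftJson();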

Known issues

  • Support for OpenAPI / Swagger when using System.Text.Json is ongoing and unlikely to be available as part of the 3.0 release.

Integration with SignalR

System.Text.Json is now the default Hub Protocol used by SignalR clients and servers starting in ASP.NET Core 3.0 Preview 5.

If you’d like to switch back to the previous default of using Newtonsoft.Json, then you can do so on both the client and server.

  1. Install the Microsoft.AspNetCore.SignalR.Protocols.NewtonsoftJson NuGet package.
  2. On the client add .AddNewtonsoftJsonProtocol() to the HubConnectionBuilder:
  3. On the server add .AddNewtonsoftJsonProtocol() to the AddSignalR() call:
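
For example (the hub URL below is just a placeholder):

// Client (step 2): requires the Microsoft.AspNetCore.SignalR.Protocols.NewtonsoftJson package
// and the SignalR .NET client.
var connection = new HubConnectionBuilder()
    .WithUrl("https://example.com/chathub")
    .AddNewtonsoftJsonProtocol()
    .Build();

// Server (step 3): in Startup.ConfigureServices.
services.AddSignalR()
    .AddNewtonsoftJsonProtocol();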

Performance

Since this feature is heavily motivated by performance, we’d like to share some high-level performance characteristics of the new APIs.

Please keep in mind that these are based on preview builds and the final numbers will most likely differ. We’re also still tweaking default behaviors which will affect performance (for example, case sensitivity). Please note that these are micro benchmarks. Your mileage will most certainly differ, so if performance is critical for you, make sure to make your own measurements for scenarios that best represent your workload. If you encounter scenarios you’d like us to optimize further, please file a bug.

Raw System.Text.Json

Just doing micro benchmarks to compare System.Text.Json with Json.NET yields the following output:

Scenario              | Speed            | Memory
Deserialization       | 2x faster        | Parity or lower
Serialization         | 1.5x faster      | Parity or lower
Document (read-only)  | 3-5x faster      | ~Allocation free for sizes < 1 MB
Reader                | 2-3x faster      | ~Allocation free (until you materialize values)
Writer                | 1.3-1.6x faster  | ~Allocation free

 

System.Text.Json in ASP.NET Core MVC

We’ve written an ASP.NET Core app that generates data on the fly that is then serialized and deserialized from MVC controllers. We then varied the payload sizes and measured the results:

JSON deserialization (input)

Description               | RPS     | CPU (%) | Memory (MB)
Newtonsoft.Json – 500 B   | 136,435 | 95      | 172
System.Text.Json – 500 B  | 167,861 | 94      | 169
Newtonsoft.Json – 2.4 KB  | 97,137  | 97      | 174
System.Text.Json – 2.4 KB | 132,026 | 96      | 169
Newtonsoft.Json – 40 KB   | 7,712   | 88      | 212
System.Text.Json – 40 KB  | 16,625  | 96      | 193

 

JSON serialization (output)

Description               | RPS     | CPU (%) | Memory (MB)
Newtonsoft.Json – 500 B   | 120,273 | 94      | 174
System.Text.Json – 500 B  | 145,631 | 94      | 173
Newtonsoft.Json – 8 KB    | 35,408  | 98      | 187
System.Text.Json – 8 KB   | 56,424  | 97      | 184
Newtonsoft.Json – 40 KB   | 8,416   | 99      | 202
System.Text.Json – 40 KB  | 14,848  | 98      | 197

 

For the most common payload sizes, System.Text.Json offers about 20% throughput increase in MVC during input and output formatting with a smaller memory footprint.

Summary

In .NET Core 3.0, we’ll ship the new System.Text.Json APIs, which provide built-in support for JSON, including reader/writer, read-only DOM, and serializer/deserializer. The primary goal was performance and we see typical speedups of up to 2x over Json.NET, but it depends on your scenario and your payload, so make sure you measure what’s important to you.

ASP.NET Core 3.0 includes support for System.Text.Json, which is enabled by default.

Give System.Text.Json a try and send us feedback!

{"happy": "coding!"}

The post Try the new System.Text.Json APIs appeared first on .NET Blog.


Visual Studio 2019 for Mac version 8.1 is now available (and a Preview for 8.2)


Today we are excited to announce the release of Visual Studio 2019 for Mac version 8.1 along with the first preview of Visual Studio 2019 for Mac version 8.2. Both releases contain exciting new features as well as improvements to performance and stability across the IDE. You can download the latest update on the Visual Studio download page or update an existing installation via the Updater within Visual Studio for Mac. You can find release notes for both the stable and preview releases on our Release Notes page. We’ve also updated our Roadmap to give you a look at what’s coming over the next 3 months in versions 8.2 and beyond – including updated web editors, support for projects using multiple target frameworks, and solution-level NuGet package management!

Visual Studio 2019 for Mac version 8.1 

Version 8.1 continues to expand on the improvements we’ve brought to Visual Studio for Mac with new features added to the C# code editor as well as new ASP.NET Core templates.  

The new editor, which was initially announced as a preview feature in Visual Studio 2019 for Mac version 8.0, is now the default editing experience for C# files. The new editor features an updated, fully native UI built on top of the reliable Visual Studio foundation. You’ll find several additional features that close the gap between the old and new editor in addition to some that were explicitly requested by the Visual Studio for Mac user community. These features include:

  • Faster code generation through code snippets 
  • Improved analyzer functionality with inline lightbulbs 
  • Improved multicaret functionality 
  • Quick navigation via Document Outline and an improved Go to Line implementation  
  • Faster editing with drag and drop across views 

One of the most requested features from our users was the ability to utilize code snippets within a C# file. They’re a great way to quickly add blocks of common code to your project, such as properties and constructors, by leveraging the power of IntelliSense. Visual Studio for Mac now contains dozens of built-in snippets and allows you to easily expand the default library with custom ones using the Code Snippets editor within Preferences.  

Another feature we’ve added to help increase your coding efficiency is inline lightbulbs for C# projects. For cases where you’re looking for alternative code suggestions or trying to find that code error that stops your code from building, inline lightbulbs help you refine your code and quickly detect errors in real time. All it takes to apply a suggested fix is a few clicks of the mouse, allowing you to quickly get back to coding. And because the logic powering these suggestions is shared with Visual Studio on Windows, you’ll always be up to date and have support for the latest analyzers.

The latest version also brings improved navigation and editing shortcuts, such as the Document Outline and a brand-new Go to Line implementation. The new editor also supports the ability to drag and drop code across files within your project. All these improvements have been implemented with your improved productivity as our primary focus, empowering you to write better code, faster. 

In addition to the new C# editor features, you can now edit Android layout files side-by-side with a real-time designer view. This experience allows you to see changes you make to an Android layout file on the fly as you are editing, greatly improving the efficiency while editing these files. 

Finally, we’ve introduced a handful of improvements to performance and stability, updated ASP.NET Core project templates, and refined the component selection process at install based on your feedback.

Visual Studio 2019 for Mac version 8.2 Preview 

As we mentioned earlier in this post, today we’re also releasing the first preview of Visual Studio 2019 for Mac version 8.2. In order to use the 8.2 Preview, you can easily switch to the Preview channel within Visual Studio for Mac. 

Improved XAML Editing Experience 

The 8.2 Preview release introduces new XAML and AXML editing experiences based on the same core as the new C# editor as well as the XAML language service from Visual Studio on Windows. These components provide improved XAML and AXML editing experiences, including more accurate and powerful IntelliSense, faster performance, better matching, improved linting, and an overall increase in the reliability of the editing experience. You can enable the new XAML editor via Visual Studio > Preferences > Text Editor > Behavior > XAML and selecting “Enable new XAML language service”.

One of the advantages of the new XAML Language Service with Visual Studio for Mac is that you now have access to much improved matching capabilities. For example, completions now support fuzzy matching, substring matching, and CamelCase matching, reducing the time it takes to find keywords without the need to perfectly match casing. Examples of the new matching patterns supported in the XAML editor include: 

  • Substring Matching: Matches will be listed when you type a part of a string, even if it is in the middle of the string. This is a great feature if you recall a section of a command, but not the entire command. Typing “Lay” will match “StackLayout” along with any other string which contains “lay”. 
  • Case Insensitive Matching: If you can’t recall the exact casing of a string you’re trying to find, case insensitive matching will ensure you can still find what you’re looking for. With support for this kind of matching, typing “stack” will match to “StackLayout”.
  • Fuzzy Matching: Typing any portion of a string will provide a list of exact and near matches. If you type “stck”, StackLayout will still appear as an option.

You’ll also see improvements to suggested completions in a wide variety of scenarios for XAML, including light bulb suggestions and binding, resource, XMLNS, and markup extension completion, allowing you to write code faster and more accurately. 

.NET Core 3 and C# 8 Preview 

At Build 2019, we highlighted the next evolution in the .NET Core ecosystem: .NET Core 3. With it comes improved performance, streamlined project files, and an updated deployment experience. You can learn more about the new features and fixes offered in .NET Core 3 through the What’s New documentation.

In addition to .NET Core 3, Visual Studio for Mac 8.2 also offers a preview of support for C# 8. This next generation update of C# continues to improve upon the solid foundation of C# through the integration of language features to help you write powerful code in less time. You can learn more about some of the new features in C# 8 through our documentation. 

Improved Android XML editing experience 

Android layout and resource file editing is now powered by Visual Studio for Mac’s new editor. This means that you will be able to experience all of the rich editing experiences and performance that you see in the C# editor in your Android layout files. These enhancements include improved IntelliSense, go-to-definition, and semantic editing of your files, all within Visual Studio for Mac.

Download and try today 

We encourage you to download and try out the version 8.1 release today. We hope this release brings us one step closer to our goal of providing you with the right tools to make .NET development on macOS a breeze.

We also invite you to try the 8.2 Preview 1 release of Visual Studio 2019 for Mac if you’re interested in benefiting from the new XAML editor and in helping us build a better product by providing feedback and suggestions. As always, if you come across any bugs or issues, please use the Report a Problem feature to help us to improve this new experience and make each release as powerful and reliable as possible.

The post Visual Studio 2019 for Mac version 8.1 is now available (and a Preview for 8.2) appeared first on The Visual Studio Blog.

Announcing Entity Framework Core 3.0 Preview 6 and Entity Framework 6.3 Preview 6


New previews of the next versions of EF Core and EF 6 are now available on NuGet.Org.

What is new in EF Core 3.0 Preview 6

In recent months, a lot of our efforts have been focused on a new LINQ implementation for EF Core 3.0. Although the work isn’t complete and a lot of the intended functionality hasn’t been enabled, before preview 6 we reached a point in which we couldn’t make more progress without integrating the new implementation into the codebase in the main branch.

Query changes

IMPORTANT: Although, as always, we want to encourage you to experiment with our preview bits in a controlled environment and to provide feedback, preview 6 has significant limitations in the LINQ implementation that we expect to affect any application that performs all but the most trivial queries. Given this, we explicitly recommend against updating any applications you have in production to this preview.

While some of the limitations are caused by intentional breaking changes, a lot more are temporary issues we expect to fix before RTM.

These are some of the main things to know:

  • Temporary limitation: In-memory database and Cosmos DB providers aren’t functional in this preview: In the initial phase of the switch to the new implementation, we have prioritized getting our relational providers working. Functionality with in-memory database and Cosmos DB providers is broken, and we recommend you skip preview 6 if you have code that depends on these providers. We expect to gradually restore functionality in subsequent previews.
  • Temporary limitation: Several areas of query translation aren’t working with relational databases: Queries that use any of these constructs will very likely fail to translate correctly or execute:
    • Owned types
    • Collection references in projections
    • GroupBy operator
    • Equality comparisons between entities
    • Query tags
    • Global query filters
  • Intentional breaking change: LINQ queries are no longer evaluated on the client: This is actually one of the main motivations we had for building a new LINQ implementation into EF Core 3.0. Before this version of EF Core, expressions in the query that could not be translated for SQL would be automatically evaluated on the client, regardless of their location in the query. This contributed to hard to predict performance issues, especially when expressions used in predicates were not translated and large amounts of data ended up being filtered on the client, and also caused compatibility problems every time we introduced new translation capabilities. In the new implementation, we only support evaluating on the client expressions in the top-level projection of the query. Read the full description of this breaking change in our documentation.
  • Intentional breaking change: Existing FromSql overloads have been renamed to FromSqlRaw and FromSqlInterpolated, and can only be used at the root of queries: Read more details about it in the breaking change documentation.

If you run into any issues that you don’t see in this list or in the list of bugs already tracked for 3.0, please report it on GitHub.

If you hit limitations only in a few queries, here are some workarounds that you might be able to use to get things working:

  • Switch explicitly to client evaluation if you need to filter data based on an expression that cannot be translated to SQL, using the AsEnumerable() or ToList() extension methods. For example, the following query will no longer work in EF Core 3.0 because one of the predicates in the where clause requires client evaluation:
    var specialCustomers = context.Customers
      .Where(c => c.Name.StartsWith(n) && IsSpecialCustomer(c));

    But if you know it is ok to perform part of the filtering on the client, you can rewrite this as:
    var specialCustomers = context.Customers
      .Where(c => c.Name.StartsWith(n))
      .AsEnumerable()
      .Where(c => IsSpecialCustomer(c));
  • Use the new FromSqlRaw() or FromSqlInterpolated() methods to provide your own SQL translations for anything that isn’t yet supported (see the sketch after this list).
  • Skip preview 6 and stay on preview 5 until more issues are fixed, or try our nightly builds.
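
For example, a minimal sketch of the FromSqlRaw() workaround (the table and column names are only illustrative):

// The {0} placeholder is turned into a database parameter, so this is not vulnerable to SQL injection.
var specialCustomers = context.Customers
    .FromSqlRaw("SELECT * FROM Customers WHERE Name LIKE {0}", n + "%")
    .ToList();

FromSqlInterpolated() works the same way, but takes an interpolated string and converts the interpolated values into parameters.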

Switch to Microsoft.Data.SqlClient

As announced recently, development of the ADO.NET provider for SQL Server has moved to this new package. The EF Core provider for SQL Server now uses the new package to connect to SQL Server databases.

Please continue to create issues for the EF Core layer (for example SQL generation) on the EF Core issue tracker. Any issues related to SQL Server database connectivity are better reported to the SqlClient issue tracker.

DbContext scaffolding improvements

We now have support for:

  • Scaffolding entities without keys.
  • Scaffolding entities from database views.
  • Scaffolding DbContext from an Azure SQL Data Warehouse.
  • A new dotnet ef dbcontext script command to generate the SQL script equivalent to calling EnsureCreated().
  • New Package Manager Console command functionality in Get-DbContext for listing DbContext types available in an application.

Thank you very much @ErikEJ for all these contributions! You rock!

What’s new in EF 6.3 Preview 6

Here are some of the main changes since preview 5:

  • We automatically locate System.Data.SqlClient when it isn’t registered with DbProviderFactories: This means it’s no longer necessary to register SqlClient as a workaround before you start using EF.
  • EF 6 tests are passing on .NET Core: We made infrastructure updates necessary to get our existing tests running on .NET Core. This has allowed us to identify and fix problems in the product code as well as an issue that was fixed in .NET Core 3.0 Preview 6. There is a known issue with the translation of String methods like StartsWith, EndsWith and Contains that wasn’t fixed on time for Preview 6.

What’s next

For upcoming previews and the rest of the EF Core 3.0 release, our main focus will be on restoring and improving the query functionality using our new LINQ implementation.

We are also working to enable more aspects of the experience you would expect from using EF 6.3 on .NET Core. For example, we are making additional changes to simplify working when an application .config file isn’t present, enabling embedding metadata from an EDMX into the compilation output on .NET Core, and reimplementing the EF migration commands in a way that is compatible with .NET Core.

Weekly status updates

If you’d like to track our progress more closely, we now publish weekly status updates to GitHub. We also post these status updates to our Twitter account, @efmagicunicorns.

Thank you

As always, thank you for trying our preview bits, and thanks to all the contributors that helped in this preview!

The post Announcing Entity Framework Core 3.0 Preview 6 and Entity Framework 6.3 Preview 6 appeared first on .NET Blog.

Top Stories from the Microsoft DevOps Community – 2019.06.14


I’m back home this week in England – and after a week in sunny Raleigh-Durham, North Carolina (and before that, a week in even sunnier New Orleans), the cold rain has me stuck inside in the office. But it’s not all bad, as that gives me an opportunity to catch up on some of the great articles written by the Azure DevOps community:

Global DevOps Bootcamp
The Global DevOps Bootcamp is this weekend! Global DevOps Bootcamp is incredible, with DevOps thought leaders around the world coming together to show you how to build modern apps using continuous delivery and DevOps practices. Are you ready? Have you signed up for one of the events near you?

Azure Pipelines – Git checkout step fails with cannot lock ref error
One of git’s leaky abstractions is that it sometimes uses the filesystem to store branch names. That means that you can have different behavior – and even errors – if you mix capitalization of branch names on a case insensitive platform like Windows. Utkarsh Shigihalli explains the problem, how to fix it and how to configure Azure Repos so that you never have the problem again.

End-to-end Pull Request on Azure DevOps
Are you using all the features of pull requests in Azure Repos? Olivier Léger provides an incredibly thorough deep dive into them, including how to set up policies, tying them in with pipelines, merging them, build triggers, and more.

Azure DevOps: Build pipelines for Arduino microcontrollers
It’s clear that whether you’re building web apps, desktop apps or mobile apps, you should have a modern build and release pipeline for maximum productivity and safety. But what about something a little more exotic? Roberto Peña shows you how to create a pipeline for an Arduino device.

Workspace management tips for TFVC
Using TFVC instead of Git? No problem, a centralized version control system is the best choice for many workflows. But are you using it as effectively as you could? Jessie Houwing explains how to set up individual workspaces and best practices for using them for maximum efficiency.

As always, if you’ve written an article about Azure DevOps or find some great content about DevOps on Azure then let me know! I’m @ethomson on Twitter.

The post Top Stories from the Microsoft DevOps Community – 2019.06.14 appeared first on Azure DevOps Blog.

Fun with R and the Noops


Earlier this week, Github introduced Noops, a collection of simple black-box machines with API endpoints, with the goal of challenging developers of all skill levels to solve problems with them. Five "Noops" machines have been released so far along with challenges suitable for beginner programmers, with 15 further machines (and some more complex challenges) to be released over the coming three weeks. The bots available now are:

  • Hexbot, which generates random RGB color strings and x/y coordinates
  • Vexbot, which generates random endpoint coordinates of (possibly-connected) line segments
  • Directbot, which will generate sequences of LOGO-like movements, either at random or according to a pattern you specify
  • Drumbot, which generates commands for various pre-defined drumbeats
  • Fizzbot, which tests your knowledge of classic programmer interview questions

Descriptions of the Noops and their APIs are provided on GitHub, along with sample code in Javascript and/or Python. However, as long as you can connect to the internet, you can use any language you like to solve the challenges. 

R is an ideal language for interfacing with these bots. The httr package for R makes it easy to interact with APIs using the GET operation, which is also what you use to control most Noops. (Fizzbot also uses the POST verb.) All I need to do is to load the package, and define the endpoint URL for Hexbot. And the data from the bots lends itself to graphical visualization, which is also easy to do with R. So I thought it would be fun to take Hexbot for a spin with R.

Calling the endpoint without any parameters generates a single RGB color specification (for example: #95DDFA), and this is easily done using the GET function from the httr package. Here, I define a function random.color to return a random color generated by Hexbot as a single R character. Luckily, you can pass RGB strings as colors to the base graphics functions in R, so I can use the barplot function to generate a block of random color:

Onecolor

Hexbot can generate more than one color at once, if you pass in the count parameter. For this, we just need to use the query option to the GET function. Here, we define a function to generate n random colors and extract the results as a vector, which we can then display with the barplot function once more:

Manycolors
You can also provide multiple parameters to the endpoint: if you also provide width and height, it will return random x and y locations within those ranges, along with the random colors. This adds some additional complexity when it comes to extracting the data from the payload, but with that done we can display the results as a scatterplot:

Noops-scatter
Of course, this is all rather pointless, in the sense that I could trivially use R itself to generate random points and colors. But still, it's a useful exercise to work through this yourself if you're not familiar with interacting with HTTP APIs. You'll learn several things along the way:

  • about the HTTP GET operation and how to provide query parameters;
  • how to navigate the nested lists that httr creates from the JSON content payload; and
  • how to unroll those nested lists to extract the data you need in R.

The Noops yet to be revealed promise to offer more complex challenges, so I expect we'll learn about other aspects of HTTP APIs when they're to be released in the coming weeks as well.

Github: Meet the Noops


Bootstrapping Azure DevOps extensions with Yeoman


Azure DevOps has evolved over the last several years, which has resulted in many improvements to the extension landscape. Many engineering teams use this extension functionality heavily to customize their development environment exactly how they want it. When people first get started building an extension, they often deploy the extension to the Visual Studio Marketplace and use the browser’s built-in debugging tools. Although this approach works, we wanted to demonstrate a faster way to debug right in Visual Studio Code using hot reload.

The azure-devops-extension-hot-reload-and-debug repo already demonstrates this scenario, but we also wanted to offer developers an easy way to bootstrap their own projects. Our Yeoman generator will create a new, clean extension project for you and handle the details of filling in the required config files with the correct values.

Although the TFX CLI offers a command (tfx extension init) that will bootstrap an extension project, it does not contain the necessary bits for this hot reload and debugging scenario.

How to generate a new project

  1. If you already have Node.js installed, you can use npm to install Yeoman:

    npm install -g yo

  2. Yeoman is just the scaffolding engine so it doesn’t know how to generate an Azure DevOps extension project. We will need to install our generator that knows the specifics of how the extension should be set up:

    npm install -g @microsoft/generator-azure-devops-extension

  3. Now that we have the Yeoman engine and the Azure DevOps extension generator installed we can use Yeoman to run the generator (we can leave off the package’s generator- prefix because Yeoman knows to only look for packages with this prefix):

    yo @microsoft/azure-devops-extension

  4. The generator will ask you to fill in several parameters:

    • Extension ID: The unique ID that you want to publish your extension under.
    • Extension name: The name that users will see in the Visual Studio Marketplace and Azure DevOps.
    • Extension description: A short description that users will see in the Visual Studio Marketplace and Azure DevOps.
    • Extension publisher ID: The Visual Studio Marketplace publisher ID under which you want to publish the extension. To learn more see Package and publish extensions.

The result

The generator creates a new folder with the following folder structure (the folder’s name will be the extension ID you chose during creation):

.
├── .eslintrc.js
├── .gitignore
├── .vscode
│   └── launch.json
├── README.md
├── configs
│   ├── dev.json
│   └── release.json
├── img
│   └── world.png
├── package.json
├── src
│   └── hub
│       ├── hub.html
│       ├── hub.scss
│       └── hub.tsx
├── tsconfig.json
├── vss-extension.json
└── webpack.config.js

What’s next?

Now that you have generated a new project, you are ready to start debugging. Refer to the generated readme in your new project for instructions on how to get started. You should also check out our blog post for an in-depth look at how these features work.

The post Bootstrapping Azure DevOps extensions with Yeoman appeared first on Azure DevOps Blog.

How to Upgrade to Typescript Without Anybody Noticing, Part 1


This guide will show you how to upgrade to TypeScript without anybody noticing. Well, people might notice — what I really mean is that you won’t have to change your build at all. You’ll have the ability to get errors and completions in supported editors and to get errors on the command line from tsc, the TypeScript compiler, but you won’t have to integrate TypeScript into your build.

Here’s what you’ll actually need to check in to source control:

  1. A TypeScript configuration file, tsconfig.json.
  2. New dev dependencies from the @types package.
  3. A TypeScript declaration file to hold miscellaneous types.

How can you upgrade with so little change? Well, the secret is that you’re not using TypeScript. The TypeScript compiler can check Javascript just fine, so you can stick with Javascript and use JSDoc to provide type information. This is less convenient than TypeScript’s syntax for types, but it means that your files stay plain old Javascript and your build (if any) doesn’t change at all.

Let’s use TypeScript-eslint-parser as an example package so you can follow along if you want. Confusingly, even though the name includes “TypeScript”, the parser is actually written in Javascript, although it has since been merged into a larger project that is written in TypeScript.

This guide assumes that you have used TypeScript enough to:

  1. Have an editor set up to work with TypeScript.
  2. Have used npm to install a package.
  3. Know the basic syntax of type annotations: number, { x: any }, etc.

If you want to look at the package after the upgrade, you can run the following commands, or take a look at the branch on github:

git clone https://github.com/sandersn/TypeScript-eslint-parser
cd TypeScript-eslint-parser
git checkout add-tsconfig
npm install

Also make sure that you have TypeScript installed on the command line:

npm install -g typescript

 

Add tsconfig

Your first step is to start with tsc --init and change the settings in the tsconfig.json that it produces. There are other ways to get started, but this gives you the most control. Run this command from the root of the project:

tsc --init

Here is what you should end up with, skipping the lines you don’t need to change:

{
  "compilerOptions": {
    "allowJs": true,
    "checkJs": true,
    "noEmit": true,
    "target": "esnext",
    "module": "commonjs",
    "resolveJsonModule": true,
    "strict": false
  },
  "exclude": [
    "tests/fixtures/",
    "tests/integration/"
  ]
}

For Javascript, your “compilerOptions” will pretty much always look like this.

  • allowJs — Compile JS files.
  • checkJs — Give errors on JS files.
  • noEmit — Don’t emit downlevel code; just give errors.
  • target — Target the newest version of EcmaScript since we’re not emitting code anyway.
  • module — Target node’s module system since we’re not emitting code anyway.
  • resolveJsonModule — Compile JSON files (if they’re small enough).
  • strict — Don’t give the strictest possible errors.

A few notes:

  • "target" and "module" should not actually matter since they have to do with generated downlevel code, but you’ll get some bogus errors if you use ES6 classes like Map or Set.
  • "resolveJsonModule" is optional, but you’ll need it if your code ever requires a JSON file, so that TypeScript will analyze it too.
  • "strict" should be false by default. You can satisfy the compiler on strict mode with pure Javascript, but it can require some odd code.

You may want to specify which files to compile. For TypeScript-eslint-parser, you’ll be happiest with an "exclude" list. You may want to check all the source files, which is what you get by default. But it turns out that checking a Javascript parser’s tests is a bad idea, because the tests are themselves malformed Javascript files. Those malformed test files shouldn’t be checked and mostly don’t parse anyway.

You might want to use "include" if, say, you only want to check your source and not your tests or scripts. Or you can use "files" to give an explicit list of files to use, but this is annoying except for small projects.

OK, you’re all set. Run tsc and make sure it prints out errors. Now open up files in your editor and make sure the same errors show up there. Below are the first few errors you should see:

Makefile.js(55,18): error TS2304: Cannot find name 'find'.
Makefile.js(56,19): error TS2304: Cannot find name 'find'.
Makefile.js(70,5): error TS2304: Cannot find name 'echo'.

You should be able to see the same errors when you open Makefile.js in your editor and look at lines 55 and 56.

Congratulations! You’ve done the only required part of the upgrade. You can check in tsconfig.json and start getting benefits from TypeScript’s checking in the editor without changing anything else. Of course, there are a huge number of errors, hardly any of which are due to real bugs. So the next step is to start getting rid of incorrect errors and improving TypeScript’s knowledge of the code.

Here’s the commit.

 

Install @types packages.

Your first order of business is to install types for packages you use. This allows TypeScript to understand them, which makes it the easiest way to reduce the number of errors. Basically, if you have a dependency on some package, say, jquery, and you see errors when you use it, you probably need a dev dependency on @types/jquery. The type definitions in @types/jquery give TypeScript a model of jquery that it can use to provide editor support, even though jquery was written before TypeScript existed.

Definitely Typed is the source for the packages in the @types namespace. Anybody can contribute new type definitions, but tons of packages already have type definitions, so you will probably find that most of your dependencies do too.

Here’s a good starting set for TypeScript-eslint-parser, although there are likely more available:

npm install --save-dev @types/node
npm install --save-dev @types/jest
npm install --save-dev @types/estree
npm install --save-dev @types/shelljs@0.8.0
npm install --save-dev @types/eslint-scope

After the installation, these three types packages still didn’t work (although notice that we intentionally installed an old version of shelljs – more on that later):

  • @types/shelljs
  • @types/eslint-scope
  • @types/estree

They fail for different reasons, though. shelljs and eslint-scope just don’t have types for a lot of their values. estree has all the correct types, but the types aren’t imported correctly. Part 2 shows how to fix these two problems.

At this point, you have types for some of your packages working and your own code checked by the TypeScript compiler. The next step is to fix compile errors in the rest of the package types, or to fix them in your own code. Or you can just ignore the errors and start using the TypeScript support in the editor.

Next up: Part 2, to learn about the various kinds of fixes.

The post How to Upgrade to Typescript Without Anybody Noticing, Part 1 appeared first on TypeScript.


Announcing the Visual Studio Code Installer for Java


It’s been almost 3 years since the first Java language server was developed during a hackathon in a small conference room at Microsoft’s Zurich office with people from Red Hat, IBM, Codenvy and Microsoft, which later became one of the most popular extensions for Visual Studio Code with more than 2.7 million installations. Since then, Visual Studio Code has gone through a thrilling journey and become the most popular development environment according to Stack Overflow. More and more Java extensions are now available in Visual Studio Code to serve a growing Java community using Visual Studio Code along with their favorite tools and frameworks.

During this journey, we’ve heard many developers ask how to start with Java in Visual Studio Code. As the vibrant Java community expands to include more students and developers from other languages, many newcomers struggle with setting up their environment to be able to start coding. To help people get started, we created the Java extension pack to give you the relevant extensions, and also included tutorials with detailed steps in our documentation.

Back in 2018, Microsoft Azure became a Platinum Sponsor of the AdoptOpenJDK project – that just got renewed until June 2020 – and provides a truly vendor neutral, completely free and open source distribution of the JDK (Java Development Kit) based on the OpenJDK project. This was a turning point for us, so much so that we also added functionality to detect and help developers install a JDK binary in their environments, with AdoptOpenJDK as the recommended distribution. These efforts were encouraging, but got us thinking about more ways we could make it easier to start coding in Java.

Introducing the Visual Studio Code Java Pack Installer

So today, we’re releasing a special installer of Visual Studio Code for Java developers. The package can be used as a clean install or as an update for an existing environment to add Java or Visual Studio Code to your development environment. Once downloaded and opened, it automatically detects if you have the fundamental components in your local development environment, including the JDK, Visual Studio Code and essential Java extensions.

After clicking Install, it will pull the stable versions of those tools from trusted online sources and install them on your system. Once it’s done, you can open Visual Studio Code and start writing and running Java code directly! Below is a short video showing you how to write and run a Hello World program with Java in Visual Studio Code in less than 1 minute. See more detailed functionality in our tutorial.

The installer is available for download for Windows now while we’re still working on the Mac version. Please have a try and let us know your feedback!

If you’d like to follow the latest of Java on Visual Studio Code, please share your email with us using the form below. We will send out updates and tips every couple of weeks.

Thank you and happy coding.

The post Announcing the Visual Studio Code Installer for Java appeared first on The Visual Studio Blog.

Making a tiny .NET Core 3.0 entirely self-contained single executable


I've always been fascinated by making apps as small as possible, especially in the .NET space. No need to ship any files - or methods - that you don't need, right? I've blogged about optimizations you can make in your Dockerfiles to make your .NET containerized apps small, as well as using the ILLink.Tasks linker from Mono to "tree trim" your apps to be as small as they can be.

Work is ongoing, but with .NET Core 3.0 preview 6, ILLink.Tasks is no longer supported and instead the Tree Trimming feature is built into .NET Core directly.

Here is a .NET Core 3.0 Hello World app.

225 files, 69 megs

Now I'll open the csproj and add PublishTrimmed = true.

<Project Sdk="Microsoft.NET.Sdk">

<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>netcoreapp3.0</TargetFramework>
<PublishTrimmed>true</PublishTrimmed>
</PropertyGroup>
</Project>

And I will compile and publish it for Win-x64, my chosen target.

dotnet publish -r win-x64 -c release

Now it's just 64 files and 28 megs!

64 files, 28 megs

If your app uses reflection you can let the Tree Trimmer know by telling the project system about your Assembly, or even specific Types or Methods you don't want trimmed away.

<ItemGroup>

<TrimmerRootAssembly Include="System.IO.FileSystem" />
</ItemGroup>

The intent in the future is to have .NET be able to create a single small executable that includes everything you need. In my case I'd get "supersmallapp.exe" with no dependencies. Until then, there's a cool global utility called Warp.

This utility, combined with the .NET Core 3.0 SDK's now-built-in Tree Trimmer creates a 13 meg single executable that includes everything it needs to run.

C:\Users\scott\Desktop\SuperSmallApp>dotnet warp

Running Publish...
Running Pack...
Saved binary to "SuperSmallApp.exe"

And the result is just a 13 meg single EXE ready to go on Windows.

A tiny 13 meg .NET Core 3 application

If you want, you can combine this "PublishTrimmed" property with "PublishReadyToRun" as well and get a small AND fast app.

<Project Sdk="Microsoft.NET.Sdk">

<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>netcoreapp3.0</TargetFramework>
<PublishTrimmed>true</PublishTrimmed>
<PublishReadyToRun>true</PublishReadyToRun>
</PropertyGroup>
</Project>

These are not just IL (Intermediate Language) assemblies that are JITted (Just in time compiled) on the target machine. These are more "pre-chewed" AOT (Ahead of Time) compiled assemblies with as much native code as possible to speed up your app's startup time. From the blog post:

In terms of compatibility, ReadyToRun images are similar to IL assemblies, with some key differences.

  • IL assemblies contain just IL code. They can run on any runtime that supports the given target framework for that assembly. For example a netstandard2.0 assembly can run on .NET Framework 4.6+ and .NET Core 2.0+, on any supported operating system (Windows, macOS, Linux) and architecture (Intel, ARM, 32-bit, 64-bit).
  • R2R assemblies contain IL and native code. They are compiled for a specific minimum .NET Core runtime version and runtime environment (RID). For example, a netstandard2.0 assembly might be R2R compiled for .NET Core 3.0 and Linux x64. It will only be usable in that or a compatible configuration (like .NET Core 3.1 or .NET Core 5.0, on Linux x64), because it contains native code that is only usable in that runtime environment.

I'll keep exploring .NET Core 3.0, and you can install the SDK here in minutes. It won't mess up any of your existing stuff.






Azure.Source – Volume 87


News and updates

Microsoft FHIR Server for Azure extends to SQL

Since the launch of the open source FHIR Server for Azure on GitHub last November, we have been humbled by the tremendously positive response and surge in the use of FHIR in the healthcare community. There has been great interest in Microsoft expanding capabilities in the FHIR service, and today we are pleased to announce that the open source FHIR Server for Azure now supports both Azure Cosmos DB and SQL backed persistence providers. With the SQL persistence provider, developers will be able to perform complex search queries that join information across multiple FHIR resource types and leverage transactions.

Now available

Azure Shared Image Gallery now generally available

At Microsoft Build 2019, we announced the general availability of Azure Shared Image Gallery, making it easier to manage, share, and globally distribute custom virtual machine (VM) images in Azure. Shared Image Gallery provides a simple way to share your applications with others in your organization, within or across Azure Active Directory (AD) tenants and regions. This enables you to expedite regional expansion or DevOps processes and simplify your cross-region HA/DR setup. This blog explains the key benefits of this feature.

Global replication of virtual machine images

Technical content

Monitoring on Azure HDInsight Part 3: Performance and resource utilization

This is the third blog post in a four-part series on Monitoring on Azure HDInsight. Part 1 is an overview that discusses the three main monitoring categories: cluster health and availability, resource utilization and performance, and job status and logs. Part 2 centered on the first topic, monitoring cluster health and availability. This blog covers the second of those topics, performance and resource utilization, in more depth.

Simplify B2B communications and free your IT staff

Today’s business data ecosystem is a network of customers and partners communicating continuously with each other. The traditional way to do this is by establishing a business-to-business (B2B) relationship. The B2B communication requires a formal agreement between the entities. Then the two sides must agree on the formatting of messages. The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how one Microsoft partner uses Azure to solve a unique problem.

Accelerating smart building solutions with cloud, AI, and IoT

Throughout our Internet of Things (IoT) journey we’ve seen solutions evolve from device-centric models, to spatially-aware solutions that provide real-world context. Last year at Realcomm | IBcon, we announced Azure IoT’s vision for spatial intelligence, diving into scenarios that uniquely join IoT, artificial intelligence (AI), and productivity tools. This year we’ve returned to Realcomm | IBcon, joined by over 30 partners who have delivered innovative solutions using our spatial intelligence and device security services to provide safety to construction sites, operate buildings more efficiently, utilize space more effectively, and boost occupant productivity and satisfaction. Here we’ll tell you more about a selection of these smart building partners who are accelerating digital transformation in their industries.

Steelcase workplace advisor app

Taking advantage of the new Azure Application Gateway V2

We recently released Azure Application Gateway V2 and Web Application Firewall (WAF) V2. These SKUs are named Standard_v2 and WAF_v2 respectively and are fully supported with a 99.95 percent SLA. The new SKUs offer significant improvements and additional capabilities to customers. Read the blog for an explanation of the key features.

Virtual machine memory allocation and placement on Azure Stack

Customers have been using Azure Stack in a number of different ways. We continue to see Azure Stack used in connected and disconnected scenarios, as a platform for building applications to deploy both on-premises as well as in Azure. Many customers want to just migrate existing applications over to Azure Stack as a starting point for their hybrid or edge journey. The purpose of this post is to detail how Virtual Machine (VM) placement works in Azure Stack with a focus on the different components that come to play when deciding the available memory for capacity planning.

Three things to know about Azure Machine Learning Notebook VM

Data scientists have a dynamic role. They need environments that are fast and flexible while upholding their organization’s security and compliance policies.

Data scientists working on machine learning projects need a flexible environment to run experiments, train models, iterate models, and innovate in. They want to focus on building, training, and deploying models without getting bogged down in prepping virtual machines (VMs), vigorously entering parameters, and constantly going back to IT to make changes to their environments. Moreover, they need to remain within compliance and security policies outlined by their organizations. This blog looks at three ways the Azure Machine Learning Notebook VM makes life easier for data scientists.

Creating custom VM images in Azure using Packer | Azure Tips and Tricks

Azure provides rich support for open source tools to automate infrastructure deployments. Some of the tools include HashiCorp's Packer and Terraform. In a few simple steps, we'll learn how to create custom VM Linux images in Azure using Packer.

Additional technical content

Azure shows

Troubleshoot resource property changes using Change History in Azure Policy | Azure Friday

Jenny Hunter joins Donovan to showcase a new integration inside Azure Policy that enables you to see recent changes to the properties for non-compliant Azure resources. Public preview of the Resource Change History API is also now available.

Willow is the digital twin for the built world | Internet of Things Show

Willow and Microsoft are partnering together to empower every person and organization to connect with the built world in a whole new way. This digital disruption is happening today, with Digital Twin technology.

Over-the-air software updates for Azure IoT Hub with Mender.io | Internet of Things Show

Introduction to a secure and robust over-the-air (OTA) software update process for Azure IoT Hub with Mender.io, an open source update manager for connected devices. We will cover key considerations for being successful with software updates to connected devices and show a live demo deploying software to a physical device.

Five things you can do with serverless | Five Things

Serverless is like CrossFit, the first rule is to never stop talking about it. In this episode, Eduardo Laureano from the Azure Functions team brings you five things you can do with Serverless that you might not realize are even possible. Also, Burke wears a sweater vest and Eduardo insinuates that there is a better candy than Goo Goo Clusters. The nerve.

Migrating from Entity Framework 6 to Core | On .NET

Entity Framework (EF) Core is a lightweight and cross-platform version of the popular Entity Framework data access technology. In this episode, Diego Vega joins Christos to show us how we can port our Entity Framework 6 code to Entity Framework Core.

How to get started with Azure Machine Learning Service | Azure Tips and Tricks 

In this edition of Azure Tips and Tricks, learn how to get started with the Azure Machine Learning Service and how you can use it from Visual Studio Code.

Xamarin.Forms 4 – Who could ask for anything more? | The Xamarin Podcast

Episode 283: .NET and Azure | The Azure Podcast

Sujit and Cynthia talk with VS and .NET Director, Scott Hunter, on how Microsoft is shifting paradigms in the Linux world and .NET development experience with Azure.

Industries and partners

Join Microsoft at ISC2019 in Frankfurt

The world of computing goes deep and wide when it comes to issues related to our environment, economy, energy, and public health systems. These needs require modern, advanced solutions that have traditionally been hard to scale, slow to deliver, and limited to a few organizations. Join us at the world's second-largest supercomputing show, ISC High Performance 2019, to learn how Azure customers combine the flexibility and elasticity of the cloud with both our specialized compute virtual machines (VMs) and bare-metal offerings from Cray.

Azure Stack IaaS – part ten


This blog is co-authored by Andrew Westgarth, Senior Program Manager, Azure Stack 

Journey to PaaS

One of the best things about running your VMs in Azure or Azure Stack is that you can begin to modernize around your virtual machines (VMs) by taking advantage of the services provided by the cloud. Platform as a Service (PaaS) is the term often applied to capabilities that your application can use without the burden of building and maintaining them yourself. In fact, cloud IaaS is itself a PaaS, since you do not have to build or maintain the underlying hypervisors, software-defined networking and storage, or even the self-service API and portal. Furthermore, Azure and Azure Stack give you PaaS services which you can use to modernize your application.

When you need to modernize your application, you have the option to leverage cloud-native technologies and pre-built services delivered by the cloud platform. Many teams choose to use these capabilities when they add new features to an existing app. This way they can keep the mature parts of their application in VMs while tapping into the convenience of the cloud for new code.

In this article we will explore how you can modernize your application with web apps, serverless functions, blob storage, and Kubernetes as part of your Journey to PaaS.

Web apps with Azure App Services

Back in the virtualization days we had VM sprawl. So many VMs were taking up resources just to serve a single purpose. The most common was an entire VM just to host a website. Using an entire VM is wasteful not only in resources, but also in its ongoing management. Azure App Services gives you a new option. Instead of creating a VM and installing a web server, you can create a website directly on the platform. It is super easy to create a web app in the Azure Stack portal. The main thing you must provide is the website name, which becomes part of the overall DNS name just as in Azure, but on the network where you have installed your Azure Stack.

Screenshot showing how to create a web app in the Azure Stack portal
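If you prefer to script this step instead of clicking through the portal, a hedged sketch with the Azure SDK for Python (azure-mgmt-web) might look like the following. It assumes an existing App Service plan and uses placeholder names; against Azure Stack you would additionally point the client at your stamp's management endpoint and a matching API profile.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.web import WebSiteManagementClient

    client = WebSiteManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # The site name becomes part of the web app's DNS name.
    site = client.web_apps.begin_create_or_update(
        "my-resource-group",
        "timecard",
        {
            "location": "<region>",
            "server_farm_id": "<existing App Service plan resource ID>",
        },
    ).result()
    print(site.default_host_name)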

Once your web app is deployed, you can push your website to it using FTP, Web Deploy, or a repository. You can configure this through the deployment options of the web app.

Screenshot showing how to access the deployment options of the web app

You can even create a custom domain for your web app so that your users can access your website with a more recognizable name. So, a URL like timecard.appservice.midwest.azurestack.corp.contoso.com can become timecard.contoso.com or some other domain entirely.

Screenshot showing how to create a custom domain

You can learn more about Azure App Services on Azure Stack:

Serverless functions with Azure Functions

If you have something even smaller than a website, why create an entire VM? With Azure Functions you can simply host your code on the platform, no VM required. Example candidates for Azure Functions are scripts that need to run on a schedule or are triggered by a web request. I’ve seen people use functions to take periodic measurements of resource usage, check whether something is responding properly, or notify another system that a condition has been met. Azure Stack supports functions written in C#, JavaScript, F#, PowerShell*, Python*, TypeScript*, PHP*, and Batch* (* indicates experimental languages).
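The body of such a function is ordinary code. As a hedged illustration, here is what a minimal HTTP-triggered function might look like in the classic Python programming model (an __init__.py paired with a function.json binding file); Python support on Azure Stack was experimental at the time of writing, and the query parameter name below is invented for this sketch.

    import logging

    import azure.functions as func


    def main(req: func.HttpRequest) -> func.HttpResponse:
        # Read a query-string parameter and return a small health-check payload.
        resource = req.params.get("resource", "unknown")
        logging.info("Checking resource %s", resource)
        return func.HttpResponse(f"{resource}: OK", status_code=200)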

Functions are easy to create directly in the portal. You simply need to provide a name for the function then pick a scenario template to get started:

Screenshot showing how to create functions directly in the portal

You can even test and run your function directly in the Azure Stack portal:

Screenshot showing how to test and run functions directly in the Azure Stack portal

Learn more:

Storage as a service with Azure Storage

Another common but wasteful use of VMs in the virtualization days was storage. Teams set up their own file servers for their web app’s images or for sharing documents. When you use Azure Stack, you can take advantage of the built-in platform storage features. Azure Storage in Azure Stack gives you four storage options:

  • Blobs for unstructured data like web images, files, and logs that can be accessed by a code-friendly URL instead of a file-system object.
  • Queues for durable message queuing accessible via a web friendly REST API.
  • Tables for semi-structured data, such as JSON entities, with OData-based queries.
  • Disks for persistent storage to support your Azure Virtual Machines.

To get started with Azure Storage, you create a storage account in your Azure Stack subscription. Once you create the storage account, you can create blobs, tables, or queues. Here I have created a queue of items for my team to bring to a picnic, which they can de-queue when they sign up to bring an item:

Screenshot of an example storage account creating queues
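For code that works with the same queue, here is a minimal sketch using the Azure Storage queue SDK for Python (azure-storage-queue); the connection string and queue name are placeholders, and on Azure Stack the connection string points at your stamp's storage endpoints.

    from azure.storage.queue import QueueClient

    # Placeholder connection string and queue name.
    queue = QueueClient.from_connection_string("<storage-connection-string>", "picnic-items")
    queue.create_queue()

    # The organizer enqueues the items that still need an owner.
    for item in ("plates", "lemonade", "watermelon"):
        queue.send_message(item)

    # A team member de-queues an item when they sign up to bring it.
    msg = queue.receive_message()
    if msg:
        print(f"I'll bring the {msg.content}")
        queue.delete_message(msg)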

Using Azure Table Storage in Azure Stack allows you to store and query semi-structured data for apps that require a flexible data schema. You can access this data through the Azure Storage SDK, or use tools like Azure Storage Explorer or Microsoft Excel to view and edit the data. Here is a baseball roster stored in Azure Table Storage, viewed from Azure Storage Explorer:

Screenshot of a baseball roster stored in Azure Table Storage viewed from Azure Storage Explorer

Here is the same data accessed from Microsoft Excel:

Screenshot of a baseball roster in Microsoft Excel
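The same table can also be read programmatically. Here is a hedged sketch using the azure-data-tables package for Python; the table name and entity properties (Position, BattingAverage) are assumptions for this example.

    from azure.data.tables import TableClient

    table = TableClient.from_connection_string("<storage-connection-string>", "roster")

    # OData-style filter over the semi-structured entities.
    pitchers = table.query_entities("Position eq 'Pitcher'")
    for player in pitchers:
        print(player["RowKey"], player.get("BattingAverage"))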

Sharing unstructured data is easy as well. Here are all the logs my IoT devices are creating. I can download or even upload logs right in the portal or in code via the blob URL:

Screenshot of sharing unstructured data
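Uploading and listing those logs from code is just as simple. The following sketch uses the azure-storage-blob package for Python; the container and file names are illustrative.

    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = service.get_container_client("device-logs")

    # Upload a log file; the blob is then addressable by its URL from code or the portal.
    with open("device42.log", "rb") as data:
        container.upload_blob(name="device42.log", data=data, overwrite=True)

    # List everything the devices have written so far.
    for blob in container.list_blobs():
        print(blob.name)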

Learn more:

Secrets with KeyVault

How did you keep passwords and certificates secret in the virtualization days? Admit it, not always best practices. KeyVault is a built-in Azure platform service that provides a secure place for you to keep your secrets. You can create a VM on Azure Stack with a secret kept in KeyVault. This way you don't need to put passwords or other secrets in clear text in the template you use to deploy the VM.  This is just another example of how you can modernize your VMs by taking advantage of the Azure platform services.

Screenshot example of how you can modernize your VMs by taking advantage of the Azure platform services
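From code, reading that secret at deployment or run time looks roughly like the sketch below, using the azure-keyvault-secrets and azure-identity packages for Python; the vault URL and secret name are placeholders.

    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    client = SecretClient(
        vault_url="https://<your-vault>.vault.<your-azure-stack-dns>",
        credential=DefaultAzureCredential(),
    )

    # Retrieve the VM admin password instead of embedding it in a template.
    secret = client.get_secret("vm-admin-password")
    print(secret.name)  # avoid logging secret.value in real code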

Learn more:

Containers with Kubernetes

Another great way to modernize your application is to take advantage of containers. Azure Stack gives you the option to host your containers in a Kubernetes cluster. This cluster is created using the Azure Kubernetes Service (AKS) engine, so you can easily move your applications between AKS in Azure and your Kubernetes cluster in Azure Stack. Creating a Kubernetes cluster is easy in Azure Stack: simply deploy it from the portal’s marketplace.
Azure Stack Kubernetes Cluster is currently in public preview.

Screenshot of a Kubernetes Cluster preview

Learn more:

The platform for apps not just VMs

The primary focus of virtualization platforms is helping improve the life of the infrastructure team. But with cloud platforms, the focus is on improving life for business units and developers. Virtualization VMs will only take you so far. When you move your VMs to cloud IaaS in Azure or Azure Stack, you can modernize your app to stay current with your users, who now expect cloud cadence. Tapping into native PaaS services gets you out of the business of infrastructure and into the business of your app.

In this blog series

We hope you come back to read future posts in this blog series. Here are some of our past topics:

Microsoft and Truffle partner to bring a world-class experience to blockchain developers


Last month, Microsoft released Azure Blockchain Service, making it easy for anyone to quickly set up and manage a blockchain network and providing a foundation for developers to build a new class of multi-party blockchain applications in the cloud.

To enable end-to-end development of these new apps, we’ve collaborated with teams from Visual Studio Code to Azure Logic Apps and Microsoft Flow to Azure DevOps, to deliver a high-quality experience that integrates Microsoft tools developers trust and open-source tools they love.

As we looked at the open source projects for Ethereum-based blockchains, we saw Truffle addressing core needs of developers looking to create, compile, test, and manage smart contract code. We kicked off our relationship in 2018 by co-authoring guidance for using Truffle for consortium DevOps and incorporating Truffle-based tooling in our Azure Blockchain Development Kit for Ethereum.

This week, we doubled down on our relationship by announcing an official partnership between our organizations to bring Truffle blockchain tools for developer experience and DevOps to Microsoft Azure. This will manifest not just in Visual Studio and Azure DevOps, but also upcoming tools from Truffle such as Truffle Teams. Through this partnership, developers working in Truffle environments will have access to Azure services, and Azure customers will have access to the suite of tools Truffle provides to make developing on Ethereum easy, such as:

  • Standalone, local nodes for testing: Development starts on a developer’s laptop or local machine, and Truffle provides a local, standalone blockchain node that is lightweight and can be quickly spun up for local dev/test.
  • Easily populate a node with test data: Once a local node is available, developers also want the blockchain to contain data they can test against. Unlike a relational database, where test data can easily be loaded from a script, a blockchain is populated via signed transactions that require significantly more effort to set up in a test environment. The Truffle Suite solves this by providing the ability to “fork” a blockchain so that developers can get a local copy of real network data.
  • Easily deploy smart contracts: Developers wanted to easily script deployments of their smart contracts to local, private chain or public chain environments.
  • Write and execute tests for smart contracts: With Truffle, developers can write tests using the popular Mocha framework. Those tests can then be executed locally or in Azure Pipelines by individual organizations or as a consortium.
  • Interact with smart contracts: The ability to do interactive testing via the console or a UI can help quickly validate or troubleshoot behavior, which can be especially important for scenarios that span multiple smart contracts.
  • Debug smart contracts: When tests are executed, they aren’t always successful, and that’s where enterprise developers expect to have a debugging experience comparable to what they’ve seen in .NET and Java development.

As partners and as end users, we are big fans of Truffle’s technology and the people behind it. Their customer obsession and open orientation has made them the trusted choice for blockchain developers, and we are eager to see what you will build with Truffle on Azure.

To learn more, check out our Truffle videos on BlockTalk and get started today with the Truffle extensions for Visual Studio Code.

Azure Marketplace new offers – Volume 39


We continue to expand the Azure Marketplace ecosystem. For this volume, 136 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Applications

ACR Lift & Shift

ACR Lift & Shift: Stratum ACR Lift and Shift is an enterprise methodology available for customers of all sizes. Stratum has designed a turnkey, repeatable process for mass migrations to Azure that will increase cloud adoption.

All in One Marketing Automation & CDP 4-Wk PoC

All in One Marketing Automation & CDP: 4-Wk PoC: Related Marketing Cloud offers personalization and marketing automation solutions for omnichannel marketing integrated with Customer Data Platform to increase conversion rates for customer lifecycle stages.

AllOrg

AllOrg: Wavelip's AllOrg is an Azure-based platform for end-users to easily and quickly create a computer automation system. What Wix and WordPress do for your websites, AllOrg does for your internal system to handle your information.

AltitudeCDN Multicast

AltitudeCDN Multicast+: Multicast+ is integrated with Microsoft 365, making it easy for organizations using Microsoft Stream, Teams, Yammer, and Skype for Business to conduct live video broadcasts without stressing the corporate network.

appICE Growth Marketing Platform

appICE Growth Marketing Platform: appICE ensures customer profiling and segmentation is available on the fly and enables a single dashboard that tracks users' lifecycle from acquisition to monetization to churn.

AppService certificate automation tool

AppService certificate automation tool: This application provides easy automation of the Let's Encrypt for Azure app service.

arQive Digital Media Platform

arQive Digital Media Platform: arQive provides a complete solution to store, use, and share digital media files. This application is available only in Dutch.

Azure Monitor for Citrix Environments

Azure Monitor for Citrix Environments: Use the power of Azure Monitor and Log Analytics with this agent for your Citrix Workers – servers and desktops. Collected data are analyzed and shown in your Azure Monitor workspace.

BABEL

BABEL: The BABEL cloud-based library system centrally manages inventory and reservations with its branch function. This application is available only in Japanese.

Calfitec Grid - calculs intensifs

Calfitec Grid - calculs intensifs: Calfitec is an innovative calculation engine with a user interface in natural mathematical language and automatic, exhaustive documentation ensuring the calculation audit trail. This application is available only in French.

CampusPlan for Azure

CampusPlan for Azure: CampusPlan is a school information system that centralizes information, boosts efficiency, and improves student service. This application is available only in Japanese.

CareX Healthcare Communicator

CareX Healthcare Communicator: CareX Healthcare Communicator automates communication workflows using text, voice, and email, which helps providers, payers, and FQHCs reduce their workload while engaging patients at scale before, during, and after care.

Computer Aided Dispatch

Computer Aided Dispatch: RapidDeploy Computer Aided Dispatch combines advanced telematics with first-hand experience to provide relevant, intuitive systems that help maximize the efficiency of service and enable teams to share and report critical information in real time.

COPD Service

COPD Service: The Chronic Obstructive Pulmonary Disease (COPD) Service allows high-risk patients to securely share health data from connected hardware with healthcare professionals.

Cyber Security Awareness Solution

Cyber Security Awareness Solution: Terranova Security Cyber Security Awareness Training and Phishing Simulation Platform helps CISOs and security professionals instill a culture of security and reduce human risk by increasing alertness to cyberattacks.

DeepCam Loss Prevention

DeepCam Loss Prevention: DeepCam's core IP, (Deep Learning)2, combines facial recognition with deep data mining to deliver accurate predictions from source data others consider inadequate.

devNXT_Architect_Bot_Integration

devNXT_Architect_Bot_Integration: This information and document chatbot helps developers by providing pre-created sample projects and templates of different kinds of software as well as information related to these projects.

devNXT_Face_Integration

devNXT_Face_Integration: The application recognizes a patient's face and retrieves their records, which is useful in the health sector for electronics health record retrieval.

devNXT_Image_Classification

devNXT_Image_Classification: Image Classifier can be used to predict online sales growth and provide all information about a product.

Discovery Hub with Azure Data Lake and SQL MI

Discovery Hub with Azure Data Lake and SQL MI: Discovery Hub provides a cohesive data fabric across Microsoft data platforms, allowing you to connect to various data sources, catalog, model, move, and document data for analytics and AI.

Discovery Hub with Azure Data Lake SQL MI and AAS

Discovery Hub with Azure Data Lake SQL MI and AAS: Discovery Hub provides a cohesive data fabric across Microsoft data platforms, allowing you to connect to various data sources, catalog, model, move, and document data for analytics and AI.

Dodai Campus

Dodai Campus: Dodai Campus has all the necessary functions for teaching and academic affairs from kindergarten to university. This application is available only in Japanese.

Effector

Effector: Effector has a robust BPMN 2.0-compliant workflow engine with visual workflow designer, making the platform suitable for back-office workflow digitalization projects.

FAISapp

FAISapp: FAISapp is an innovative software product that offers a range of digital options to replace paper-intensive, time-consuming, and expensive processes.

Field Service Management System

Field Service Management System: Tuten Labs' state-of-the-art field service management solution helps streamline core activities for all stakeholders related to the field operations business.

FlourishDx

FlourishDx: The FlourishDx cloud-based platform combines risk management with wellbeing promotion to assist in the prevention of mental illness and optimize employee mental health.

Intelligent Store - Smart Pages

Intelligent Store - Smart Pages: The Intelligent Store suite provides tools for efficient communication between online retail and customers to personalize the experience for each client. This application is available only in Portuguese.

Intelligent Workplace - LiveTiles QnA Bot

Intelligent Workplace - LiveTiles QnA Bot: RDA delivers an easy-to-use platform built on the Azure QnA Maker and Azure Language Understanding services, providing a way to quickly build out QnA bots.

ITRS OP5 Monitor

ITRS OP5 Monitor: OP5 Monitor for the enterprise monitors the complete stack from hardware level all the way up to application layer.

Lenus Digital Health Platform

Lenus Digital Health Platform: Lenus enables patient-generated health data in the form of physiology and patient reported outcomes from apps, wearables, and sensors to be shared with health professionals and machine learning models as part of new care pathways.

Manage and secure your Microsoft 365 and Azure

Manage and secure your Microsoft 365 and Azure: Softeng's self-management portal allows customers to manage their cloud subscriptions directly and as autonomously as desired while optimizing and gaining visibility of their usage.

Microsoft Professional Program(Open edX on Azure)

Microsoft Professional Program (Open edX on Azure): The Microsoft Professional Program service enables students to acquire the skills required in the fourth industrial revolution era through a series of online courses. This application is available only in Korean.

moca

moca: moca is a cloud-based e-learning system based on one of the most widely used open source learning management systems. This solution is available only in Japanese.

New Retail solution with supply chain digitized

New Retail solution with supply chain digitized: New Retail merges online, offline, and logistics to create a dynamic new world of retailing. The concept turns traditional convenience stores into self-service fulfilment centers located regionally or locally.

New World Computer-Aided Dispatch (CAD)

New World Computer-Aided Dispatch (CAD): This application manages single- or multi-jurisdictional dispatching activities for law enforcement, fire, and EMS, empowering dispatchers and call takers with rapid access to data, mapping capabilities, and more.

O365 Migration by SkyKick

O365 Migration by SkyKick: The SkyKick Migration Suite helps IT partners automate entire Office 365 migration projects from pre-sales to project completion.

OBC

OBC: Make all My Number jobs in your company safe and effortless and store all My Number data securely in Azure. This application is available only in Japanese.

OBC (SaaS)

OBC (SaaS): Make all My Number jobs in your company safe and effortless and store all My Number data securely in Azure. This application is available only in Japanese.

OCR by Primesoft

OCR by Primesoft: This multilingual application speeds up the process of data entry by analyzing scanned document images.

Odyssey Courts & Justice Solution

Odyssey Courts & Justice Solution: Odyssey’s powerful web-based court solutions are anchored by Odyssey Case Manager, offering robust solutions for judges, clerks, and attorneys, as well as the public.

Omnimap - Low code-No code Rapid Innovation Power

Omnimap - Low code/No code Rapid Innovation Power: Built on Azure, Omnimap continuously adds new low-code functionality to the platform and provides centrally managed access to external APIs.

OneStream Software

OneStream Software: Deployed in the cloud or on-premises, OneStream's single unified platform simplifies financial consolidation, planning, reporting, analytics, and financial data for large enterprises worldwide.

OpenLM for Engineering Licensing - Cloud Solution

OpenLM for Engineering Licensing - Cloud Solution: OpenLM is a management system for engineering software licenses that monitors and analyzes the usage of software in your organization and allows you to enforce the usage policy of your floating licenses.

Oscar Campus CRM

Oscar Campus CRM: Designed with the help of many schools, Oscar Campus CRM is dedicated exclusively to higher education institutions. This application is available only in French.

OWI Mail

OWI.Mail: OWI provides large companies with an omnichannel virtual assistant based on a new semantic approach, automating tasks with low added value and processing emails based on their real priority.

Palo Alto Networks VM-Series

Palo Alto Networks VM-Series: The VM-Series next-generation firewall allows developers and cloud security architects to embed inline threat and data loss prevention into their application development workflows.

Paxata Self Service Data Preparation (SaaS)

Paxata Self Service Data Preparation (SaaS): Paxata Self-Service Data Preparation is a solution for business analysts and data professionals to discover, ingest, explore, transform, and export data, creating clean and contextual information from raw data.

Pivotal Postgres

Pivotal Postgres: Pivotal Postgres on Azure is a powerful, open source object-relational database system.

PiXYZ Review

PiXYZ Review: PiXYZ Review is a cost-efficient CAD viewer that enables users to easily import a wide range of complex assemblies from industry-leading solutions and perform desktop, VR, or AR collaborative reviews.

plmpacklib

plmpacklib: TreeDiM's web application and web services give access to a full library of parametric models for packaging and a set of stacking optimization tools for packing, palletization, and truck or container loading.

Pluriportail Mobile

Pluriportail Mobile: Pluriportail Mobile brings together K-12 school administrators, teachers, parents, and students to enhance communication and collaboration. This application is available only in French.

PowerInbox Email Monetization

PowerInbox Email Monetization: With its data algorithms and sophisticated matching capability, PowerInbox gives publishers insight into the types of content they should be creating to drive more clicks and keep customers happy.

PowerStats - managing associations' industry data

PowerStats - managing associations' industry data: PowerStats is an intelligent platform that enables trade bodies – be it an association, society, institute, organization, council, or college – to effortlessly collect, manage, and visualize industry data from members.

Predica Cargo Management Example

Predica Cargo Management Example: Predica Cargo Management is a customized, integrated system that gathers information on shipment localization and potential obstacles to make it easy to plan alternate routes.

PriceSynergy advanced eCommerce market analytics

PriceSynergy advanced eCommerce market analytics: The PriceSynergy suite of business tools allows you to identify weaknesses and strengths in your product portfolio and adjust accordingly.

Printer Management - easyPRIMA & Fleet Management

Printer Management - easyPRIMA & Fleet Management: SEAL Systems' Printer Management solution includes the maintenance, administration, installation, and removal of printers in a multiple system environment (SAP, Windows, etc.) in one central database.

ProcessRobot - Scalable Robot Process Automation

ProcessRobot - Scalable Robot Process Automation: With ProcessRobot, all enterprise stakeholders and process owners can rely on a single, easy-to-use automation platform to automate processes, distribute workloads to robots, and track and assess performance.

Product Cloud - Product Matching

Product Cloud - Product Matching: The Product Cloud Suite provides tools to organize the catalogs of online retailers, helping improve customer experience. This application is available only in Portuguese.

ProjectReady

ProjectReady: ProjectReady is a modern end-to-end project management and document control solution for the architecture, engineering, and construction (AEC) industry.

Punchh Acquire

Punchh Acquire: Punchh Acquire obtains a customer’s digital identity and creates a dynamic customer profile used to generate and deliver targeted marketing offers that drive omnichannel engagement in-store and online.

Punchh Loyalty

Punchh Loyalty: Punchh Loyalty helps restaurants, physical retailers, and CPG brands convert customers to brand advocates by creating consistent and powerful experiences both in-store and online.

Punchh Offers

Punchh Offers: With Punchh, brands are equipped with the end-to-end functionality required to run a successful campaign – ranging from real-time offer code generation and omnichannel distribution and processing to offer reporting and insights, fraud prevention, and more.

Q10 Academico

Q10 Académico: Q10 Académico is a cloud-based solution that supports academic, administrative, and virtual education management for educational institutions. This application is available only in Spanish.

Questica Budget

Questica Budget: Questica Budget is a fully featured, multi-user web-based operating, capital, and salary budgeting and performance measurement tool.

Quick License Manager

Quick License Manager: Quick License Manager is a license protection framework that creates professional and secure license keys to protect software against piracy.

QUNIS Automation Engine

QUNIS Automation Engine: QUNIS Automation Engine is a solution for automated code generation, deployment, and maintenance of Data Warehouse solutions based on Microsoft SQL Server. This application is available only in German.

Records365

Records365: Records365 brings hassle-free compliance to content stored in a variety of sources. Manage content without cumbersome manual processes or interfering with the daily workflow of your end users.

RedLock Cloud Threat Defense and Compliance

RedLock Cloud Threat Defense and Compliance: Prisma Public Cloud (formerly RedLock) dynamically discovers cloud resources and sensitive data across AWS, Azure, and GCP to detect risky configurations and identify network threats, suspicious user behavior, malware, data leakage, and more.

REFLEKT Remote

REFLEKT Remote: REFLEKT Remote is a “one-button solution” that connects technicians on-site with the right support experts and allows real-time video support with augmented reality.

RemoteScan Enterprise

RemoteScan Enterprise: RemoteScan Enterprise makes it easy to connect your scanners to an Azure cloud environment, enabling you to scan documents directly from applications hosted in Azure.

Revenue Premier Integrated Tax Solution on Azure

Revenue Premier Integrated Tax Solution on Azure: Revenue Premier, hosted in the Azure Government Cloud, gives agencies an adaptive and capable software solution with world-class standards adherence and compliance.

RoboMQ Hybrid Integration Platform

RoboMQ Hybrid Integration Platform: RoboMQ accelerates digital transformation and helps businesses generate competitive advantage by creating integrated business processes using SaaS, cloud, on-premises applications, and IoT devices.

Rubrik Accelerator for Microsoft Azure

Rubrik Accelerator for Microsoft Azure: Rubrik Accelerator for Microsoft Azure simplifies backup and recovery, delivers automation at scale, and accelerates your cloud journey to Azure.

Sagitec HealHub

Sagitec HealHub: HealHub is an enterprise-class HIPAA-compliant digital health stack built on Azure that ingests and merges/transforms data, ensures data quality, manages lineage, and performs analytics.

Samsung SDS Nexledger

Samsung SDS Nexledger: Easily deploy and configure your Nexledger blockchain in a structured DevOps environment.

Sensing+ for Agriculture by The Yield

Sensing+ for Agriculture by The Yield: Sensing+ is an integrated system of hardware and software that provides visibility over crop growing conditions as they happen and seven days in advance.

SICCAR Enterprise Blockchain Platform

SICCAR Enterprise Blockchain Platform: SICCAR Enterprise Blockchain Platform has a comprehensive set of templates and tools so that you can design and deploy your application quickly and with the customization you require.

Sideways 6

Sideways 6: Sideways 6 is idea management software for Microsoft Yammer and Teams, enabling employees to share ideas using the tools they know and love.

Silvermedia Cloud Demand Forecasting

Silvermedia Cloud Demand Forecasting: Silvermedia Cloud Demand Forecasting is a solution for manufacturing-sales enterprises that wish to avoid the losses that occur when failing to respond effectively to market demand.

Sincro - Intégrez l'entreprise étendue

Sincro - Intégrez l'entreprise étendue: Sincro is a management and intermediation platform that allows you to control your processes and develop your network. This application is available only in French.

Sinefa Probe

Sinefa Probe: Organizations use Sinefa to boost the customer experience by ensuring business-critical applications are given priority over the data network and are protected from aggressive and recreational data traffic.

SiteFlow

SiteFlow: SiteFlow offers an agile work methodology that unleashes your productivity, centralizes your knowledge, and makes your procedures and technical content reusable from one project to another.

Skolebordet dashboard

Skolebordet dashboard: Schoolboard is a collaboration portal to Microsoft Office 365 for Education, making it easy for teachers and students to start using Office 365 and all its possibilities.

SkyKick Cloud Backup for Office 365

SkyKick Cloud Backup for Office 365: SkyKick Cloud Backup for Office 365 is an ideal second service for Office 365 that helps protect customer data, improves customer retention, and increases recurring revenue.

Smart Building - Integrated Visitor-Guest Journey

Smart Building - Integrated Visitor/Guest Journey: Built on the Omnimap platform and powered by Azure, our solution connects email and calendars, access systems, point of sales systems, lockers, and suppliers to provide guests with a fully integrated experience.

SmartForest

SmartForest: SmartForest enables the complete management of end-to-end forest inventory in a single solution, delivering powerful insights and combining information generated by IoT devices and artificial intelligence algorithms.

Smartwork mobile

Smartwork mobile: With Smartwork mobile, your employees always have their worklist ready on their mobile device. The integrated appointment calendar allows quick visualization and simple adjustment of scheduled appointments. This application is available only in German.

Spectrum Customization Platform

Spectrum Customization Platform: With Spectrum, everyone's a product designer. From WebGL product design through factory integration, Spectrum is an end-to-end solution for product customization.

SQL 2008-R2 End of Support

SQL 2008/R2 End of Support: Avoid business disruptions and use this as an opportunity to modernize your environment in Azure. Our support means your IT can focus on things that matter and drive the business forward.

StagedPay Credit Card-ACH-Recurring Payment Portal

StagedPay Credit Card/ACH/Recurring Payment Portal: StagedPay safeguards your customers’ credit card information whether you take orders over the phone, online, or in person.

Starburst Presto (v 302-e) for Azure HDInsight

Starburst Presto (v 302-e) for Azure HDInsight: Architected for the separation of storage and compute, Presto queries data in Azure Blob Storage, Azure Data Lake Storage, relational databases (Microsoft SQL Server, MySQL, PostgreSQL), Cassandra, MongoDB, Kafka, and more.

Stormshield Elastic Virtual Appliance

Stormshield Elastic Virtual Appliance: The Stormshield Elastic Virtual Appliance helps you in a seamless transition to the cloud. All the multilevel security features that have contributed to the success of Stormshield products are available in this dedicated application.

Stratio

Stratio: By applying machine learning models to detect anomalies and faults, Stratio allows vehicle development teams to automatically find and continuously label new issues and deviations in data.

StreamWeaver for Azure Application Insights

StreamWeaver for Azure Application Insights: StreamWeaver utilizes built-in Azure assignment groups to forward events, allowing IT to focus on event routing instead of setting up events in source tools.

StreamWeaver for Azure Logs

StreamWeaver for Azure Logs: StreamWeaver enables sending Azure Logs data to Splunk in real time, avoiding the additional step to stage the data first in Azure Event Hubs. This provides a cost-effective solution that allows customers to integrate Azure and Splunk quickly.

Sunlight Enterprise

Sunlight Enterprise: Sunlight Enterprise is an end-to-end solution including rating, policy, billing, claims, CRM (agent, insured, agency, etc.), workflow, producer management, BI reporting, and data warehousing.

SUSE Linux Enterprise Server (SLES) for HPC

SUSE Linux Enterprise Server (SLES) for HPC: SUSE Linux Enterprise Server (SLES) for HPC is a highly scalable, high performance, open source operating system that provides a parallel computing platform for high performance data analytics.

SUSE Linux Enterprise Server for HPC (Priority)

SUSE Linux Enterprise Server for HPC (Priority): SUSE Linux Enterprise Server (SLES) for HPC is a highly scalable, high performance, open source operating system that provides a parallel computing platform for high performance data analytics.

Symantec Cloud Workload Protection for Storage

Symantec Cloud Workload Protection for Storage: Use Symantec Cloud Workload Protection for Storage (CWP for Storage) to protect your Azure Blob Storage from malware using Symantec's latest built-in anti-malware technologies.

Synergy JOIN

Synergy JOIN: Synergy JOIN provides a simple, user-friendly method to schedule and start video meetings. Users can book a meeting in Outlook without plug-ins and join with one click from videoconference systems, desktop, and mobile regardless of vendor.

Talentsoft

Talentsoft: Talentsoft is a European leading developer of cloud-based talent management software. Its application provides concrete results in recruitment, performance, career development, learning, compensation management, as well as core HR and analytics.

TeamSearch

TeamSearch: TeamSearch delivers a unified set of relevant search results for content found in email, Skype conversations, Teams, Yammer, and more. Search using custom parameters and intelligently filter the results to provide insights that span your team.

Telebreeze IPTV-OTT Platform

Telebreeze IPTV/OTT Platform: Telebreeze Video Platform provides a flexible, modular, open ecosystem that enables TV operators to offer competitive next-generation video services. The platform includes advanced technologies, management features, and more.

Thingscare Remote Access

Thingscare Remote Access: Thingscare Remote Access is a simple remote tool for kiosk, digital signage, and IoT devices.

TIBCO Cloud Mashery

TIBCO Cloud Mashery: TIBCO Cloud Mashery delivers full lifecycle API management capabilities for enterprises adopting cloud-native development and deployment practices, such as DevOps, microservices, and containers.

Touchify

Touchify: Touchify is a SaaS platform that allows enterprises to create, share, and analyze interactive content for their touch screens with no technical skills.

Track-em - Materials, Activity & Asset Tracking

Track’em – Materials, Activity & Asset Tracking: Replace your spreadsheets, paper forms, and manual processes with a modern platform to improve productivity and achieve your schedules on time.

Transparency-One

Transparency-One: Transparency-One enables companies to discover, analyze, and monitor all suppliers, components, and facilities in the supply chain, from source to store.

Tulip Manufacturing App Platform Free Trial

Tulip Manufacturing App Platform Free Trial: Build apps for all your shop-floor needs with intuitive visual development. Create user-friendly, functional apps that improve the productivity of your operations, without writing any code.

UiPath Orchestrator (WebApp with SQL)

UiPath Orchestrator (WebApp with SQL): UiPath Orchestrator is a web application that enables you to securely schedule, manage, and control your enterprise-wide digital workforce of UiPath robots.

Vade Secure for Office 365

Vade Secure for Office 365: Vade Secure augments the reputation and signature-based defenses of Office 365 with AI-based predictive email defense, helping to protect users from advanced phishing, spear phishing, and malware attacks.

VivoSense Data Services

VivoSense Data Services: VivoCapture and VivoSense provide a regulatory compliant platform for the analysis of wearable sensor data in pharmaceutical clinical trials. The platform makes use of Azure authentication and secure cloud storage.

Vormetric Data Security Manager v620

Vormetric Data Security Manager v6.2.0: The Vormetric Data Security Manager provisions and manages keys for Vormetric Data Security Platform solutions including Vormetric Transparent Encryption, Vormetric Tokenization with Dynamic Data Masking, and more.

VRTY - Education through Virtual Reality

VRTY - Education through Virtual Reality: This easy-to-use platform supports teachers and students in creating, building, and sharing their curriculum-aligned projects for others to experience.

Vyu realtime streaming CDN

Vyu realtime streaming CDN: The Vyu real-time CDN is a patent-pending innovation that delivers live streams at sub-second latency to any digital user.

WATS - Test Data Management

WATS - Test Data Management: WATS for electronics and electromechanics manufacturers features global data acquisition from your own and sub-contracted manufacturing and a top-down approach to data analytics that guides your investigation to the most pressing issues.

Webalo Appliance for Azure

Webalo Appliance for Azure: Webalo's patented technology enables companies to easily transform their enterprise applications into actionable, persona-based applications, where each user has the information needed on their device to help get their job done.

Windows Server - End of Support

Windows Server - End of Support: Avoid business disruptions and use this as an opportunity to modernize your environment in Azure.

xGenCloud

xGenCloud: This cloud service provides a full cycle of work with genetic data, ranging from differential diagnosis and recommendations for genetic testing to clinical interpretation of the results of genetic tests.

Zoho Analytics

Zoho Analytics: Zoho Analytics is a self-service business intelligence, reporting, and analytics service that enables users to visually analyze large amounts of data.

Zoho Connect

Zoho Connect: With a host of built-in tools and integrations with other apps, Zoho Connect simplifies your team's work and increases productivity.

Consulting services

1 Day Workshop - Windows as a Service Planning

1 Day Workshop: Windows as a Service Planning: This is a 1-day planning and knowledge transfer engagement to review the current state of Windows-as-a-Service and deploying Windows 10 on Azure.

AI Ideation Workshop - 5 Hour Workshop

AI Ideation Workshop: 5 Hour Workshop: Do you find yourself wondering how AI could amplify your business sites or applications? Our experts will work with your team to explore scenarios where AI solutions can add maximum benefit and support market competitiveness.

Azure Accelerator - 2-Week Implementation

Azure Accelerator: 2-Week Implementation: The Azure Accelerator combines planning, migration, mentorship, and management support to increase your cloud knowledge and set the foundation for future success.

Azure Accelerator - 2-Week Implementation (US)

Azure Accelerator: 2-Week Implementation (US): The Azure Accelerator combines planning, migration, mentorship, and management support to increase your cloud knowledge and set the foundation for future success.

Azure Foundation 4-Week Implementation

Azure Foundation 4-Week Implementation: Azure Foundation solves some of the biggest challenges organizations face when using and consuming services on the platform. The three pillars of this offering are Azure security, Azure governance, and Azure cost management.

Azure Infrastructure Design 1-Wk Assessment

Azure Infrastructure Design: 1-Wk Assessment: This assessment is based on outputs of two workshops, delivered on-site over five days, where consideration is given to how Azure will assist with consolidation of IT assets, including decommissioning superfluous assets.

AzureStart Oracle to Azure 4 Week Implementation

AzureStart: Oracle to Azure 8-Wk Implementation: AzureStart is typically a short-term, high-value engagement based on a series of repeatable steps that can be used to incrementally, securely, and safely migrate selected Oracle workloads to the cloud.

BizTalk Migrate-Upgrade to Azure Cloud - 3-Day POC

BizTalk Migrate/Upgrade to Azure Cloud - 3-Day POC: By focusing on your integration workloads, you will leave this POC with an understanding of what it takes to migrate all your BizTalk applications to Azure and a preliminary plan to help get you started.

Data Science & ML  - 1-2 day Assessment

Data Science & ML - 1/2 day Assessment: Understand the suitability of your data and architecture, and validate your business challenge to prove that data science can drive value for your business.

Economic Assessment - Modern Workplace 4 weeks

Economic Assessment - Modern Workplace (4 weeks): The Stratum engagement model is designed to be agile and act as an extension of your team, allowing your team to quickly realize ROI and improve profitability by increasing operational throughput and scalability.

Economic Assessment & Migration - Azure  4 Week

Economic Assessment & Migration - Azure (4 Weeks): Our process allows your team to quickly realize ROI and improve profitability by increasing operational throughput and scalability.

Image Recognition Application 8-Week Imp

Image Recognition Application: 8-Week Imp: This is an application that enables companies to automate and scale the process of data extraction from physical documents such as CNH, CPF, and any other document through an API using image recognition.

Microsoft Azure AI Chatbot Development

Microsoft Azure AI Chatbot Development: While the benefits of chatbots may seem clear, how your company should use chatbots is more complicated as it depends on the company’s line of business. That’s where we come in to provide you with best suggestions and solutions.

ML & AI Accelerator Assessment 2-Hr Assessment

ML & AI Accelerator Assessment: 2-Hr Assessment: This assessment will review a customer's environment to determine its present state, including business and technology maturity around AI adoption.

Predictive Analytics ML AI 2-Hr Briefing

Predictive Analytics, ML, AI: 2-Hr Briefing: We invite you to spend a couple of hours with our machine learning experts to see how the process works and discuss how we can help you utilize predictive analytics to optimize your business performance.

Azure Security Expert Series: Best practices from Ann Johnson


June 19th 10 am – 11 am PT (1 pm – 2 pm ET)

With more computing environments moving to the cloud, the need for stronger cloud security has never been greater. But what constitutes effective cloud security, and what best practices should you be following?

We are excited to launch the Azure security expert series on June 19th for security operations and IT professionals. The series kicks off with Ann Johnson, CVP of Cybersecurity at Microsoft, joined by other industry experts for discussions on a wide range of cloud security topics.



Save the date

What is included in the event?

You'll hear from Ann Johnson on the following:

  • Cloud security best practices with Hayden Hainsworth, GM and Partner, Cybersecurity Engineering at Microsoft.
  • The latest Azure security innovations.
  • Partnership with Ran Nahmias, Head of Cloud Security at Check Point Software Technologies.
  • Security principles from a Microsoft enterprise customer.

During the streaming event, you'll have the opportunity to participate in a live chat with Microsoft cloud security experts.

You will also have access to watch our new on-demand sessions at your own pace, all led by Microsoft security product experts, to gain practical knowledge from these topics:

  • Get started with Azure Sentinel, a cloud-native SIEM.
  • What is cloud-native Azure Network Security?
  • Securing the hybrid cloud with Security Center.
  • What makes IoT Security different?

That’s not all! You will have a chance to win a Microsoft Xbox One S after the event. All you have to do is: watch all the sessions, complete the knowledge check questions on the sweepstakes form, submit – and you are entered!**

Tune in

So mark your calendars today, and we'll see you online on Wednesday June 19th at 10 am PT (1 pm ET).

Ask Us Anything with Azure security experts

Have more questions? The Azure security team will be hosting an ‘Ask Us Anything’ session on Twitter on Monday, June 24, 2019, from 10 am – 11:30 am PT (1 pm – 2:30 pm ET). Members of the product and engineering teams will be available to answer questions about Azure security services.

Post your questions to Twitter by mentioning @AzureSupport and using the hashtag #AzureSecuritySeries.

If there are follow-ups or additional questions that come up after the Twitter session, no problem! We’re happy to continue the dialogue afterwards on Twitter, or you can send your questions to Azuresecurityexpert@microsoft.com.

How do I learn more about Azure security and connect with the tech community?

There are several ways to stay connected and access new executive talks, on-demand sessions, or other types of valuable content covering a range of cloud security topics to help you get started or accelerate your cloud security plan.

  • Watch for content on: Azure Security Expert Series Page.
  • Visit Microsoft Azure for product details.
  • Follow the social channel for Azure security news and updates on Microsoft Azure Twitter.
  • Join our Security Community to connect with the engineering teams, participate in previews and group discussions, and give feedback.
  • Accelerate your knowledge on security capabilities within Azure with hands-on training courses on Microsoft Learn (watch out for new security training sessions in the coming months).
  • Attend Microsoft Ignite for specialized security learning paths to learn from the experts and connect with your peers.

**The Sweepstakes will run exclusively from June 19 to June 26, 11:59 PM Pacific Time. No purchase necessary. To enter, you must be a legal resident of the 50 United States (including the District of Columbia) and be 18 years of age or older. You will need to complete all the Knowledge Check questions in the entry form to qualify for the sweepstakes. Please refer to our official rules for more details.


Why hospitality industry leaders are focusing on employee engagement to enhance the guest experience

Announcing the preview of Microsoft Azure Bastion


For many customers around the world, securely connecting from the outside to workloads and virtual machines on private networks can be challenging. Exposing virtual machines to the public Internet to enable connectivity through Remote Desktop Protocol (RDP) and Secure Shell (SSH) expands your perimeter, rendering your critical networks and attached virtual machines more open and harder to manage.

RDP and SSH are both fundamental ways for customers to connect to their Azure workloads. To connect to their virtual machines, most customers either expose the virtual machines to the public Internet or deploy a bastion host, such as a jump server or jump box.

So today, I'm excited to announce the preview of Azure Bastion.

Azure Bastion is a new managed PaaS service that provides seamless RDP and SSH connectivity to your virtual machines over Secure Sockets Layer (SSL), without exposing any public IPs on your virtual machines. Azure Bastion provisions directly in your Azure Virtual Network, providing a bastion host, or jump server, as a service, with integrated RDP/SSH connectivity to all virtual machines in your virtual network directly from your browser and the Azure portal experience. This can be done with just two clicks and without the need to worry about managing network security policies.

Leading up to the preview, we have worked with hundreds of customers across a wide range of industries. Interest in joining the preview has been immense, and similar to other unique Azure services such as Azure Firewall, the feedback has been very consistent: we need an easy and integrated way to deploy, run, and scale jump servers or bastion hosts within our Azure infrastructure.

For example, a cloud foundation team manager at a German premium car manufacturer told us directly that they had concerns about exposing cloud virtual machines with RDP/SSH ports open to the Internet because of potential security and connectivity issues. During the preview of Azure Bastion, they were able to use RDP/SSH over SSL to their virtual machines, which allowed them to traverse corporate firewalls effortlessly while restricting the Azure Virtual Machines to only private IPs.

Deploying a stand-alone dedicated jump server often entails manually deploying and managing specialized IaaS-based solutions and workloads, such as a Remote Desktop Services (RDS) gateway; configuring and managing authentication, security policies, and access control lists (ACLs); and managing the availability, redundancy, and scalability of the solution. Additionally, monitoring and auditing, along with the ongoing requirement to remain compliant with corporate policies, can quickly make the setup and management of jump servers an involved, costly, and less desirable task.

Azure Bastion is deployed in your virtual network providing RDP/SSH access for all authorized virtual machines connected to the virtual network.

Top-level Azure Bastion architecture

Key features available with the preview include:

  • RDP and SSH from the Azure portal: Initiate RDP and SSH sessions directly in the Azure portal with a single-click seamless experience.
  • Remote session over SSL and firewall traversal for RDP/SSH: HTML5-based web clients are automatically streamed to your local device, providing the RDP/SSH session over SSL on port 443. This allows easy and secure traversal of corporate firewalls.
  • No public IP required on Azure Virtual Machines: Azure Bastion opens the RDP/SSH connection to your Azure virtual machine using a private IP, limiting exposure of your infrastructure to the public Internet.
  • Simplified secure rules management: Simple one-time configuration of Network Security Groups (NSGs) to allow RDP/SSH from only Azure Bastion.
  • Increased protection against port scanning: The limited exposure of virtual machines to the public Internet will help protect against threats, such as external port scanning.
  • Hardening in one place to protect against zero-day exploits: Azure Bastion is a managed service maintained by Microsoft. It is continuously hardened, automatically patched, and kept up to date against known vulnerabilities.

Azure Bastion–The road ahead

Like with all other Azure networking services, we look forward to building out Azure Bastion and adding more great capabilities as we march towards general availability.

The future brings Azure Active Directory integration, adding seamless single sign-on using Azure Active Directory identities and Azure Multi-Factor Authentication, and effectively extending two-factor authentication to your RDP/SSH connections. We are also looking to add support for native RDP/SSH clients so that you can use your favorite client applications to securely connect to your Azure Virtual Machines through Azure Bastion, while at the same time enhancing the auditing experience for RDP sessions with full session video recording.

We encourage you all to try out Azure Bastion, and we look forward to hearing and incorporating your feedback.

Using Azure Search custom skills to create personalized job recommendations


This blog post was co-authored by Kabir Khan, Software Engineer II, Learning Engineering Research and Development.

The Microsoft Worldwide Learning Innovation lab is an idea incubation lab within Microsoft that focuses on developing personalized learning and career experiences. One of the recent experiences that the lab developed focused on offering skills-based personalized job recommendations. Research shows that job search is one of the most stressful times in someone’s life. Everyone remembers at some point looking for their next career move and how stressful it was to find a job that aligns with their various skills.

Harnessing Azure Search custom skills together with our library of technical capabilities, we were able to build a feature that offers personalized job recommendations based on the capabilities identified in a resume. The feature parses a resume to identify technical skills (highlighted and checkmarked in the figure below). It then ranks jobs based on the skills most relevant to the capabilities in the resume. The UI also lets the user see the gaps in their skills (the non-highlighted skills in the figure below) for jobs they’re interested in and work toward building those skills.

Figure one: Worldwide Learning Personalized Jobs Search Demo UI

In this example, our user is interested in transitioning from a Software Engineering role to Program Management. In the image above, you can see that the top jobs for our user are in Program Management, ranked according to the user’s unique capabilities in areas like AI, Machine Learning, and Cloud Computing; the top-ranked job is on the Bing Search and AI team, which deals with all three.

How we used Azure Search

Figure two: Worldwide Learning Personalized Jobs Search Architecture

The architecture diagram above shows the data flow for our application. We started with around 2,000 job openings pulled directly from the Microsoft Careers website as an example. We then indexed these jobs, adding a custom Azure Search cognitive skill to extract capabilities from each job description. This allows a user to search for a job based on a capability like “Machine Learning”. Then, when a user uploads a resume, we store it in Azure Blob storage and run an Azure Search indexer over it. By combining cognitive skills provided by Azure with our custom capability-extraction skill, we end up with a good representation of the user’s capabilities.
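
To give a concrete idea of the indexing step, here is a minimal sketch of attaching a custom Web API skill to a skillset. It is not the team’s actual code: it assumes the Azure.Search.Documents .NET SDK (the original sample may have used the REST API or an earlier SDK), and the service endpoint, API key, skill URI, field names, and the “jobs-skillset” name are all placeholders.

    // Sketch: register a skillset whose custom Web API skill extracts
    // capabilities from each job description during indexing.
    using System;
    using Azure;
    using Azure.Search.Documents.Indexes;
    using Azure.Search.Documents.Indexes.Models;

    var indexerClient = new SearchIndexerClient(
        new Uri("https://<your-search-service>.search.windows.net"),
        new AzureKeyCredential("<admin-api-key>"));

    // Custom skill hosted as a web API, e.g. the open-sourced skills extractor container.
    var extractCapabilities = new WebApiSkill(
        new[] { new InputFieldMappingEntry("text") { Source = "/document/description" } },
        new[] { new OutputFieldMappingEntry("skills") { TargetName = "capabilities" } },
        "https://<your-skill-host>/api/extract-skills")
    {
        Context = "/document"
    };

    var skillset = new SearchIndexerSkillset(
        "jobs-skillset",
        new SearchIndexerSkill[] { extractCapabilities })
    {
        Description = "Extracts technical capabilities from job descriptions"
    };

    await indexerClient.CreateOrUpdateSkillsetAsync(skillset);

An indexer that references this skillset can then map the "capabilities" output into a searchable, filterable field on the jobs index.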

To personalize the job search, we leverage the tag boosting scoring profile built into Azure Search. Tag boosting ranks search results using both the user’s search query and the number of “tags” (in this case, capabilities) that match the target index. So, in our example, we pass the user’s capabilities along with their search query and get back the jobs that best match the user’s unique set of capabilities.
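
As an illustration, here is a hedged sketch of what that can look like, again assuming the Azure.Search.Documents .NET SDK rather than the team’s actual implementation. The index name, field names, keys, and the “capability-boost” profile name are placeholders: a scoring profile boosts jobs whose capabilities field matches tags supplied at query time, and the query passes the user’s extracted capabilities as those tags.

    // Sketch: add a tag-boosting scoring profile, then query with the
    // user's capabilities so matching jobs rank higher.
    using System;
    using Azure;
    using Azure.Search.Documents;
    using Azure.Search.Documents.Indexes;
    using Azure.Search.Documents.Indexes.Models;
    using Azure.Search.Documents.Models;

    var endpoint = new Uri("https://<your-search-service>.search.windows.net");

    // 1. One-time index update: boost by tags held in the 'capabilities' field.
    var indexClient = new SearchIndexClient(endpoint, new AzureKeyCredential("<admin-api-key>"));
    SearchIndex jobsIndex = (await indexClient.GetIndexAsync("jobs")).Value;
    jobsIndex.ScoringProfiles.Add(new ScoringProfile("capability-boost")
    {
        Functions =
        {
            new TagScoringFunction("capabilities", 5, new TagScoringParameters("userCapabilities"))
        }
    });
    await indexClient.CreateOrUpdateIndexAsync(jobsIndex);

    // 2. Per request: pass the user's capabilities as scoring parameters.
    var searchClient = new SearchClient(endpoint, "jobs", new AzureKeyCredential("<query-api-key>"));
    var options = new SearchOptions
    {
        ScoringProfile = "capability-boost",
        ScoringParameters = { "userCapabilities-machine learning,cloud computing,ai" }
    };
    var results = await searchClient.SearchAsync<SearchDocument>("program manager", options);
    await foreach (var hit in results.Value.GetResultsAsync())
    {
        Console.WriteLine(hit.Document["title"]);
    }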

With Azure Search custom skills, our team was able to deliver personalized job search, a capability in high demand among job seekers and recruiters, as a proof of concept. You can follow the same process to achieve the same goal for your own careers site. We open sourced the Skills Extractor Library that we used in this example and made it available as a container.

Before running this sample, please make sure you have the following:

  • Install the Azure CLI. This article requires the Azure CLI version 2.0 or later. Run az --version to find the version you have.
  • You can also use the Azure Cloud Shell.

To learn more about this feature, you can view the live demo (starts at timecode 00:50:00) and read more in our GitHub repository.

Feedback and support

We’re eager to improve, so please take a couple of minutes to answer some questions about your experience using this survey. For support requests, please contact us at WWL_Skills_Service@microsoft.com.


Building hybrid applications with the WebView2 developer preview


Last month at Build, we introduced the new WebView2 coming to Windows, powered by the upcoming Chromium-based Microsoft Edge. Today, we’re releasing a new update to the WebView2 SDK, and with it we’re ready to encourage a broader set of app developers to try the WebView2 preview and give us early feedback.

The WebView2 preview has a limited scope, with support for an initial set of Win32 C++ APIs on Windows 10. Support for other Windows versions (Windows 7+ and Windows Server 2012 R2+) and for UWP/WPF/WinForms will become available in the future.

Learn more about WebView2 in our session from Build 2019: “Moving the web forward with Microsoft Edge.”

With today’s SDK updates, WebView2 addresses a number of popular requests we heard during the initial preview, including support for 32-bit WebView on 64-bit machines, the ability to disable the dev tools, the ability to disable the status bar, and more. Going forward, our initial plan is to update the SDK roughly every six weeks, with our cadence and roadmap primarily driven by your feedback. You can receive information about new releases via our release notes or via the RSS feed of the WebView2 NuGet package.

About the control

The WebView2 control allows you to host web content within your native apps. This hybrid approach lets you share code with similar controls on other platforms or with your websites, inject dynamic content into your native apps, and leverage the rich and growing ecosystem of tools, frameworks, and talent around web technologies, among other benefits.

WebView2 can power a wide spectrum of apps and use cases. For example, we’re working closely with the Office team to bring a WebView2-powered Add-ins experience to future versions of apps like Excel, which will allow add-ins to leverage the full-fidelity Chromium engine that powers Microsoft Edge.

Office add-ins such as this one by Lucidchart will be able to leverage the modern capabilities of Microsoft Edge, enabled by WebView2.

A consistent foundation for web apps on all Windows devices

By building on the same Chromium-based foundation as the next version of Microsoft Edge, the new WebView2 control is compatible with the latest web platform capabilities and broadly interoperable with the web experiences you have already built. WebView2 is by default powered by the always up-to-date Microsoft Edge, so you can build your web content against the latest and most secure platform without worrying about fragmentation across Windows versions, or across your web content running in the browser and in your app.

For developers who need a fully locked-down web platform, we’re working on a bring-your-own mode, which will allow bundling a redistributable version of the browser with a WebView2 app. The redistributable browser powering WebView2 receives no automatic updates; in this case, app developers are responsible for servicing and updating the WebView to receive security updates and new capabilities.

We recognize that many developers have invested in EdgeHTML and MSHTML-based web apps and hybrid apps over time. As we build out WebView2, we expect it to be a compelling successor for these developers, but we hear loud and clear that not every app is ready to move forward. Existing Windows applications built with Windows web technologies, such as EdgeHTML/MSHTML-based WebViews or WWA/HWA/PWA built on top of the UWP platform, will continue to work as-is without modification.

Getting started with the preview

We’re early in our journey to provide a modern, robust WebView for Windows devices, and the future direction of WebView2 will be heavily influenced by your feedback. Visit our documentation to learn more about the developer preview, check out the getting-started tutorial, and share your feedback, suggestions, and details on your scenarios over at our feedback repo.

Have fun building!

Limin Zhu, Program Manager, WebView

The post Building hybrid applications with the WebView2 developer preview appeared first on Microsoft Edge Blog.
