
Microsoft and SWIFT extend partnership to make native payments a reality


This blog post is co-authored by George Zinn, Corporate VP, Microsoft Treasurer.

This week at Sibos, the world’s largest financial services event, Microsoft and SWIFT are showcasing the evolution of the cloud-native proof of concept (POC) announced at last year’s event. Building on the relationship between Microsoft Azure, SWIFT, and the work with Microsoft Treasury, the companies are entering a long-term strategic partnership to bring to market SWIFT Cloud Connect on Azure. Together we have built an end-to-end architecture that utilizes various Azure services to ensure SWIFT Cloud Connect meets the resilience, security, and compliance demands of material workloads in the financial services industry. Microsoft is the first cloud provider working with SWIFT to build public cloud connectivity and will soon make this solution available to the industry.

SWIFT is the world’s leading provider of secure financial messaging services used and trusted by more than 11,000 financial institutions in more than 200 countries and territories. Today, enterprises and banks conduct these transactions by sending payment messages over the highly secure SWIFT network, leveraging on-premises installations of SWIFT technology. SWIFT Cloud Connect creates a bank-like wire transfer experience with the added operational, security, and intelligence benefits the Microsoft cloud offers.

To demonstrate the potential of the production-ready service, Microsoft Treasury has successfully run test payment transactions through the SWIFT production network to their counterparty Bank of New York-Mellon (BNY Mellon) for payment confirmations through SWIFT on Azure. BNY Mellon is a global investments company dedicated to helping its clients manage and service their financial assets throughout the investment lifecycle. The company’s Treasury Services group, which delivers high-quality performance in global payments, trade services and cash management, provides payments services for Microsoft Treasury.

“At BNY Mellon, we focus on delivering world class solutions that exceed our clients’ expectations,” said Bank of New York Mellon Treasury Services CEO Paul Camp. “Together with SWIFT, we continuously work to enhance the payments experience for clients around the world. We’re excited to join now with our Microsoft Treasury client and with SWIFT to help make Cloud Connect real, leveraging Microsoft’s cloud expertise to expand the frontiers of financial technology. Building on the positive experience with Cloud Connect, we look forward to exploring additional opportunities with Microsoft Treasury to advance their digital payments strategy.”

In response to the rapidly evolving cyber threat landscape, SWIFT introduced the Customer Security Program (CSP), a set of mandatory security controls that many financial institutions find challenging to implement in their on-premises environments. To simplify and support control implementation and enable continuous monitoring and audit, Microsoft has developed a blueprint for the CSP framework. Azure Blueprints is a free service that enables customers to define a repeatable set of Azure resources and policies that implement and adhere to standards, patterns, and control requirements. Azure Blueprints allows customers to set up governed Azure environments at scale to aid secure and compliant production implementations. The SWIFT CSP Blueprint is now available in preview.

Microsoft Treasury has performed their testing with SWIFT by leveraging the Azure Logic Apps service to process payment transactions. Such an implementation used to take months; here it was completed in just a few weeks. Treasury integrated their backend SAP systems with SWIFT via Logic Apps to process payment transactions and business acknowledgments. As part of this processing, the transactions are validated and checked for duplicates or anomalies using the rich capabilities of Logic Apps.
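To make the validation step concrete, here is a minimal sketch of the kind of duplicate and anomaly check described above. The field names (`reference`, `amount`) and the anomaly threshold are hypothetical illustrations, not Microsoft Treasury's actual rules or a Logic Apps API.

```python
# Hypothetical sketch of the duplicate/anomaly validation step described
# above; field names and the threshold are illustrative only.

def validate_payments(transactions, anomaly_threshold=1_000_000):
    """Partition payment messages into accepted, duplicate, and anomalous."""
    seen_refs = set()
    accepted, duplicates, anomalies = [], [], []
    for tx in transactions:
        if tx["reference"] in seen_refs:
            duplicates.append(tx)        # same reference seen before
        elif tx["amount"] > anomaly_threshold:
            anomalies.append(tx)         # flag unusually large payments
        else:
            seen_refs.add(tx["reference"])
            accepted.append(tx)
    return accepted, duplicates, anomalies
```

In the real implementation this logic runs inside a Logic Apps workflow rather than custom code, which is a large part of why the integration took weeks instead of months.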

Logic Apps is Microsoft Azure’s integration platform as a service (iPaaS) and now provides native understanding of SWIFT messaging, enabling customers to accelerate the modernization of their payments infrastructure by leveraging the cloud. With hybrid VNet-connected integration capabilities to on-premises applications as well as a wide array of Azure services, Logic Apps provides more than 300 connectors for intelligent automation, integration, data movement, and more to harness the power of Azure.

Microsoft Treasury is able to quickly leverage the power of Azure to enable a seamless transfer of payment transactions. With Azure Monitor and Log Analytics they are also able to monitor, manage, and correlate their payment transactions for full end-to-end process visibility.

We are thrilled to extend our partnership with SWIFT as we believe this will become an integral offering for the industry. We thank BNY Mellon for their part in confirming the potential of SWIFT Cloud Connect. To see it in action, stop by the Microsoft booth in the North Event Hall, Z131.



What’s new in Azure DevOps Sprint 157


Sprint 157 has just finished rolling out to all organizations and you can check out all the new features in the release notes. Here are some of the features that you can start using today.

Azure Boards: rollup

You can now track the progress of parent items using the Rollup feature on the Azure Boards backlog. For instance, you can track the progress on a work item by seeing what percentage of its descendant items have been closed.
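The displayed metric can be sketched as a simple ratio. This mirrors what the backlog column shows; it is not the Azure Boards API, and the state names are illustrative:

```python
# Illustrative computation of the rollup progress shown on the backlog:
# the percentage of a parent work item's descendants that are closed.

def rollup_progress(descendant_states):
    """Return percent of descendant work items in the 'Closed' state."""
    if not descendant_states:
        return 0.0
    closed = sum(1 for state in descendant_states if state == "Closed")
    return 100.0 * closed / len(descendant_states)
```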

Azure Repos: add more than one reviewer in a group

When adding a group of reviewers for a pull request, you can now set policies that require more than one reviewer from that group to approve it. You can also add a policy to prevent requestors from approving their own changes.

Azure Repos: resolve work items via commits using keywords

You can now resolve work items via commits made to the default branch by using keywords like fix, fixes, or fixed. For example, you can write "this change fixed #476" in your commit message, and work item #476 will be completed when the commit is pushed or merged into the default branch.
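The keyword matching can be sketched with a regular expression. Azure Repos does this server-side; the pattern below is only an illustration of the behavior described, not the service's actual implementation:

```python
import re

# Sketch of the keyword matching described above: find work item IDs
# mentioned after "fix", "fixes", or "fixed" in a commit message.
KEYWORD_PATTERN = re.compile(r"\b(?:fix|fixes|fixed)\b\s+#(\d+)", re.IGNORECASE)

def resolved_work_items(commit_message):
    """Return the work item IDs a commit message would resolve."""
    return [int(item_id) for item_id in KEYWORD_PATTERN.findall(commit_message)]
```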

Azure Artifacts: get latest Universal Package with semantic versioning wildcard

Inspired by a popular developer community request, we added the ability to get the latest Universal Package version using SemVer wildcard syntax, instead of pinning the specific version. The feature is available in both the Universal Package Azure Pipelines task and the Azure CLI.
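Wildcard resolution of this kind can be sketched as follows. The real resolution happens inside the Universal Package Azure Pipelines task and the Azure CLI; this hypothetical helper only models the idea of matching a pattern like `1.*` against published versions and picking the highest:

```python
# Hypothetical sketch of resolving a SemVer wildcard (e.g., "1.*") to the
# latest matching version; not the actual Azure Artifacts implementation.

def resolve_wildcard(pattern, versions):
    """Pick the highest version matching a pattern such as '1.*' or '1.2.*'."""
    prefix = pattern.rstrip("*").rstrip(".")
    parts = tuple(int(p) for p in prefix.split(".")) if prefix else ()

    def key(version):
        return tuple(int(p) for p in version.split("."))

    matching = [v for v in versions if key(v)[:len(parts)] == parts]
    return max(matching, key=key) if matching else None
```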

Azure Pipelines: service hooks for YAML pipelines

Service hooks now have deeper integration with multi-stage YAML pipelines, allowing you to set up triggers for pipeline stage-specific states. You can, for instance, trigger an alert when a stage approval is required.

General: Get personalized notifications with @mention support in Slack apps

Now, the Azure Boards, Azure Repos, and Azure Pipelines apps for Slack will use @mentions in notifications to alert users, such as work item owners, build creators, and release approvers.

These are just the tip of the iceberg, and there are plenty more features that we’ve released in Sprint 157. Check out the full list of features for this sprint in the release notes.

The post What’s new in Azure DevOps Sprint 157 appeared first on Azure DevOps Blog.

Azure Sentinel general availability: A modern SIEM reimagined in the cloud


Earlier this week, we announced that Azure Sentinel is now generally available. This marks an important milestone in our journey to redefine Security Information and Event Management (SIEM) for the cloud era. With Azure Sentinel, enterprises worldwide can now keep pace with the exponential growth in security data, improve security outcomes without adding analyst resources, and reduce hardware and operational costs.

With the help of customers and partners, including feedback from over 12,000 trials during the preview, we have designed Azure Sentinel to bring together the power of Azure and AI to enable Security Operations Centers to achieve more. There are lots of new capabilities coming online this week. I’ll walk you through several of them here.

Collect and analyze nearly limitless volume of security data

With Azure Sentinel, we are on a mission to improve security for the whole enterprise. Many Microsoft and non-Microsoft data sources are built right in and can be enabled in a single click. New connectors for Microsoft services like Cloud App Security and Information Protection join a growing list of third-party connectors to make it easier than ever to ingest and analyze data from across your digital estate.

Workbooks offer rich visualization options for gaining insights into your data. Use or modify an existing workbook or create your own.


Apply analytics, including Machine Learning, to detect threats

You can now choose from more than 100 built-in alert rules or use the new alert wizard to create your own. Alerts can be triggered by a single event, by crossing a threshold, by correlating different datasets (for example, events that match threat indicators), or by built-in machine learning algorithms.
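The threshold-style rule can be sketched as counting matching events in a window. Real Azure Sentinel rules are expressed as KQL queries over ingested logs; this hypothetical snippet only models the trigger logic:

```python
# Illustrative sketch of a threshold-style alert rule: fire when the
# number of events satisfying a predicate reaches a limit. Not Sentinel's
# actual implementation, which runs as KQL queries in the service.

def threshold_alert(events, predicate, threshold):
    """Return True when at least `threshold` events satisfy the predicate."""
    matches = sum(1 for event in events if predicate(event))
    return matches >= threshold
```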


We’re previewing two new Machine Learning approaches that offer customers the benefits of AI without the complexity. First, we apply proven off-the-shelf Machine Learning models for identifying suspicious logins across Microsoft identity services to discover malicious SSH access. By using transfer learning from existing Machine Learning models, Azure Sentinel can detect anomalies from a single dataset with high accuracy. In addition, we use a Machine Learning technique called fusion to connect data from multiple sources, such as Azure AD anomalous logins and suspicious Office 365 activities, to detect 35 different threats that span different points on the kill chain.

Expedite threat hunting, incident investigation, and response

Proactive threat hunting is a critical yet time-consuming task for Security Operations Centers. Azure Sentinel makes hunting easier with a rich hunting interface that features a growing collection of hunting queries, exploratory queries, and Python libraries for use in Jupyter Notebooks. Use these to identify events of interest and bookmark them for later reference.


Incidents (formerly cases) contain one or more alerts that require further investigation. Incidents now support tagging, comments, and assignments. A new rules wizard allows you to decide which Microsoft alerts trigger the creation of incidents.


Using the new investigation graph preview, you can visualize and traverse the connections between entities like users, assets, applications, or URLs and related activities like logins, data transfers, or application usage to rapidly understand the scope and impact of an incident.


New actions and playbooks simplify the process of incident automation and remediation using Azure Logic Apps. Send an email to validate a user action, enrich an incident with geolocation data, block a suspicious user, and isolate a Windows machine.


Build on the expertise of Microsoft and community members

The Azure Sentinel GitHub repository has grown to over 400 detection, exploratory, and hunting queries, plus Azure Notebooks samples and related Python libraries, playbooks samples, and parsers. The bulk of these were developed by our MSTIC security researchers based on their vast global security experience and threat intelligence.


Support managed Security Services Providers and complex customer instances

Azure Sentinel now works with Azure Lighthouse, empowering customers and managed security services providers (MSSPs) to view Azure Sentinel for multiple tenants without the need to navigate between tenants. We have worked closely with our partners to jointly develop a solution that addresses their requirements for a modern SIEM. 

DXC Technology, one of the largest global MSSPs, is a great example of this design partnership:

“Through our strategic partnership with Microsoft, and as a member of the Microsoft Security Partner Advisory Council, DXC will integrate and deploy Azure Sentinel into the cyber defense solutions and intelligent security operations we deliver to our clients,” said Mark Hughes, senior vice president and general manager, Security, DXC. “Our integrated solution leverages the cloud native capabilities and assets of Azure Sentinel to orchestrate and automate large volumes of security incidents, enabling our security experts to focus on the forensic investigation of high priority incidents and threats.”

Get started

It really is easy to get started. We have a lot of information available to help you, from great documentation to connecting with us via Yammer and e-mail.

Please join us for a webinar on Thursday, September 26 at 10:00 AM Pacific Time to learn more about these innovations and see real-life examples of how Azure Sentinel helped detect previously undiscovered threats.

What’s next

Azure Sentinel is our SOC platform for the future, and we will continue to evolve it to better meet the security needs of the complex world we live in. Let’s stay in touch:

Stay on top of best practices with Azure Advisor alerts


To get the most out of your Azure investment and run as efficiently as possible, we recommend that you regularly review and optimize your resources for high availability, security, performance, and cost. That’s why we created Azure Advisor, a free Azure service that helps you quickly and easily optimize your Azure resources with personalized recommendations based on your usage and configurations.

But with so many priorities vying for your attention, it can be easy to miss remediating your Advisor recommendations. So, what’s a good way to stay on top of these critical optimizations that can save you money, boost performance, strengthen your security posture, and increase uptime?

Get notified about new recommendations with Advisor alerts

Advisor now offers user-configurable alerts so you can get automatically notified as soon as your best practice recommendations become available. Advisor alerts will allow you to act more quickly and efficiently to optimize your Azure resources and stay on top of your new recommendations.

An image of the Microsoft Azure Advisor Alerts (preview) page.
You can configure these alerts to be triggered based on several factors:

  • Recommendation category – high availability, performance, or cost.
  • Business impact – high, medium, or low.
  • Recommendation type – for example, right-size or shut down underutilized virtual machines (VMs), enable VM backup, or use availability sets to improve fault tolerance.

You can also choose from a wide range of notification options, including email, SMS, push notification, webhook, IT service management integration with popular tools like ServiceNow, Automation runbooks, and more. Your notification preferences are configured using action groups, so you can repurpose any action groups you’ve already set up, such as those for your custom Azure Monitor alerts or Azure Service Health alerts.

Best practices for your Advisor alerts

As you get started with Advisor alerts, we have three tips for you.

First, start simple by choosing a few high impact recommendations that are important to your organization, based on your business goals and priorities. For example, you might have a leadership mandate to reduce costs by a certain percentage, in which case you might decide that “Right-size or shutdown underutilized VMs” is a critical recommendation for you. Then create an alert for that set of recommendations. You can always change your alert or add more later.
 

An image of the Microsoft Azure Advisor alert creation page.
Second, consider who is right person to notify about new recommendations and the best way to notify them. It’s best to notify the individual or team who has the permission and authority to remediate the recommendation, to streamline the process. In keeping with the “start simple” principle, you may wish to begin with email notifications, which are the most basic to configure and the least intrusive to receive. Again, you can always modify your preferences later.

Finally, once you’ve tackled the first two tips and are comfortable with Advisor alerts, start to explore automation scenarios. For example, you can automatically route a new best practice recommendation through your ticketing system and assign it to the right team for remediation. In some cases, you can even use a combination of Advisor alerts and Automation runbooks to automatically remediate the recommendation.

Get started with Advisor alerts

Visit Advisor in the Azure portal to review your recommendations and start setting up your Advisor alerts. For more in-depth guidance, visit the Advisor documentation. Let us know if you have a suggestion for Advisor by submitting an idea in our forums.

Announcing Azure Storage Explorer 1.10.0


This month we released a new version of Azure Storage Explorer, 1.10.0. This latest version of Storage Explorer introduces several exciting new features and delivers significant updates to existing functionality. These features and changes are all designed to make users more efficient and productive when working with Azure Storage, Cosmos DB, ADLS Gen2, and, starting with 1.10.0, managed disks. If you’ve never used Storage Explorer before, you can download it for Windows, macOS, or Linux from the product page.

Storage Explorer adds support for managed disks

One of the most challenging parts of migrating on-premises virtual machines (VMs) to Azure is moving the data for these VMs into Azure. Storage Explorer 1.10.0 makes this process much easier by adding support for managed disks. The new features we’ve added for managed disks let you create and manage VM disks using the easy-to-use Storage Explorer GUI. Using Storage Explorer also gives you an incredibly performant workflow: when you upload a VHD to a managed disk, Storage Explorer leverages the power and speed of AzCopy v10 to quickly get your data into Azure. Storage Explorer’s support for managed disks also includes the ability to create snapshots of, copy, download, and delete your managed disks. You can learn more about the latest disk support capabilities on our recent blog.

Storage Explorer introduces new user settings

Ever since Storage Explorer was first released, users have asked for a variety of settings that would allow them to configure how Storage Explorer behaves. As more settings have been added though, managing and discovering these settings has proved increasingly difficult. To help alleviate those problems, we are excited to introduce a centralized settings user interface (UI). From this UI, you can configure many of Storage Explorer’s existing settings, such as proxy and application theme. We’ve also added settings which allow you to log out on exit and to toggle the refresh mode of the data explorers.

We have a long list of user requested settings in our backlog which will make their way to the settings UI in future updates. And if you have a suggestion for a setting you’d like to see, feel free to let us know by opening an issue at our GitHub repo.

Storage Explorer now available on the Snap Store

The last major change we’d like to highlight for 1.10.0 is the addition of Storage Explorer to the Canonical Snap Store. Installing Storage Explorer on Linux has always been a challenge for users, but when you install from the Snap Store things become as easy as installing on any other platform. The Snap platform will install all dependencies for you, and help you keep Storage Explorer up to date and secure. If you’d like to install Storage Explorer from the Snap Store, you can find it listed on the store.

Looking forward

Over the coming months, we have plans to add even more new features and capabilities to Storage Explorer. In the near future, we will be making AzCopy the default transfer engine for all Blob transfers, and we’ll start work on using AzCopy for File Shares. We’ve also been hard at work localizing Storage Explorer into additional languages so more people all over the world can effectively use the product. We’re going to improve on and bring additional features to ADLS Gen2, including enhanced ACL management and increased parity with Blob features. And of course, we’ll be looking at GitHub for any user requests for new features, so if there’s something you would like to see then we highly encourage you to open an issue.

Install Storage Explorer now

Download Storage Explorer 1.10.0 today to take advantage of all of these new features. If you have any feedback, please make sure to open a new issue on our GitHub repo. If you are experiencing difficulties using the product, please open a support ticket following these instructions.

SAP on Azure Architecture – Designing for performance and scalability


This is the second in a four-part blog series on designing a SAP on Azure architecture. In the first part of our blog series we covered designing for security. Robust SAP on Azure architectures are built on the pillars of security; performance and scalability; availability and recoverability; and efficiency and operations. This blog will focus on designing for performance and scalability.

Microsoft support in network and storage for SAP

Microsoft Azure is the eminent public cloud for running SAP applications. Mission critical SAP applications run reliably on Azure, which is a hyperscale, enterprise proven platform offering scale, agility, and cost savings for your SAP estate.

With the largest portfolio of SAP HANA certified IaaS cloud offerings customers can run their SAP HANA Production scale-up applications on certified virtual machines ranging from 192GB to 6TB of memory. Additionally, for SAP HANA scale-out applications such as BW on HANA and BW/4HANA, Azure supports virtual machines of 2TB memory and up to 16 nodes, for a total of up to 32TB. For customers that require extreme scale today, Azure offers bare-metal HANA large instances for SAP HANA scale-up to 20TB (24TB with TDIv5) and SAP HANA scale-out to 60TB (120TB with TDIv5).

Our customers such as CONA Services are running some of the largest SAP HANA workloads of any public cloud with a 28TB SAP HANA scale out implementation. 

Designing for performance

Performance is a key driver for digitizing business processes and accelerating digital transformation. Production SAP applications such as SAP ERP or S/4HANA need to be performant to maximize efficiency and ensure a positive end-user experience. As such, it is essential to perform a detailed sizing exercise on compute, storage and network for your SAP applications on Azure.

Designing compute for performance

In general, there are two ways to determine the proper size of SAP systems to be implemented in Azure: reference sizing or the SAP Quick Sizer.

For existing on-premises systems, you should reference system configuration and resource utilization data. The system utilization information is collected by the SAP OS Collector and can be reported via SAP transaction OS07N as well as the EarlyWatch Alert. Similar information can be retrieved by leveraging any system performance and statistics gathering tools. For new systems, you should use the SAP Quick Sizer.

Within the links below you can also find the network and storage throughput per Azure virtual machine type:

Designing highly performant storage

In addition to selecting an appropriate database virtual machine based on the SAPS and memory requirements, it is important to ensure that the storage configuration is designed to meet the IOPS and throughput requirements of the SAP database. Be mindful that the chosen virtual machine has the capability to drive the required IOPS and throughput. Azure premium managed disks can be striped to aggregate IOPS and throughput values; for example, 5 x P30 disks would offer 25K IOPS and 1,000 MB/s throughput.
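The striping math is a straightforward aggregation, sketched below. The per-disk figures used in the test are the published P30 limits (5,000 IOPS and 200 MB/s per disk); always confirm current limits against the Azure managed disk documentation:

```python
# Back-of-the-envelope check of the striping math above: striping N
# identical premium disks aggregates their per-disk IOPS and MB/s limits.

def striped_limits(disk_count, iops_per_disk, mbps_per_disk):
    """Aggregate IOPS and throughput (MB/s) for a striped set of disks."""
    return disk_count * iops_per_disk, disk_count * mbps_per_disk
```

Note that the virtual machine's own IOPS and throughput caps still apply on top of the aggregated disk limits.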

In the case of SAP HANA databases, we have published a storage configuration guideline covering production scenarios and also a cost-conscious non-production variant. Following our recommendation for production will ensure that the storage is configured to successfully pass all SAP HCMT KPIs. It is imperative to enable Write Accelerator on the disks associated with the /hana/log volume, as this facilitates sub-millisecond write latency for 4KB and 16KB block sizes.

Azure ultra disks are designed to deliver consistent performance and low latency for I/O-intensive workloads such as SAP HANA and any database (SQL Server, Oracle, etc.). With ultra disks you can reach maximum virtual machine I/O limits with a single ultra disk, without having to stripe multiple disks as is required with premium disks.

As of September 2019, Azure Ultra Disk Storage is generally available in the East US 2, Southeast Asia, and North Europe regions, and is supported on DSv3 and ESv3 VM types. Refer to the FAQ for the latest on supported VM sizes for both Windows and Linux OS hosts. This video demonstrates the leading performance of Ultra Disk Storage.

Designing network for performance

As the Azure footprint grows, a single availability zone may span multiple physical data centers, which can result in network latency impacting your SAP application performance. A proximity placement group (PPG) is a logical grouping that ensures Azure compute resources are physically located close to each other, achieving the lowest possible network latency, that is, co-location of your SAP application and database VMs. For more information, refer to our detailed documentation for deploying your SAP application with PPGs.

We recommend you consider PPGs within your SAP deployment architecture and that you enable Accelerated Networking on your SAP application and database VMs. Accelerated Networking enables single root I/O virtualization (SR-IOV) for your virtual machine, which improves networking performance by bypassing the host in the data path. SAP application server to database server latency can be tested with ABAPMeter report /SSA/CAT.

ExpressRoute Global Reach allows you to link ExpressRoute circuits from on-premises to Azure in different regions together to make a private network between your on-premises networks. Global Reach can be used for your SAP HANA Large Instance deployment to enable direct access from on-premises to your HANA Large Instance units deployed in different regions. Additionally, Global Reach can enable direct communication between your HANA Large Instance units deployed in different regions.

Designing for scalability

With Azure Mv2 VMs, you can scale up to 208 vCPUs and 6 TB of memory today, with 12 TB coming shortly. For databases that require more than 12 TB, we offer SAP HANA Large Instances (HLI), a purpose-built bare-metal offering dedicated to you. The server hardware is embedded in larger stamps that contain HANA TDI-certified compute, network, and storage infrastructure, in various sizes from 36 Intel CPU cores and 768 GB of memory up to a maximum of 480 Intel CPU cores and 24 TB of memory.

Azure global regions at HyperScale

Azure has more global regions than any other cloud provider, offering the scale needed to bring applications closer to users around the world, preserving data residency, and offering comprehensive compliance and resiliency options for customers. As of Q3 2019, Azure spans a total of 54 regions and is available in 140 countries.

Customers like the Carlsberg Group, transformed IT into a platform for innovation through a migration to Azure centered on its essential SAP applications. The Carlsberg migration to Azure encompassed 700 servers and 350 applications—including the essential SAP applications—involving 1.6 petabytes of data, including 8 terabytes for the main SAP database.

Within this blog we have touched upon several topics relating to designing highly performant and scalable architectures for SAP on Azure.
As customers embark on their SAP to Azure journey, in order to methodically deploy highly performant and scalable architectures during the various phases of deployment, we recommend a deep dive into the SAP on Azure documentation to deepen understanding of using Azure for hosting and running SAP applications. The SAP workload on Azure planning and deployment checklist can be used as a compass to navigate the various phases of a customer’s SAP Greenfield deployment or on-premises to Azure migration project.

In blog #3 in our series we will cover Designing for Availability and Recoverability.

Watch the latest Visual Studio extensibility videos


We have been posting several short videos about Visual Studio extensibility to our YouTube channel in the past couple of months. We chose the topics for the first videos, but now it’s time for you to tell us what videos to record next.

The idea behind the short videos is to introduce extensibility concepts you may find useful. Perhaps you’ll even go ahead and implement some of them in your own extensions.

Often the concepts aren’t new but relatively unknown to Visual Studio extension authors. There’s always something to learn and we’ve found that short videos are a great way to show and tell.

If you haven’t yet, make sure to check out the first batch of extensibility-related videos, which include:

Find these and more videos on the Visual Studio YouTube channel in the Tips & Tricks playlist.

What videos should we do next?

We’d love to make more videos about Visual Studio extensibility and could use your help to come up with the right topics. So, if you want us to make a video covering any specific topic, let us know in the comments below. Please remember that the topic should fit a 5-minute video format.

The post Watch the latest Visual Studio extensibility videos appeared first on Visual Studio Blog.


Azure DevOps Demo Generator is now open source


The Azure DevOps Demo Generator is a community operated service which can provision template-based projects inside your Azure DevOps organization. It was originally created for Microsoft as an educational resource supporting the Azure DevOps Labs. Although it started as a way to populate lab projects, it has evolved to support integrations with various third-party DevOps tools including GitHub, Jenkins, Terraform, and many others.

Azure DevOps Demo Generator, select template

Over the years since its introduction, the generator has also been used to support new offerings from Microsoft Learning and more recently the Microsoft Cloud Adoption Framework. Various other teams inside Microsoft also use the generator for custom training and hackathons.

A common customer request has been for more real-world examples of automating Azure DevOps using REST APIs. To help with this, today we have published the source code for Azure DevOps Demo Generator on GitHub under the MIT License.

Another common request from customers, partners and trainers is the ability to create custom templates for provisioning their own training environments. Today we are also making Custom Templates available in the Azure DevOps Demo Generator service.

Azure DevOps Demo Generator - Extract Template

You can find more information on custom templates here on GitHub. You are welcome to create and use custom templates for your own purposes, such as training courses, challenges, and hackathons.

Once the templates have been created, they can be provisioned by uploading them from the Private tab when choosing a template.

NOTE: Custom templates are not designed for “project copy” or “migration.” The generator only supports a subset of Azure DevOps artifact types and does not scale to production-level projects (e.g., large numbers of work items).

Now that the source for the Azure DevOps Demo Generator is public, you can fork it and use it to learn the Azure DevOps APIs or as a foundation for your own Azure DevOps automation tools. We look forward to your feedback on this community-operated service as GitHub issues, and to your contributions as pull requests. Please refer to the project contribution guidance for more information.

The post Azure DevOps Demo Generator is now open source appeared first on Azure DevOps Blog.

.NET Framework September 2019 Preview of Quality Rollup


We have released the September 2019 Preview of Quality Rollup and Cumulative Updates for .NET Framework for Windows 10.

Quality and Reliability

This release contains the following quality and reliability improvements.

BCL1

  • Addresses an issue that affects thread contention that occurs in BinaryFormatter.GetTypeInformation by using ConcurrentDictionary to handle multi-thread access.
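The BCL fix above replaces contended access with a concurrent map. The underlying get-or-add caching pattern can be sketched in Python; since Python has no ConcurrentDictionary, a lock-guarded dict stands in for it here:

```python
import threading


class TypeInfoCache:
    """Thread-safe memoization of an expensive lookup.

    A plain dict guarded by a lock plays the role that
    ConcurrentDictionary.GetOrAdd plays in the .NET fix.
    """

    def __init__(self, compute):
        self._compute = compute
        self._cache = {}
        self._lock = threading.Lock()

    def get(self, key):
        # Fast path: plain dict reads are safe under the GIL.
        try:
            return self._cache[key]
        except KeyError:
            pass
        value = self._compute(key)
        with self._lock:
            # setdefault means the first writer wins, so every
            # caller sees one consistent cached value per key.
            return self._cache.setdefault(key, value)
```

The win is the same as in the BinaryFormatter fix: threads only serialize on the brief insert, not on every lookup.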

CLR2

  • Addresses an issue that might cause handle leaks to occur in scenarios that repeatedly load and unload Mscoree.dll.
  • Addresses rare cases that incorrectly cause process termination instead of delivering the expected NullReferenceException result.

WPF3

  • Addresses an issue that affects a WPF ComboBox (or any Selector) within a DataGrid cell that can try to change its selection properties (SelectedIndex, SelectedItem, SelectedValue) when the cell’s data item is re-virtualized or removed from the underlying collection. This can occur if the Selector’s ItemsSource property is data-bound through the cell’s DataContext setting. Depending on the virtualization mode and the bindings that are declared for the selection properties, the symptoms can include unexpected changes (to null) of the data item’s properties, and unexpected displays (as null) of other data items that re-use the UI that was previously attached to the re-virtualized item.
  • Addresses an issue in which a WPF TextBox or RichTextBox element that has spell checking enabled crashes and returns an “ExecutionEngineException” error in some situations, including inserting text near a hyperlink.
  • Addresses an issue that affects Per-Monitor Aware WPF applications that host System-Aware or Unaware child windows and that run on .NET Framework 4.8. This .NET version occasionally crashes and returns a “System.Collections.Generic.KeyNotFoundException” exception.

1 Base Class Library (BCL)
2 Common Language Runtime (CLR)
3 Windows Presentation Foundation (WPF)

Getting the Update

The Preview of Quality Rollup is available via Windows Update, Windows Server Update Services, and Microsoft Update Catalog.

Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework 4.8 updates are available via Windows Update, Windows Server Update Services, and the Microsoft Update Catalog. Updates for other versions of .NET Framework are part of the Windows 10 Monthly Cumulative Update.

Note: Customers who rely on Windows Update and Windows Server Update Services will automatically receive the .NET Framework version-specific updates. Advanced system administrators can also use the direct Microsoft Update Catalog download links below for .NET Framework-specific updates. Before applying these updates, please carefully review the .NET Framework version applicability to ensure that you only install updates on systems where they apply. The following table is for Windows 10 and Windows Server 2016+ versions.

Product Version Cumulative Update
Windows 10 1903 and Windows Server, version 1903 Catalog 4522738
.NET Framework 3.5, 4.8 Catalog 4515871
Windows 10 1809 (October 2018 Update) Windows Server 2019 Catalog 4516550
.NET Framework 3.5, 4.7.2 Catalog 4515855
.NET Framework 3.5, 4.8 Catalog 4515843
Windows 10 1803 (April 2018 Update)    
.NET Framework 3.5, 4.7.2 Catalog 4516045
.NET Framework 4.8 Catalog 4515842
Windows 10 1709 (Fall Creators Update)
.NET Framework 3.5, 4.7.1, 4.7.2 Catalog 4516071
.NET Framework 4.8 Catalog 4515841
Windows 10 1703 (Creators Update)  
.NET Framework 3.5, 4.7, 4.7.1, 4.7.2 Catalog 4516059
.NET Framework 4.8 Catalog 4515840
Windows 10 1607 (Anniversary Update) Windows Server 2016  
.NET Framework 3.5, 4.6.2, 4.7, 4.7.1, 4.7.2 Catalog 4516061
.NET Framework 4.8 Catalog 4515839

 

The following table is for earlier Windows and Windows Server versions.

Product Version Preview of Quality Rollup
Windows 8.1, Windows RT 8.1 and Windows Server 2012 R2 Catalog 4516553
.NET Framework 3.5 Catalog 4507005
.NET Framework 4.5.2 Catalog 4506999
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 Catalog 4515853
.NET Framework 4.8 Catalog 4515846
Windows Server 2012 Catalog 4516552
.NET Framework 3.5 Catalog 4507002
.NET Framework 4.5.2 Catalog 4507000
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 Catalog 4515852
.NET Framework 4.8 Catalog 4515845
Windows 7 SP1 Windows Server 2008 R2 SP1 Catalog 4516551
.NET Framework 3.5.1 Catalog 4507004
.NET Framework 4.5.2 Catalog 4507001
.NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2 Catalog 4515854
.NET Framework 4.8 Catalog 4515847
Windows Server 2008 Catalog 4516554
.NET Framework 2.0, 3.0 Catalog 4507003
.NET Framework 4.5.2 Catalog 4507001
.NET Framework 4.6 Catalog 4515854

 

Previous Monthly Rollups

The last few .NET Framework Monthly updates are listed below for your convenience:

The post .NET Framework September 2019 Preview of Quality Rollup appeared first on .NET Blog.

Handling dates and times in R: a free online course


If you ever need to work with data involving dates, times or durations in R, then take a look at this free course on LinkedIn Learning presented by Mark Niemann-Ross: R Programming in Data Science: Dates and Times. Here's the course overview from the introductory video:

When did Mount St. Helens last erupt? How many birds migrated south this month? Which month last year was the coldest on average? These are questions related to time. And if you're going to perform calculations on this type of data, then you need date and time data structures.

I'll show you how to store dates and times, how to retrieve dates and times, and how to add and subtract dates and times to discover the duration of an event or the difference between two dates. I'll show how to handle dates for business research and for financial forecasting. We'll explore functions included with base R for date and time calculations, and we'll explore packages that provide special ways to work with dates and times.

The course does assume some basic familiarity with R syntax, but does start from the beginning with the base R date and time classes before moving on to the newer Tidyverse packages for handling date and time data. Each chapter also includes a quiz to test your knowledge.

The course content is free to view, and all the R scripts associated with the course are available to download as well. 

LinkedIn Learning: R Programming in Data Science: Dates and Times

Top Stories from the Microsoft DevOps Community – 2019.09.27


Whether or not you are an optimist, many jobs in technology teach you to anticipate and protect against the worst-case scenario. In this week’s community stories, we learn more about the importance of automated testing and the perils of edge cases in Site Reliability Engineering. Let’s get started!

Terraform on Microsoft Azure – Part 6: Continuous Integration using Docker and Azure Pipeline
Terraform is rapidly growing in popularity, and we’ve featured Terraform integrations a number of times in this newsletter. Testing your infrastructure-as-code, however, is still relatively difficult. In this blog post (part of a series), Julien Corioland walks us through the process of testing Terraform code using Terratest in Azure Pipelines. Thank you Julien!

Cypress in Azure DevOps Pipeline for Fast, Easy and Reliable Test Automation
Speaking of testing frameworks, this article features the integration between Azure Pipelines and Cypress, a JavaScript-based testing framework providing an end-to-end testing experience for modern web applications. It is certainly important to not just deliver great features, but also make sure that your UX is working as expected!

Developing a PySpark Application – Creating a CI Build
While we are on the topic of Continuous Integration, we might as well discuss a third, very different programming language and use case. In this post (part of a series), Simon D’Morias walks through configuring CI for a PySpark application built for Databricks, using Azure Pipelines. Thank you Simon for sharing your expertise!

Global DevOps Bootcamp 2019 – Keynote
I was aware that a 2012 Azure outage was due to a certificate issue, but I never knew it was caused by an edge case! In this recording of the keynote from the Global DevOps Bootcamp 2019, Niall Murphy, a leading site reliability engineer at Microsoft and co-author of The Site Reliability Workbook, talks about the professional pessimism of the SRE and everything that can go wrong when running production applications in distributed computing environments, backed by examples of some astonishing outages!

Azure DevOps Demo Generator is now open source
Typically, we don’t feature our own blog on this newsletter, but I had to make an exception for this one. The Azure DevOps Demo Generator is now Open Source! This project has been an example of the power of cross-team collaboration and volunteer contribution from its very beginning. It has enabled a quicker path to learning and implementing POCs of common use cases of Azure DevOps, simplifying the onboarding experience. And now it is going to allow for an even broader impact, as it can be leveraged by organizations outside Microsoft. A huge thank you to my teammates – Dave McKinstry, for all the hard work that went into the open-sourcing, and Sachin Raj, for making the demo generator happen!

If you’ve written an article about Azure DevOps or find some great content about DevOps on Azure, please share it with the #AzureDevOps hashtag on Twitter!

The post Top Stories from the Microsoft DevOps Community – 2019.09.27 appeared first on Azure DevOps Blog.

How to download over 80 free 101-level C#, .NET, and ASP.NET for beginners videos for offline viewing


Earlier this week I announced over 80 new free videos in our .NET Core 3.0 launch video series - Announcing free C#, .NET, and ASP.NET for beginners video courses and tutorials

Three questions came up consistently:

  • My work or country blocks YouTube! What about me?
  • How can I download these and watch them offline?
  • I have very low bandwidth. Can I get smaller versions I can download over 3G/4G?

Here's some answers for you!

My work or country blocks YouTube! What about me?

First, we have updated http://dot.net/videos to include links to BOTH YouTube *and* to the same videos hosted on Microsoft's Channel 9, which shouldn't be blocked by your country or company!

How can I download these and watch them offline?

Good question! Here's how to download a whole series with PowerShell! Let's say I want to download "C# 101."

First, head over to https://dot.net/videos.

C# 101 videos

Second, click "Watch on Channel 9" there at the bottom of the series you want, in my case, C# 101.

Note that there's a link there in the corner that says "RSS" - that's Really Simple Syndication!

RSS Videos for Channel 9 Videos

Right click on the one you want, for example MP4 Low for people on low bandwidth connections! (Question #3 gets answered too, two for one!), and say "Copy Link Address." Now that link is in your clipboard!

Next, I made a little PowerShell script and put it here in a Gist. If you want, you can right click on this link here and Save Link As and name it something like DownloadVideos.ps1. Maybe save it in C:\temp or C:\Users\YOURNAME\Desktop\DotNetVideos. Whatever makes you happy. Make sure you saved it with a *.ps1 extension.

Finally, open up the PS1 file in a text editor and check lines 2 and 3. Put in a path that's correct for YOUR computer, again, like C:\temp, or your downloads folder.

#CHECK THE PATH ON LINE 2 and the FEED on LINE 3

cd "C:\Users\scott\Downloads"
$a = ([xml](new-object net.webclient).downloadstring("https://channel9.msdn.com/Series/CSharp-101/feed/mp4"))
$a.rss.channel.item | foreach{
$url = New-Object System.Uri($_.enclosure.url)
$file = $url.Segments[-1]
$file
if (!(test-path $file)) {
(New-Object System.Net.WebClient).DownloadFile($url, $file)
}
}

Make sure the RSS link that you copied earlier above is correct on line 3. We need a Local Folder and we need our Remote RSS Link.

NOTE: If you're an expert you might think this PowerShell script isn't fancy enough or doesn't do x or y, but it'll do pretty nicely for this project. You're welcome to fork it or improve it here.
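If PowerShell isn't an option on your machine, the same feed-driven download can be sketched in Python with only the standard library. The feed URL and folder below are the same placeholders the PowerShell script uses; substitute your own.

```python
import os
import urllib.request
import xml.etree.ElementTree as ET


def enclosure_urls(feed_xml: str) -> list:
    """Extract the enclosure (video file) URLs from an RSS feed document."""
    root = ET.fromstring(feed_xml)
    return [enc.get("url")
            for enc in root.iterfind("./channel/item/enclosure")
            if enc.get("url")]


def download_all(feed_url: str, folder: str) -> None:
    """Download every enclosure in the feed, skipping files already present."""
    feed = urllib.request.urlopen(feed_url).read().decode("utf-8")
    for url in enclosure_urls(feed):
        target = os.path.join(folder, url.rstrip("/").rsplit("/", 1)[-1])
        if not os.path.exists(target):
            urllib.request.urlretrieve(url, target)


# Example (performs real downloads; point it at the feed URL you copied):
#   download_all("https://channel9.msdn.com/Series/CSharp-101/feed/mp4", ".")
```

Like the PowerShell version, it checks whether a file already exists before downloading, so you can re-run it safely after an interrupted session.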

And finally, open up PowerShell on your machine from the Start Menu and run your downloadvideos.ps1 script like in this screenshot.

What about getting Low Bandwidth videos I can download on a slow connection!

The low bandwidth videos are super small, some smaller than a JPEG! The largest is just 20 megs, so the full C# course is under 200 megs total.

You can download ALL the videos for EACH playlist by visiting http://dot.net/videos, getting the RSS URL for the video playlist you want, and running the PS1 script again with the changed URL on line 3.

Hope this helps!


Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!


© 2019 Scott Hanselman. All rights reserved.
     

Windows Virtual Desktop is now generally available worldwide


Since we announced the preview of Windows Virtual Desktop in March, thousands of customers have piloted the service, providing valuable feedback and insights for Microsoft to integrate into the service. Today, we are excited to announce the worldwide general availability of Windows Virtual Desktop. It is the only service that delivers simplified management, a multi-session Windows 10 experience, optimizations for Office 365 ProPlus, and support for Windows Server Remote Desktop Services (RDS) desktops and apps. With Windows Virtual Desktop, you can deploy and scale your Windows desktops and apps on Azure in minutes.

Now available in all geographies, Windows Virtual Desktop lets customers deploy scalable Azure-based virtualization solutions with a number of operating systems, including Windows 10 multi-session, Windows Server, and Windows 7 desktops with free Extended Security Updates for up to three years for customers still completing their move to Windows 10. We have also been working closely with partners to build out an extensive ecosystem that extends the benefits and capabilities of Windows Virtual Desktop, enabling customers to get the most out of their virtualization investments.

With the appropriate license, simply set up an Azure subscription to get started today. You can choose the type of virtual machines and storage for your environment and optimize your costs by taking advantage of Azure Virtual Machine Reserved Instances to save up to 80 percent. Use the new Windows Virtual Desktop pricing calculator to estimate the cost of operating your environment on Azure.

Read more about the announcement in our Microsoft 365 General Availability blog published today by Brad Anderson and Takeshi Numoto.

Learn more about Windows Virtual Desktop and get started today.

Built-in Jupyter notebooks in Azure Cosmos DB are now available


Earlier this year, we announced a preview of built-in Jupyter notebooks for Azure Cosmos DB. These notebooks, running inside Azure Cosmos DB, are now available.

Overview of built-in Jupyter notebooks in Azure Cosmos DB.

Cosmic notebooks are available for all data models and APIs including Cassandra, MongoDB, SQL (Core), Gremlin, and Spark to enhance the developer experience in Azure Cosmos DB. These notebooks are directly integrated into the Azure Portal and your Cosmos accounts, making them convenient and easy to use. Developers, data scientists, engineers and analysts can use the familiar Jupyter notebooks experience to:

  • Interactively run queries
  • Explore and analyze data
  • Visualize data
  • Build, train, and run machine learning and AI models

In this blog post, we’ll explore how notebooks make it easy for you to work with and visualize your Azure Cosmos DB data.

Easily query your data

With notebooks, we’ve included built-in commands to make it easy to query your data for ad-hoc or exploratory analysis. From the Portal, you can use the %%sql magic command to run a SQL query against any container in your account, no configuration needed. The results are returned immediately in the notebook.

SQL query using built-in Azure Cosmos DB notebook magic command.

Improved developer productivity

We’ve also bundled in version 4 of our Azure Cosmos DB Python SDK for SQL API, which has our latest performance and usability improvements. The SDK can be used directly from notebooks without having to install any packages. You can perform any SDK operation including creating new databases, containers, importing data, and more.

Create new database and container with built-in Python SDK in notebook.

Visualize your data

Azure Cosmos DB notebooks comes with a built-in set of packages, including Pandas, a popular Python data analysis library, Matplotlib, a Python plotting library, and more. You can customize your environment by installing any package you need.

Install custom package using pip install.

For example, to build interactive visualizations, we can install bokeh and use it to build an interactive chart of our data.

Histogram of data stored in Azure Cosmos DB, showing users who viewed, added, and purchased an item.
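A chart like the histogram above starts with a simple aggregation of query results. Here is a standard-library sketch over made-up documents; the `event` field name is hypothetical, standing in for whatever your container stores:

```python
from collections import Counter

# Documents as they might come back from a query against a container;
# the 'event' field is invented for this sketch.
docs = [
    {"id": "1", "event": "viewed"},
    {"id": "2", "event": "viewed"},
    {"id": "3", "event": "added"},
    {"id": "4", "event": "purchased"},
]

# Count how many users viewed, added, or purchased an item.
counts = Counter(doc["event"] for doc in docs)
print(counts)  # Counter({'viewed': 2, 'added': 1, 'purchased': 1})
```

In a notebook you would hand `counts` (or a pandas DataFrame built the same way) to Matplotlib or bokeh to render the actual chart.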

Users with geospatial data in Azure Cosmos DB can also use the built-in GeoPandas library, along with their visualization library of choice to more easily visualize their data.

Choropleth world map of data stored in Azure Cosmos DB, showing revenue by country.

Getting started

  1. Follow our documentation to create a new Cosmos account with notebooks enabled or enable notebooks on an existing account. Create account with notebooks or enable notebooks on existing account in Azure portal.
  2. Start with one of the notebooks included in the sample gallery in Azure Cosmos Explorer or Data Explorer.Azure Cosmos DB notebooks sample gallery.
  3. Share your favorite notebooks with the community by sending them to the Azure Cosmos DB notebooks GitHub repo.
  4. Tag your notebooks with #CosmosDB, #CosmicNotebooks, #PoweredByCosmos on social media. We will feature the best and most popular Cosmic notebooks globally!

Stay up-to-date on the latest Azure #CosmosDB news and features by following us on Twitter or LinkedIn. We’d love to hear your feedback and see your best notebooks built with Azure Cosmos DB!


Azure IoT Tools September Update: Azure IoT Edge remote debug and more!


Welcome to the September update of Azure IoT Tools!

In this September release, you will see the improved remote debugging experience and Azure IoT Device Provisioning Service support in Visual Studio. Additionally, an iotedge-compose tool has been released for you to port compose-based apps to Azure IoT Edge.

Debug Azure IoT Edge C# remote Linux container

With the version 16.3 release of Visual Studio 2019, the remote debugging experience in Linux docker containers has been improved. The feature is intended for generic container debugging, and we can leverage it in the Azure IoT Edge module scenario to help developers debug remote Azure IoT Edge C# Linux module containers with ease.

For more details, you can check out this blog post to see how to use this feature in your Azure IoT Edge module debugging with the step-by-step instructions.

Support Azure IoT Hub Device Provisioning Service in Cloud Explorer

The Azure IoT Hub Device Provisioning Service is a helper service for Azure IoT Hub that enables zero-touch, just-in-time provisioning to the correct Azure IoT hub without requiring human intervention, enabling customers to provision millions of devices in a secure and scalable manner.

We’re pleased to announce that Visual Studio Cloud Explorer now supports the Azure IoT Hub Device Provisioning Service. You can now access your Azure IoT Hub Device Provisioning Services without leaving Visual Studio. You can check out this blog post for more details.

Convert Docker Compose projects to Azure IoT Edge solutions

If you are a container expert, you may want to port your existing Docker Compose application to Azure IoT Edge. We are glad to introduce iotedge-compose to help you convert your Docker Compose project to an Azure IoT Edge solution. iotedge-compose is a CLI tool written in Python. After you have installed Python, all you need to do is to install it through pip:

pip install iotedge-compose

Then, check out this blog post to see how to convert your Docker Compose file or project.

Try it out

Please don’t hesitate to give it a try! If you have any feedback, feel free to reach us at https://github.com/microsoft/vscode-azure-iot-tools/issues. We will continuously improve our IoT developer experience to empower every IoT developer on the planet to achieve more!

The post Azure IoT Tools September Update: Azure IoT Edge remote debug and more! appeared first on Visual Studio Blog.

Windows 10 WinRT API Packs released


With the announcement of the release of .NET Core 3.0, we are pleased to announce we have posted on nuget.org the released versions of the Windows 10 WinRT API Pack. The Windows 10 WinRT API Pack allows your WPF or Windows Forms application to quickly and easily access Windows functionality like Geolocation, Windows AI, Machine Learning, Bluetooth, and much more.

Accessing these APIs in your project is as simple as adding the NuGet to your project.

Getting Started

Step 1: Configure your project to support Package Reference 

Step 2: Add the Microsoft.Windows.SDK.Contracts NuGet package to your project 

  1. Open the NuGet Package Manager Console 
  2. Install the package that includes the Windows 10 Contracts you want to target. Currently the following are supported:

Windows 10 version 1803

Install-Package Microsoft.Windows.SDK.Contracts -Version 10.0.17134.1000

Windows 10 version 1809

Install-Package Microsoft.Windows.SDK.Contracts -Version 10.0.17763.1000

Windows 10 version 1903

Install-Package Microsoft.Windows.SDK.Contracts -Version 10.0.18362.2005

Step 3: Get coding 

By adding one of the above NuGet packages, you now have access to calling the Windows Runtime (WinRT) APIs in your project.  

For example, this snippet shows a WPF Message box displaying the latitude and longitude coordinates: 

 

private async void Button_Click(object sender, RoutedEventArgs e)
{
    var locator = new Windows.Devices.Geolocation.Geolocator();
    var location = await locator.GetGeopositionAsync();
    var position = location.Coordinate.Point.Position;
    var latlong = string.Format("lat:{0}, long:{1}", position.Latitude, position.Longitude);
    var result = MessageBox.Show(latlong);
}

Many partners are already using these NuGet packages. For an example of a project that uses them, see the Microsoft.Toolkit.

The post Windows 10 WinRT API Packs released appeared first on Windows Developer Blog.


Azure Cosmos DB recommendations keep you on the right track


The tech world is fast-paced, and cloud services like Azure Cosmos DB get frequent updates with new features, capabilities, and improvements. It’s important—but also challenging—to keep up with the latest performance and security updates and assess whether they apply to your applications. To make it easier, we’ve introduced automatic and tailored recommendations for all Azure Cosmos DB users. A large spectrum of personalized recommendations now shows up in the Azure portal when you browse your Azure Cosmos DB accounts.

Some of the recommendations we’re currently dispatching cover the following topics:

  • SDK upgrades: When we detect the usage of an old version of our SDKs, we recommend upgrading to a newer version to benefit from our latest bug fixes and performance improvements.
  • Fixed to partitioned collections: To fully leverage Azure Cosmos DB’s massive scalability, we encourage users of legacy, fixed-sized containers that are approaching the limit of their storage quota to migrate these containers to partitioned ones.
  • Query page size: For users who define a specific query page size, we recommend using a dynamic page size of -1 instead.
  • Composite indexes: Composite indexes can dramatically improve the performance and RU consumption of some queries, so we suggest their usage whenever our telemetry detects queries that can benefit from them.
  • Incorrect SDK usage: It’s possible for us to detect when our SDKs are incorrectly used, like when a client instance is created for each request instead of being used as a singleton throughout the application; corresponding recommendations are provided in these cases.
  • Lazy indexing: The purpose of Azure Cosmos DB’s lazy indexing mode is rather limited and can impact the freshness of query results in some situations. We advise using the (default) consistent indexing mode instead of lazy indexing.
  • Transient errors: In rare occurrences, some transient errors can happen when a database or collection gets created. SDKs usually retry operations whenever a transient error occurs, but if that’s not the case, we notify our users that they can safely retry the corresponding operation.
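The "incorrect SDK usage" recommendation above amounts to: construct the client once and reuse it everywhere. A minimal Python sketch of that singleton pattern follows; `CosmosClientStandIn` is a hypothetical placeholder class, not the real SDK client.

```python
from functools import lru_cache


class CosmosClientStandIn:
    """Placeholder for a real SDK client; real clients hold
    connections and caches that are expensive to rebuild."""

    def __init__(self, endpoint: str):
        self.endpoint = endpoint


@lru_cache(maxsize=None)
def get_client(endpoint: str) -> CosmosClientStandIn:
    # However many times request-handling code calls this,
    # the client for a given endpoint is constructed only once.
    return CosmosClientStandIn(endpoint)
```

Per-request code can call `get_client` freely: every call with the same endpoint returns the same instance, which is exactly what the recommendation asks for instead of a new client per request.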

Each of our recommendations includes a link that brings you directly to the relevant section of our documentation, so it’s easy for you to take action.

3 ways to find your Azure Cosmos DB recommendations

1.    Click on this message at the top of the Azure Cosmos DB blade:

A pop-up message in Azure Cosmos DB saying that new notifications are available.
2.    Head directly to the new “Notifications” section of your Cosmos DB accounts:

The Notifications section showing all received Cosmos DB recommendations.
3.    Find them through Azure Advisor, which makes it easier for users who don’t routinely visit the Azure portal to receive our recommendations.

Over the coming weeks and months, we’ll expand the coverage of these notifications to include topics like partitioning, indexing, network security, and more. We also plan to surface general best practices to ensure you’re making the most out of Azure Cosmos DB.

Have ideas or suggestions for more recommendations? Email us or leave feedback using the smiley on the top-right corner of the Azure portal!
