
Real World Cloud Migrations: CDNs are an easy improvement to legacy apps


I'm doing a quiet backend migration/update to my family of sites. If I do it right, there will be minimal disruption. Even though I'm a one-person show (plus Mandy, my podcast editor), the Cloud lets me manage digital assets like I'm a whole company with an IT department. Sure, I could FTP files directly into production, but why do that when I've got free/cheap stuff like Azure DevOps?

As I'm one person, I want to keep costs down whenever possible, as I've said in my "Penny Pinching in the Cloud" series. That said, I do pay for value: I'll pay for things that provide a quality product and/or save me time and hassle. For example, Azure Pipelines gives one free Microsoft-hosted CI/CD pipeline and 1 hosted job with 1,800 minutes a month. This should be more than enough for my setup.

Additionally, sometimes I'll shift costs in order to both save money and improve performance as when I moved my Podcast's image hosting over to an Azure CDN in 2013. Fast forward 6 years and since I'm doing this larger migration, I wanted to see how easy it would be to migrate even more of my static assets to a CDN.

Make a Storage Account and ensure that its Access Level is Public if you intend folks to be able to read from it over HTTP GETs.

Public Access Level

Within the Azure Portal you can make a CDN Profile that uses Microsoft, Akamai, or Verizon for the actual Content Delivery Network. I picked Microsoft's because I have experience with Akamai and Verizon (they were solid) and I wanted to see how the new Microsoft one does. It claims to put content within 50 ms of users in 60 countries.

You'll have a CDN endpoint host name that points to an Origin, which is the source of your content. The Origin can be an existing place where you already keep files (the CDN checks there first, caches things, then serves the content), an existing Web App, or, in my case, Storage.

Azure Storage Accounts for my blog

I didn't want to make things too complex, so I have a single Storage Account with a few containers. Then I mapped custom endpoints for both my blog AND my podcast, but they share the same storage. I also took advantage of the free SSL Certs so images.hanselman.com and images.hanselminutes.com are both SSL. Gotta get those "A" grades from https://securityheaders.com right?

Custom domains for my CDN

Then, just grab the Azure Storage Explorer. It's free and cross-platform; you can run it as a local app or use it in-browser from the Azure Portal. I've uploaded all my main assets (images used in my CSS, blog, headers, etc.).

I've settled on a basic scheme where anything that was "/images/foo.png" is now "https://images.hanselman.com/blog/foo.png" or /main/ or /podcast/ depending on who is using it. This made search-and-replace reliable and easy.
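The scheme above can be sketched as a one-off rewrite. This is a hypothetical helper (the function name `to_cdn` is my invention, not from the original workflow); the container name and domain follow the scheme just described:

```python
import re

# Hypothetical sketch of the URL scheme change: rewrite relative
# "/images/..." references to the CDN hostname. The container ("blog",
# "main", or "podcast") depends on which site owns the asset.
def to_cdn(html: str, container: str = "blog") -> str:
    return re.sub(
        r'"/images/([^"]+)"',
        rf'"https://images.hanselman.com/{container}/\1"',
        html,
    )
```

Run over each stored page, `to_cdn('<img src="/images/foo.png">')` turns the relative reference into the CDN form while leaving everything else untouched.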

It's truly a less-than-an-hour operation to enable a CDN on an existing site. While I've chosen to use Azure Storage as my backing store, you can use your existing site's /images folder and simply change your markup to pull from the CDN's URL. I recommend you make a simple cdn.yourdomain.com or images.yourdomain.com; this can be enabled with a CNAME in less than an hour. Having your images at another URL via a subdomain CNAME also allows for even more download parallelism from the browser.

This is all in my Staging so you won't see it quite yet until I flip the switch on the whole migration.

Exploring future possibilities for dynamic image content

I haven't yet moved all my blog content images (a few gigs, but many thousands of images) to a CDN, as I don't want to change my existing publishing workflow with Open Live Writer. I'm considering a flow where images still upload to the Web App, but the Custom Origin Path option in the Azure CDN lets them be picked up from the Web App and then served by the CDN. In order to manage 17 years of images, though, I'd need to catch URLs like this:

https://www.hanselman.com/blog/content/binary/Windows-Live-Writer/0409a9d5fae6_F552/image_9.png

and redirect them to

https://images.hanselman.com/blog/content/binary/Windows-Live-Writer/0409a9d5fae6_F552/image_9.png

As I see it, there are a few options to get images from a GET of /blog/content/binary to be served by a CDN:

  • Dynamically rewrite the URLs on the way OUT (as the HTML is generated)
    • CPU expensive, though the generated HTML is cached in my WebApp
  • Rewriting middleware to redirect (301?) requests to the new images location
    • Easiest option, but costs everyone a 301 on all image GETs
  • Programmatically change the stored markup (basically a for loop over my "database," search and replace, AND ensure future images use this new URL)
    • A hassle, but a one-time hassle
    • Not sure about future images; I might have to change my publishing flow AND run the process on new posts occasionally.
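To make the middleware option concrete, here's a minimal sketch of the redirect decision (in Python purely for illustration; the blog itself runs on ASP.NET, where rewrite middleware would play this role, and the prefix/hostname are taken from the example URLs above):

```python
# Illustrative sketch of the 301 option: old image paths redirect to the
# CDN host; everything else falls through to the normal pipeline.
OLD_PREFIX = "/blog/content/binary/"
CDN_HOST = "https://images.hanselman.com"

def maybe_redirect(path: str):
    """Return (301, new_location) for old image paths, or None."""
    if path.startswith(OLD_PREFIX):
        return 301, CDN_HOST + path
    return None
```

Note the trade-off called out above: this logic is trivially cheap per request, but every legacy image GET still pays one extra round trip for the 301.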

What are your thoughts on if I should go all the way and manage EVERY blog image vs the low hanging fruit of shared static assets? Worth it, or overkill?

The learning process continues!


Sponsor: Seq delivers the diagnostics, dashboarding, and alerting capabilities needed by modern development teams - all on your infrastructure. Download now.


© 2019 Scott Hanselman. All rights reserved.
     

Azure Data Box Heavy is now generally available


Our customers continue to use the Azure Data Box family to move massive amounts of data into Azure. One of the regular requests that we receive is for a larger capacity option that retains the simplicity, security, and speed of the original Data Box. Last year at Ignite, we announced a new addition to the Data Box family that did just that – a preview of the petabyte-scale Data Box Heavy.

With thanks to those customers who provided feedback during the preview phase, I’m excited to announce that Azure Data Box Heavy has reached general availability in the US and EU!

How Data Box Heavy works

In many ways, Data Box Heavy is just like the original Data Box. You can order Data Box Heavy directly from the Azure portal, and copy data to it using standard file or object protocols. Data is automatically secured on the appliance using AES 256-bit encryption. After your data is transferred to Azure, the appliance is wiped clean according to National Institute of Standards and Technology (NIST) standards.

But Data Box Heavy is also designed for a much larger scale than the original Data Box. Data Box Heavy’s one petabyte of raw capacity and multiple 40 Gbps connectors mean that a datacenter’s worth of data can be moved into Azure in just a few weeks.
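A quick back-of-the-envelope check on that claim (pure copy time only, assuming the full line rate is sustained and ignoring protocol overhead, on-premises read speeds, and appliance shipping):

```python
# 1 PB through a single 40 Gbps connector.
capacity_bits = 1e15 * 8      # 1 PB (decimal petabyte) in bits
line_rate_bps = 40e9          # one 40 Gbps connector
days = capacity_bits / line_rate_bps / 86400
# Roughly 2.3 days of raw copy time per connector; multiple connectors
# shorten that further, and the "few weeks" figure covers real-world
# overhead plus round-trip shipping of the appliance.
```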

Data Box Heavy

  • 1 PB per order
  • 770 TB usable capacity per order
  • Supports Block Blobs, Page Blobs, Azure Files, and Managed Disk
  • Copy to 10 storage accounts
  • 4 x RJ45 10/1 Gbps, 4 x QSFP 10/40 Gbps Ethernet
  • Copy data using standard NAS protocols (SMB/CIFS, NFS, Azure Blob Storage)

Data Box

  • 100 TB per order
  • 80 TB usable capacity per order
  • Supports Block Blobs, Page Blobs, Azure Files, and Managed Disk
  • Copy to 10 storage accounts
  • 2 x RJ45 10/1 Gbps, 2 x SFP+ 10 Gbps Ethernet
  • Copy data using standard NAS protocols (SMB/CIFS, NFS, Azure Blob Storage)

 

Data Box Disk

  • 40 TB per order/8 TB per disk
  • 35 TB usable capacity per order
  • Supports Block Blobs, Page Blobs, Azure Files, and Managed Disk
  • Copy to 10 storage accounts
  • USB 3.1, SATA II or III
  • Copy data using standard NAS protocols (SMB/CIFS, NFS, Azure Blob Storage)

 

Expanded regional availability

We’re also expanding regional availability for Data Box and Data Box Disk.

  • Data Box Heavy: US, EU
  • Data Box: US, EU, Japan, Canada, and Australia
  • Data Box Disk: US, EU, Japan, Canada, Australia, Korea, Southeast Asia, and US Government

    Sign up today

    Here’s how you can get started with Data Box Heavy:

    We’ll be at Microsoft Inspire again this year, so stop by our booth to say hello to the team!

    Thanks for 10 years and welcome to a new chapter in SQL innovation


    Tomorrow, July 9, 2019, marks the end of extended support for SQL Server 2008 and 2008 R2. These releases transformed the database industry, with all the core components of a database platform built-in at a fraction of the cost of other databases. We saw broad adoption across applications, data marts, data warehousing, and business intelligence. Thank you for the ten amazing years we’ve had together.

But now support for the SQL Server 2008 and R2 versions is ending. Whether you prefer the evergreen SQL of Azure SQL Database managed instance, which never needs to be patched or upgraded, or you need the flexibility and configurability of SQL Server hosted on an Azure Virtual Machine with three free years of Extended Security Updates, Azure provides the best choice of destinations to secure and modernize your database.

    Customers are moving critical SQL Server workloads to Azure

    Customers like Allscripts, Komatsu, Paychex, and Willis Towers Watson are taking advantage of these innovative destinations and migrating their SQL Server databases to Azure. Danish IT solutions provider KMD needed a home for their legacy SQL Server in the cloud. They had to migrate an 8-terabyte production database to the cloud quickly and without interruption to its service. Azure SQL Database managed instance allowed KMD to transfer their production data with minimal downtime and no code changes.

    “We moved our SQL Server 2008 to Azure SQL Database managed instance, and it has been a great move for us. Not only do we spend less time on maintenance, but we now run a version of SQL that is always current with no need for upgrade and patching.”

    – Charlotte Lindahl, Project Manager, KMD

    Azure SQL Database offers differentiated value to customers including:

• Choose the only cloud with evergreen SQL. Azure SQL Database compatibility levels mean that you can move your on-premises workloads to managed SQL without worrying about application compatibility or performance changes. Customers who move to SQL Database never have to worry about patching, upgrades, or end of support again.
    • Host larger SQL databases than any other cloud with Azure SQL Database Hyperscale. Hyperscale is a highly scalable service tier for SQL databases that adapts on-demand to your workload's needs. With Hyperscale, databases can achieve the best performance for workloads of unlimited size and scale.
    • Harness the power of artificial intelligence to monitor and secure your workloads. Trained on millions of databases, the intelligent security and performance features in Azure SQL Database mean consistent and predictable workload performance. In addition to intelligent performance, SQL database customers get peace of mind with automatic threat detection, which identifies unusual log-in attempts or potential SQL injection attacks.
• Move to the most economical cloud database for SQL Server, Azure SQL Database managed instance. With the full surface area of your on-premises SQL Server database engine, an anticipated ROI of 212 percent, and a payback period of as little as six months1, SQL Database managed instance cements its status as the most cost-effective service for running SQL in the cloud. SEB is a technology company providing software, solutions, and services specializing in managing group benefit solutions and healthcare claims processing. They chose Azure not only for its cost reduction compared to on-premises, but for its more than 90 compliance offerings as well.

    "With SQL Server 2008 approaching end of support, SEB needed to migrate two critical business applications that contained sensitive health and PII information. In Azure, we were able to get three years of Extended Security Updates for application VMs, and move the data to Azure SQL Database which significantly decreased both management and infrastructure spend. Azure's compliance certifications for HIPAA, PCI and ISO-27k, as well as data residency in Canada, were critical in meeting our regulatory requirements.”

    – Mario Correia, Chief Technology Officer, SEB Inc.

    See how Hyperscale in Azure SQL Database is enabling customer innovation.

    Hyperscale video thumbnail with small play button

    SQL innovation remains our focus now and in the future

Microsoft continues to invest in innovation with SQL Server 2019 and Azure SQL Database. Our priority is to future-proof your database workloads. Today, I am excited to announce new innovation across on-premises and in the cloud:

    • Preview of Azure SQL, a simplified portal experience for SQL databases in Azure: Coming soon, Azure SQL will provide a single pane of glass through which you can manage Azure SQL Databases and SQL Server on Azure Virtual Machines. In Azure SQL, customers will be able to register their self-installed (custom image) SQL VMs using the Resource Provider to access benefits like auto-patching, auto-backup, and new license management options.
• Preview of SQL Server 2019 big data clusters: Available later this month, the SQL Server 2019 big data clusters preview combines SQL Server with Apache Spark and Hadoop Distributed File System for a unified data platform that enables analytics and artificial intelligence (AI) over all data, relational and non-relational. Early Adoption Program participants like startup Systems Imagination Inc. are already using big data clusters to solve challenging AI and machine learning problems.

    “With SQL Server 2019 big data clusters, we can solve for on-demand big data experiments. We can analyze cancer research data coming from dozens of different data sources, mine interesting graph features, and carry out analysis at scale.”

    – Pieter Derdeyn, Knowledge Engineer, Systems Imagination Inc.

    Get started with SQL in Azure

    As we reach end of support for SQL Server 2008 and 2008 R2, and with just six more months until the end of support for Windows Server 2008 and 2008 R2, there’s never been a better time to secure and modernize these older workloads by moving them to Azure. Secure, manage, and transform your SQL Server workloads with the latest data and AI capabilities:

     

    1The Total Economic Impact™ of Microsoft Azure SQL Database Managed Instance, a Forrester Consulting Study, 10/25/2018. https://azure.microsoft.com/en-us/resources/forrester-tei-sql-database-managed-instance/en-us/

    Scale action groups and suppress notifications for Azure alerts


    In Azure Monitor, defining what to monitor while configuring alerts can be challenging. Customers need to be capable of defining when actions and notifications should trigger for their alerts, and more importantly, when they shouldn’t. The action rules feature for Azure Monitor, available in preview, allows you to define actions for your alerts at scale, and allows you to suppress alerts for scenarios such as maintenance windows.

    Screenshot of Azure Monitor and the alerts dashboard

    Let’s take a closer look at how action rules (preview) can help you in your monitoring setup!

    Defining actions at scale

Previously, you could define which action groups trigger for your alerts while defining an alert rule. However, the actions that get triggered, whether an email is sent or a ticket is created in a ticketing tool, are usually associated with the resource on which the alert is generated rather than with the individual alert rule.

For example, for all alerts generated on the virtual machine contosoVM, I would typically want the following:

    • The same email address to be notified (e.g. contosoITteam@contoso.com)
    • Tickets to be created in the same ITSM tool

    While you could define a single action group such as contosoAG and associate it with each and every alert rule authored on contosoVM, wouldn’t it be easier if you could easily associate contosoAG for every alert generated on contosoVM, without any additional configuration?

    That’s precisely what action rules (preview) allow you to do. They allow you to define an action group to trigger for all alerts generated on the defined scope, this could be a subscription, resource group, or resource so that you no longer have to define them for individual alert rules!

    Suppressing notifications for your alerts

There are many scenarios where it would be useful to suppress the notifications generated by your alerts, such as a planned maintenance window or non-business hours. You could do this by disabling each and every alert rule individually, with complicated logic that accounts for time windows and recurrence patterns, or you can get all of this out of the box by using action rules (preview).

Working on the same principle as before, action rules (preview) also allow you to define the suppression of actions and notifications for all alerts generated on a defined scope (a subscription, resource group, or resource), while the underlying alert rules continue to monitor. Furthermore, you can configure both the period and the recurrence of the suppression, all out of the box. With this you can easily set up notification suppression based on your business requirements, whether that's suppressing everything over a weekend maintenance window or suppressing notifications between 5 PM and 9 AM every day, outside normal business hours.
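As a sketch of the kind of recurring window such a rule encodes (this is illustrative only, not the Azure API or its actual evaluation logic): suppress weekends entirely and weekdays outside 9 AM to 5 PM:

```python
from datetime import datetime

# Illustrative only (not the Azure API). An action rule with a recurring
# suppression window effectively evaluates something like this for each
# alert's timestamp.
def suppressed(ts: datetime) -> bool:
    if ts.weekday() >= 5:            # Saturday or Sunday
        return True
    return not (9 <= ts.hour < 17)   # outside 9 AM - 5 PM on weekdays
```

The point of the feature is that you configure this window and recurrence declaratively on a scope, rather than implementing and maintaining logic like this yourself.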

    Filters for more flexibility

While you can easily define action rules (preview) to either author actions at scale or suppress them, action rules come with additional knobs and levers in the form of filters that allow you to fine-tune the specific subset of your alerts the action rule acts on.

For example, going back to suppressing notifications during non-business hours: you might still want to receive notifications for alerts with severity zero or one, while the rest are suppressed. In such a scenario, you can define a severity filter as part of the action rule specifying that it does not apply to alerts with severity zero or one, so it only applies to alerts with severity two, three, or four.

    Similarly, there are additional filters that provide even more granular definitions from the description of the alert to string matching within the alert’s payload. You can learn more about the supported filters by visiting our documentation, “Action rules (preview).”

    Screenshot of creating a new action rule

    Next steps

To best leverage action rules, we recommend reading the documentation, which goes into more detail about how to configure action rules (preview), example scenarios, best practices, and FAQs. We recommend using action rules (preview) in conjunction with action groups that have the common alert schema enabled, to define consistent alert consumption experiences across different alert types. You can also learn more by reading our documentation, "How to integrate the common alert schema with Logic Apps," which details how to set up an action group with a logic app using the common schema that integrates with all your alerts.

    We are just getting started with action rules (preview), and we look forward to hearing more from you as we evolve the feature towards general availability and beyond. Keep the feedback coming to azurealertsfeedback@microsoft.com.

    Azure Marketplace new offers–Volume 40


    We continue to expand the Azure Marketplace ecosystem. For this volume, 212 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

    Applications

    2Care

    2Care: 2CARE is a healthcare solution that easily captures the services provided to patients or inhabitants. This application is available only in Germany, Switzerland, and Austria.

    A10 Lightning ADC

    A10 Lightning ADC: A10 Lightning Application Delivery Controller (ADC) is a cloud-native solution that optimizes the delivery and security of cloud-native applications and services.

    Actian Vector Analytic Database Community Edition

    Actian Vector Analytic Database Community Edition: Actian Vector is a developer-friendly high-performance analytics engine that requires minimal setup, provides automatic tuning to reduce database administration effort, and enables highly responsive end-user BI reporting.

    Advanced Load Balancer ADC for Azure

    Advanced Load Balancer ADC for Azure: Loadbalancer.org Advanced Load Balancer ADC for Azure provides layer 4/7 load balancing for Azure, automatically distributing incoming application traffic across Azure-hosted workloads.

    AIMS 14 Day Free Trial

    AIMS 14 Day Free Trial: AIMS simplifies and automates the monitoring and analytics of Microsoft enterprise IT products and technologies using AI and machine learning.

    AIMS AIOps Free Performance Assessment

    AIMS AIOps Free Performance Assessment: AIMS provides a deep IT system and application performance assessment report delivered as a 15-20-page PowerPoint presentation identifying the state of your environment with recommended actions based on AI findings.

    AIMS Innovation

    AIMS Innovation: AIMS simplifies and automates the monitoring and analytics of Microsoft enterprise IT products and technologies using AI and machine learning.

    aiScaler Load Balancer & Site Acceleration

    aiScaler Load Balancer & Site Acceleration: aiScaler is an application delivery controller with an emphasis on accelerating dynamic sites. It is a single, easily configured virtual appliance that provides traffic management, dynamic site acceleration, and DDoS protection.

    Allscripts dbMotion

    Allscripts dbMotion: The dbMotion solution aggregates data from disparate source systems, harmonizes the information, and delivers it in an actionable format at the point of care, within the provider’s native workflow.

    Alteryx Designer

    Alteryx Designer: Alteryx Designer empowers data analysts by combining data preparation, data blending, and advanced analytics – predictive, statistical, and spatial – in an easy-to-use, drag-and-drop user interface.

    AMO CA

    AMO CA: AMO CA, formerly known as AuthentiCA, is a cloud-based certificate authority service that allows users to easily issue certificates and manage queries, renewals, and revocations.

    AnalyticsCreator data warehouse automation

    AnalyticsCreator data warehouse automation: AnalyticsCreator automates the design, creation, maintenance, and deployment of Azure SQL Data Warehouse.

    Appcelerator Arrow API Builder

    Appcelerator Arrow API Builder: Appcelerator Arrow API Builder is an opinionated framework for building and deploying Node.js applications to the cloud. With Arrow Builder, developers can quickly connect, model, transform, and optimize data from any source.

    AppDynamics Application Performance Management

    AppDynamics Application Performance Management: AppDynamics is a leading application performance management solution for modern applications in the cloud and data center that allows organizations to rapidly troubleshoot, diagnose, and scale apps in Azure.

    AppScale

    AppScale: The AppScale platform enables developers to focus solely on business logic in order to rapidly build scalable apps, cleanly separating it from deployment and scaling logic.

    Apptimized Packaging Tool

    Apptimized Packaging Tool: Hosted on Azure, the Apptimized Packaging Tool delivers everything needed to discover, create, test, and manage application packages in the cloud.

    Aras Innovator PLM Suite

    Aras Innovator PLM Suite: Aras Innovator is an enterprise-scale product lifecycle management (PLM) suite on Azure providing complete item and document management, configuration and change management, project and program management, and more.

    Asianux Server 4 SP6

    Asianux Server 4 SP6: Asianux Server 4 integrates advanced Linux technologies to deliver more value to enterprise users with high capability, reliability, and improved security.

Asitis Cloudware

    Asitis Cloudware: Asitis Cloudware on Azure is a cloud-based factoring platform that helps your factoring business succeed – whether you have a startup or an already established company and want to grow.

    Atomic Secure Docker for Ubuntu

    Atomic Secure Docker for Ubuntu: With Atomic Secure Docker for Ubuntu, the operating system, device, application, and containers all inherit protection from the hardened kernel for entire classes of cyberattacks.

    AudioCodes One Voice Operations Center

    AudioCodes One Voice Operations Center: AudioCodes One Voice Operations Center is a voice network management solution that combines the management of voice network devices and quality of experience monitoring into a single, intuitive web-based application.

    Autopilot BPM Solution

    Autopilot BPM Solution: Autopilot is an Azure-based platform that allows companies to speed up the implementation and roll-out of electronic form and workflow solutions.

    Avalanche TTX

    Avalanche TTX: Avalanche TTX allows you to easily create and deliver engaging scenarios so that employees can experience and learn from difficult situations before they happen in real life. The system has wide application across many industries.

    Avi Vantage ADC Load Balancer, GSLB and WAF

    Avi Vantage ADC: Load Balancer, GSLB and WAF: Avi Vantage delivers an enterprise-grade, software-defined solution composed of a smart load balancer, an intelligent web application firewall, and an elastic service mesh for container applications.

    Aviatrix Companion Gateway-v2

    Aviatrix Companion Gateway-v2: This model removes the requirement to download the Aviatrix Gateway image to your Azure account to greatly reduce deployment time. Aviatrix Companion Gateway works with Aviatrix Cloud Controller version 3.1 or later.

    Avik Cloud

    Avik Cloud: Avik Cloud is a data pipeline as a service for big data management, including data integration, data profiles, data replication, machine learning, and more.

    Axway MailGate

    Axway MailGate: Axway MailGate assists in fully protecting your enterprise – from sanitizing inbound email, filtering and encrypting outbound messages, securing and managing shared files, and securing mobile device channels.

Azul Zulu for Azure - Java 7 on Ubuntu 18.04

    Azul Zulu for Azure - Java 7 on Ubuntu 18.04: Azul Zulu for Azure is a collection of certified builds of OpenJDK that are fully compliant and compatible with the Java SE standard on x64 reference architecture systems.

    Azul Zulu for Azure - Java 7 on Windows 2019

    Azul Zulu for Azure - Java 7 on Windows 2019: Azul Zulu for Azure is a collection of certified builds of OpenJDK that are fully compliant and compatible with the Java SE standard on x64 reference architecture systems.

Azul Zulu for Azure - Java 8 on Ubuntu 18.04

    Azul Zulu for Azure - Java 8 on Ubuntu 18.04: Azul Zulu for Azure is a collection of certified builds of OpenJDK that are fully compliant and compatible with the Java SE standard on x64 reference architecture systems.

    Azul Zulu for Azure - Java 8 on Windows 2019

    Azul Zulu for Azure - Java 8 on Windows 2019: Azul Zulu for Azure is a collection of certified builds of OpenJDK that are fully compliant and compatible with the Java SE standard on x64 reference architecture systems.

Azul Zulu for Azure - Java 11 on Ubuntu 18.04

    Azul Zulu for Azure - Java 11 on Ubuntu 18.04: Azul Zulu for Azure is a collection of certified builds of OpenJDK that are fully compliant and compatible with the Java SE standard on x64 reference architecture systems.

    Azul Zulu for Azure - Java 11 on Windows 2019

    Azul Zulu for Azure - Java 11 on Windows 2019: Azul Zulu for Azure is a collection of certified builds of OpenJDK that are fully compliant and compatible with the Java SE standard on x64 reference architecture systems.

    Barracuda CloudGen Firewall for Azure

    Barracuda CloudGen Firewall for Azure: The Barracuda CloudGen Firewall for Azure fills the functional gaps between cloud infrastructure security and a defense-in-depth strategy by providing protection where the application and data reside.

    BioGraph Engine

    BioGraph Engine: The BioGraph Engine by Spiral Genetics is designed to address the issues associated with large-scale genome sequencing projects using Illumina next-generation whole genome sequencing data.

BlockApps Multinode Blockchain - Enterprise Edition

    BlockApps Multinode Blockchain – Enterprise Edition: BlockApps provides enterprise blockchain development solutions based on Ethereum. Its client, written in Haskell, provides a highly scalable Ethereum-compliant blockchain with an industry standard RESTful API.

    BPM - Document Management

    BPM - Document Management: AuraPortal Helium's fast, easy, mobile, and secure document management module is designed for the cloud and built with the user in mind.

    CB CRM to SharePoint Permissions Replicator

    CB CRM to SharePoint Permissions Replicator: CB Dynamics CRM to SharePoint Permissions Replicator closes security gaps and keeps your documents safe by automatically synchronizing Dynamics CRM privileges with SharePoint permissions.

    CERTIFY

    CERTIFY: Certify helps you optimize, control, and follow your certification process while respecting the methodologies imposed by ISO 14971 and IEC 62366 and creating standards reports in line with the expectations of testing laboratories and auditors.

    CFD Direct From the Cloud

    CFD Direct From the Cloud: CFD Direct From the Cloud is a complete platform providing OpenFOAM (open source computational fluid dynamics) and supporting software running on the latest long-term support version of Ubuntu Linux.

    Chartio

    Chartio: Chartio is a cloud-based business analytics solution designed to enable data analysts and business users in marketing, sales, operations, and finance to access, explore, transform, and visualize their data.

    ChartWise CDI

    ChartWise CDI: ChartWise CDI is an award-winning clinical documentation improvement solution integrated across the healthcare continuum that increases efficiency and reduces risk with a proven, fast ROI.

    Cloud Web Environment

    Cloud Web Environment: Genesys Cloud Web Environment is a preconfigured cloud infrastructure and Open .NET stack for developers and teams to rapidly provision a working development environment in 15 minutes and start coding.

    CloudAMQP

    CloudAMQP: CloudAMQP supplies fully managed RabbitMQ servers and clusters, ranging from virtual hosts on a shared cluster to dedicated high-performance, high-availability clusters.

    CloudCheckr - The Cloud Management Company

    CloudCheckr - The Cloud Management Company: CloudCheckr is a unified cloud governance solution for Azure offering control and clarity for leading organizations to manage and optimize their public cloud investments.

    CloudForte for Azure

    CloudForte for Azure: CloudForte for Azure enables the management and governance of Azure public cloud environments, the acceleration and adoption of Azure and Azure-native services, and cloud-native modernization and development.

    CloudMonix

    CloudMonix: CloudMonix extends Azure by providing fine-grained control over monitoring, auto-scaling, and automation (self-healing) services in a simple, powerful experience.

    ComUnity Citizen Engagement Platform

    ComUnity Citizen Engagement Platform: The ComUnity Citizen Engagement Platform simplifies the way government organizations build and operate inclusive digital ecosystems that are data driven and run on all mobile devices and digital channels.

    Conduit

    Conduit: Make data-driven decisions with Conduit from Blueprint Technologies, a lightweight and secure solution that connects data sources to any BI visualization. Conduit unifies your data analytics initiatives and enables real-time delivery of your data.

    D'Amo

    D'Amo: D’Amo is a database encryption solution suitable for any operating environment that provides secure encryption key management and addresses compliance issues.

    DataCore Cloud Replication

    DataCore Cloud Replication: DataCore Cloud Replication uses Azure as a remote replication location to safeguard your highly available data center, helping you take advantage of the scalability, agility, and cost-efficiencies of Azure.

    DayOne Collaboration Solution

    DayOne Collaboration Solution: DayOne is a user-friendly, intuitive portal based on a Windows service running on Azure that connects to and synchronizes objects between Office 365 tenants you target.

    Derdack Enterprise Alert

    Derdack Enterprise Alert: Derdack Enterprise Alert automates alert notification processes and response workflows, enabling fast, reliable, and effective response to incidents threatening the continuity of services and operations.

    DEvent SaaS

    DEvent SaaS: DEvent is an innovative modeling solution that allows insurance companies to efficiently calculate their reserves with respect to Solvency II, ORSA, or IFRS 17.

    DeviceHive

    DeviceHive: DeviceHive provides the communication layer, control software, and multi-platform libraries to bootstrap development of smart energy, home automation, remote sensing, telemetry, remote control and monitoring software, and more.

    Digital Signup

    Digital Signup: Digital Signup is a cloud-based class registration solution designed for educational organizations that have one or more workarounds associated with their class registration, enrollment, and payment processing programs.

    DisasterNet

    DisasterNet: DisasterNet is a tool to easily unify and share information in the event of a disaster. Consolidate and centralize disaster information reported from various means and sources to help prevent delayed decision making. This application is available only in Japanese.

    Doberman Managed Service

    Doberman Managed Service: Emyode Doberman enables you to retain control over your business applications’ rapid evolution despite the exponential increase in technological complexity. This application is available only in English and French.

    Docker Engine - Enterprise for Azure

    Docker Engine - Enterprise for Azure: Docker Engine - Enterprise for Azure bootstraps all recommended infrastructure to start using Docker on Azure automatically so that users don't need to think about rolling their own instances, setting up security groups, or load balancing.

    Dyadic EKM Server Image

    Dyadic EKM Server Image: Dyadic Enterprise Key Management (EKM) lets you manage and control keys in any application deployed on Azure. It is easy to deploy and maintain and delivers greater levels of security and control for your crypto keys in the cloud.

    Dynatrace OneAgent

    Dynatrace OneAgent: Dynatrace OneAgent provides immediate information on application, infrastructure, and service problems by analyzing how application services interact and visualizing the impact on overall application performance.

    easydoct

    easydoct: easydoct is a web platform for managing appointments and boosting efficiency and productivity in healthcare organizations. This application is available only in French.

    EJBCA Enterprise Cloud Edition

    EJBCA Enterprise Cloud Edition: EJBCA Enterprise Cloud Edition is a powerful certificate issuance and management system for issuing and enabling full lifecycle control of digital certificates and certificate authorities, registration authorities, and validation authorities.

    Enterprise Password Vault

    Enterprise Password Vault: Employing the latest Azure technology, Enterprise Password Vault (EPV) securely stores your enterprise passwords with the ability to control permission levels within your organization. EPV also audits access to your security resources.

    Essentia

    Essentia: Essentia is a big data toolkit that provides improved data visibility, management, and analytics of cloud-based data lakes. Use cases include log analytics, web analytics, Internet of Things analytics, and hybrid-cloud data visibility.

    FidelityEHR

    FidelityEHR: FidelityEHR facilitates communication and sharing of information between team members; assists in the supervision and evaluation of staff performance; and provides real-time monitoring of costs, services, implementation, and outcomes.

    flexHR

    flexHR: flexHR on Azure delivers comprehensive "hire to retire" functions for human capital management with insightful analytics such as wage versus revenue, output per man-hour trend, and more.

    Gomedisys

    Gomedisys: Gomedisys helps centralize, organize, and manage the operational and administrative information of health service providers (clinics, hospitals, medical centers, doctors' offices, etc.), at any level of complexity. This application is available only in Spanish.

    GrowthEnabler Personalized Intelligence

    GrowthEnabler Personalized Intelligence: The GrowthEnabler AI- and algorithm-based global intelligence platform provides enterprises a fast way to leverage disruptive technologies, business models, and digital product innovations born out of the global startup economy.

    H2O.ai Sparkling Water for HDInsight

    H2O.ai Sparkling Water for HDInsight: With Sparkling Water, users can drive computation from Scala/R/Python and utilize the H2O Flow UI, providing an ideal machine learning platform for application developers.

    Habilect

    Habilect: Habilect is a multifunctional medical system that provides biomechanical diagnostics of body movements to prescribe therapy exercises and motivate patients both in the clinic and at home.

    Hanu Managed Azure

    Hanu Managed Azure: Get peace of mind and make the most of your Azure investment with Hanu Managed Azure services, including 24/7 proactive monitoring and notification, configuration management, and security management.

    HEDDA.IO

    HEDDA.IO: HEDDA.IO enables you to build a knowledge base and use it to perform a variety of critical data quality tasks, including the correction, enrichment, and standardization of data.

    HERE Maps & Location Services

    HERE Maps & Location Services: This template deploys HERE’s enterprise-grade maps and location services as serverless functions.

    HERE Maps & Location Services for Data Streams

    HERE Maps & Location Services for Data Streams: This template deploys HERE’s enterprise-grade maps and location services as serverless functions with an Azure Event Hub and Azure Cosmos DB used in real-time data streaming and processing for data-intensive applications.

    HERE Maps & Location Services for WebApp backends

    HERE Maps & Location Services for WebApp backends: This template deploys HERE’s enterprise-grade maps and location services as serverless functions with a service bus and Azure Cosmos DB used as data processing back-ends for web applications.

    Hinemos ver. 6.1.2 with Cloud Management Feature

    Hinemos ver.6.1.2 with Cloud Management Feature: Hinemos is an integrated operations management solution that provides the features you need to manage complex IT systems.

    Human Capital Management Software Suite

    Human Capital Management Software Suite: The CatalystOne Human Capital Management Software Suite is ideal for midsize to large organizations and supports processes throughout the entire employee lifecycle.

    Humanify Insights Platform

    Humanify Insights Platform: The Humanify Insights Platform is a cloud-based customer and employee data platform that provides brands with a 360-degree view of customers’ needs, behaviors, and preferences.

    Hydra Holistic Conservation

    Hydra Holistic Conservation: Hydra Holistic Conservation helps you transform data into value across the entire conservation community, including park managers, security teams, tourism departments, ecologists and scientists, anti-poaching units, and more.

    Hyperglance - Automatic cloud architecture diagram

    Hyperglance: Automatic cloud architecture diagram: Hyperglance provides complete Azure cloud visibility with a built-in rules engine. See identified resources and their dependencies on a real-time, interactive Azure diagram that is scalable and easy to understand.

    ICE - Hybrid Cloud Advisory Service Assessment UAE

    ICE: Hybrid Cloud Advisory Service Assessment UAE: Intertec will divide your application ecosystem into private managed, Azure reserved instances, VM scale sets, app scale sets, app services, and DevOps sandbox environments for cost savings and a comprehensive migration.

    ICE - NetApp (BYOL) Storage on Azure as a Service

    ICE: NetApp (BYOL) Storage on Azure as a Service: Intertec is pleased to present storage architecture in partnership with NetApp that gives you enterprise-grade data management features and cost savings along with the financial flexibility and reliability of Azure.

    ICE - Remote Workplace Suite

    ICE: Remote Workplace Suite: Intertec's ICE: Remote Workplace Suite helps improve business efficiency while ensuring full compliance on data retention.

    Icinga 2

    Icinga 2: Shadow-Soft Icinga 2 master is community code built on CentOS, installed and configured to monitor directly connected Linux and Windows hosts.

    Iguazio Data Science Platform

    Iguazio Data Science Platform: The Iguazio platform streamlines data science to production and drives fast time to value for application development based on machine learning, enabling data scientists to focus on delivering better and more powerful solutions.

    Imperva-Incapsula

    Imperva-Incapsula: Imperva Incapsula secures and accelerates websites by bolstering the built-in security features of Azure with a web application firewall, DDoS mitigation, and intelligent bot protection for any website or application in a pure or hybrid Azure environment.

    Improvement Platform-Nuventive

    Improvement Platform-Nuventive: The Improvement Platform utilizes Microsoft Power BI and Office 365 as key architectural components. It is designed to help higher education communities plan more consistently, collaborate more effectively, and make smarter decisions.

    Info ASPEN

    Info ASPEN: Info ASPEN is a log data anonymization and data storage service on Azure that provides organizations with a simple solution for GDPR requirements for log data anonymization, pseudonymization, and log storage.

    Informatica Big Data Management 10.2.2 BYOL

    Informatica Big Data Management 10.2.2 BYOL: Informatica Big Data Management provides data management solutions to quickly and holistically integrate, govern, and secure big data for your business.

    Informatica Enterprise Data Catalog 10.2.2 BYOL

    Informatica Enterprise Data Catalog 10.2.2 BYOL: Informatica Enterprise Data Catalog provides a machine-learning-based discovery engine to scan and catalog data assets across the enterprise – across clouds, on-premises, and big data anywhere.

    Intellicus BI Server V18.1 (100 Users)

    Intellicus BI Server V18.1 (100 Users): Intellicus BI Server on Azure is an end-to-end self-service BI platform that provides advanced reporting and analytics capabilities; a semantic layer; and integrated extract, transform, and load capabilities.

    Intellicus BI Server V18.1 (50 Users)

    Intellicus BI Server V18.1 (50 Users): Intellicus BI Server on Azure is an end-to-end self-service BI platform that provides advanced reporting and analytics capabilities; a semantic layer; and integrated extract, transform, and load capabilities.

    Ishlangu Load Balancer ADC BYOL

    Ishlangu Load Balancer ADC BYOL: The Ishlangu Load Balancer ADC offers advanced application delivery and security for organizations using Azure. Ishlangu is simple to deploy, easy to manage, and provides instant results.

    Jenkins With Ubuntu Server 18.04 LTS

    Jenkins With Ubuntu Server 18.04 LTS: Jenkins is an open source automation server written in Java that helps automate the non-human parts of the software development process through continuous integration and by facilitating the technical aspects of continuous delivery.

    KACE Asset Management Appliance

    KACE Asset Management Appliance: The KACE Asset Management Appliance provides comprehensive, flexible software and hardware asset management capabilities across a variety of platforms and helps you play a strategic role in supporting cost controls.

    Kemp 360 Central (BYOL)

    Kemp 360 Central (BYOL): Kemp 360 Central application delivery management and analytics offers a single point to manage your application delivery infrastructure.

    Kinetica Active Analytics Platform

    Kinetica Active Analytics Platform: Kinetica is a GPU-accelerated, in-memory analytics database that delivers real-time response to queries on large, complex, and streaming data sets. Its fully distributed architecture and simplified data structures lead to more predictable scale-out.

    LANSA Scalable License

    LANSA Scalable License: LANSA Scalable License provides users with a machine image and an Azure Resource Manager template for constructing a production-ready Windows stack to deliver LANSA web, mobile, and desktop capabilities.

    Learn It Live - Population Health and Wellness

    Learn It Live - Population Health and Wellness: Learn It Live is a complete health and wellness learning platform for HR/benefits departments, health-focused organizations, insurance companies, and content distribution networks to leverage for their members.

    Linux Chef Extension (1.2.3)

    Linux Chef Extension (1.2.3): Install the Chef client on your virtual machine and bootstrap to your Chef server.

    Magazine

    Magazine: Mavention Magazine is an easy way to create a digital magazine for your customers and employees. The platform is based on SharePoint as its content management system.

    Magento on Ubuntu powered by IAANSYS

    Magento on Ubuntu powered by IAANSYS: Magento is a feature-rich e-commerce platform built on open source technology that provides online merchants with flexibility and control over the look, content, and functionality of their e-commerce store.

    Mail2Cloud Archive & Collaboration

    Mail2Cloud Archive & Collaboration: Mail2Cloud Archive & Collaboration provides manual or automatic capture of emails (message and attachments) from one or more accounts in an organization to cloud storage.

    Mailjet Email Service

    Mailjet Email Service: Mailjet's RESTful API and intuitive toolkit give developers and marketers a scalable cloud-based email infrastructure that includes sending, deliverability, and comprehensive analytics to help get the most out of each email, recipient, and inbox.

    Managed cPanel Server

    Managed cPanel Server: Media3 Managed cPanel Server provides a fully managed Linux server with the latest version of cPanel installed, running CentOS 7 as the operating system.

    Master Data Management - Maestro Server V6

    Master Data Management: Maestro Server V6: The Profisee Maestro Server V6 virtual machine delivers all the power of Profisee Base Server with Golden Record Management, SDK, workflows, and integrator.

    Matillion ETL for Snowflake

    Matillion ETL for Snowflake: Expect a significant reduction in development time and maintenance with Matillion’s intuitive UI and data transformation approach designed to take full advantage of Snowflake’s storage capacity and multi-clustered processing power.

    Mavention Make

    Mavention Make: Create new sites and groups in Office 365 based on templates and update existing SharePoint sites and groups with Mavention Make. This application is available only in Dutch.

    Mavention Screen

    Mavention Screen: Mavention Screen uses SharePoint as a content management solution, making it easy to display existing information on screens while narrowcasting without the need to re-enter content. This application is available only in Dutch.

    Mavention Workspace

    Mavention Workspace: Bring all your Microsoft Teams and SharePoint sites and groups to one place with Mavention Workspace to give users a clear overview of all their collaboration spaces. This application is available only in Dutch.

    ME Password Manager Pro 100 admins, 25 keys

    ME Password Manager Pro 100 admins, 25 keys: ManageEngine Password Manager Pro is a web-based privileged identity management solution that lets you manage privileged identity passwords, SSH keys, and SSL certificates from a simple, consolidated platform.

    ME Password Manager Pro 25 admins, 25 keys

    ME Password Manager Pro 25 admins, 25 keys: ManageEngine Password Manager Pro is a web-based privileged identity management solution that lets you manage privileged identity passwords, SSH keys, and SSL certificates from a simple, consolidated platform.

    Meetingsbooker Simple Meetings

    Meetingsbooker Simple Meetings: Meetingsbooker Simple Meetings moves booking and tracking meeting spend from an offline, labor-intensive process to a fast online solution, transforming the way your company books venues and saves on meetings.

    Mendix Pro

    Mendix Pro: Mendix's visual, model-driven development approach enables more users, from professional developers to citizen developers, to rapidly build web and mobile applications.

    M-Files

    M-Files: M-Files provides a next-generation intelligent information management platform that improves business performance by helping people find and use information more effectively.

    Migrato Intelligent Content Classifier

    Migrato Intelligent Content Classifier: The Migrato Intelligent Content Classifier enables you to classify scanned documents, documents on shares, and more. It uses AI and machine learning to allow for minimal user intervention. This application is available only in the Netherlands.

    mijin BaaS on Azure 2

    mijin BaaS on Azure 2: The mijin blockchain platform creates a secure data-sharing environment with high performance, zero downtime, and unfalsifiable data while reducing the cost of conventional infrastructure.

    miyabi

    miyabi: The miyabi blockchain platform detects abnormal transactions before block generation, thereby enhancing the security as a distributed ledger. This application is available only in Japanese.

    MyGet - Hosted NuGet, NPM, Bower, Maven and Vsix

    MyGet - Hosted NuGet, NPM, Bower, Maven and Vsix: MyGet provides hosted NuGet, NPM, Bower, Maven, and VSIX feeds for individual developers, open source projects, and corporate development teams.

    Netgate TNSR Secure Network Platform

    Netgate TNSR Secure Network Platform: TNSR is a high-performance, software-based packet processing platform that enables a broad array of secure networking use cases. It features a highly scalable open source firewall, router, and VPN platform.

    NVIDIA Quadro Virtual Workstation - WinServer 2019

    NVIDIA Quadro Virtual Workstation - WinServer 2019: Quickly spin up a VM running on Windows Server 2019 and easily configure it with the NVIDIA GPU instance, vCPU, memory, and storage you need without having to purchase physical hardware and infrastructure.

    OFAC Search API

    OFAC Search API: OFAC Search API offers fast, flexible, and accurate SDN name search through a simple API. Results can be formatted in XML or JSON.

    Official MEAN Machine - Mean.io app out-of-the-box

    Official MEAN Machine - Mean.io app out-of-the-box: This leading web development framework allows for rapid set up and deployment of web apps, websites, web services, and APIs, offering a set of hand-picked tools enabling developers to focus on development.

    OpenFrame

    OpenFrame: OpenFrame is a complete mainframe rehosting solution that lets you take (lift) your existing mainframe assets and move (shift) them to an open system platform quickly and with minimal risk.

    OPNsense

    OPNsense Firewall/Router/VPN/IDPS: OPNsense is a fully featured security platform that secures your network with features such as inline intrusion prevention, virtual private networking, two-factor authentication, captive portal, and filtering web proxy.

    OpsLogix OMS Oracle Collector

    OpsLogix OMS Oracle Collector: Consolidate Oracle monitoring, auditing, and log analytics into Microsoft Operations Management Suite with the OpsLogix OMS Oracle Collector for a holistic view of your Oracle environment.

    OutSystems Azure AD

    OutSystems Azure AD: Enable enterprise single sign-on throughout all OutSystems applications in your organization, allowing users to sign in using their organizational accounts hosted in Azure Active Directory.

    Perfect Finder

    Perfect Finder: Manage a diverse device ecosystem and reduce the work of IT personnel. Perfect Finder has three main functions: procurement, asset management, and support. This application is available only in Japanese.

    Perfion PIM - Add-in for Business Central

    Perfion PIM – Add-in for Business Central: Perfion offers product information management (PIM) fully integrated into Dynamics 365 Business Central. Work directly with real-time ERP data such as prices and inventories from the same place you manage all other product data.

    Plesk WordPress Edition Premium

    Plesk WordPress Edition Premium: Plesk WordPress Edition enables you to create a managed WordPress offer with everything you need. It is ready to deploy out-of-the-box configurations that come with preconfigured settings and our top extensions pre-installed.

    Power Analytics

    Power Analytics: Power Analytics is an online app for managers in midsize and large businesses who plan to implement a KPI-based BI system. The application provides insights and interactive dashboards based on corporate KPIs. This application is available only in Russian.

    PrestaShop Kickstart Template

    PrestaShop Kickstart Template: PrestaShop is an open source e-commerce web application that helps provide a better shopping cart experience for merchants and their customers. The solution is extendable and written in PHP to provide unlimited product customization.

    ProActive Workflows & Scheduling

    ProActive Workflows & Scheduling: ProActive Workflows & Scheduling is a job scheduler based on workflows. It enables enterprise batch processing and job scheduling to optimize the execution of all types of workloads.

    Project Hub

    Project Hub: Project Hub helps organizations structure documentation for current and past projects with a well-organized portal where customers see their projects presented visually. This application is available only in Dutch.

    ProjectTools Business Productivity Products

    ProjectTools Business Productivity Products: ProjectTools comprises Engage Plus, Empower Plus, and Optimise Plus, providing a cloud-based method of gathering project or asset data in real time and storing it for remote access and analysis.

    ptarmigan for Azure

    ptarmigan for Azure: ptarmigan is a solution that is compliant with the Lightning Network specification (BOLT) and published on Azure Marketplace for easy deployment on Azure.

    PubSub+ Cloud Enterprise Accounts

    PubSub+ Cloud Enterprise Accounts: Leading enterprises use PubSub+ Cloud messaging as a service to enable event-driven microservices, stream data from on-premises to the cloud, build IoT applications, integrate web streaming or mobile apps, and more.

    PubSub+ Cloud Standard Accounts

    PubSub+ Cloud Standard Accounts: Leading enterprises use PubSub+ Cloud messaging as a service to enable event-driven microservices, stream data from on-premises to the cloud, build IoT applications, integrate web streaming or mobile apps, and more.

    quasardb cluster

    quasardb cluster: Supporting a large selection of languages with open-source APIs, quasardb enables users to work on terabytes of data effortlessly thanks to its simple key/value model combined with instant tag-based lookups.

    Questica Budget for Healthcare

    Questica Budget for Healthcare: Questica works with hospitals and healthcare facilities to better enable data-driven budgeting and decision-making while increasing data accuracy, saving time, and improving stakeholder trust.

    Questica Budget for Higher Education

    Questica Budget for Higher Education: Questica Budget is a fully featured, multi-user web-based operating, salary, and capital budgeting and performance measurement tool that integrates seamlessly with your existing financial, HR, and student information systems.

    Questica Budget for Local Government

    Questica Budget for Local Government: By seamlessly integrating with existing financial, ERP, and other systems, Questica Budget is an easy-to-use, comprehensive, and collaborative cloud-based solution for operating, capital, and salary budget preparation and management.

    Questica Budget for Primary & Secondary Edu K-12

    Questica Budget for Primary & Secondary Edu / K-12: Questica’s multi-user school budgeting and performance management software simplifies the assembly, tracking, analyzing, and reporting of operating, salary, and capital budgets.

    RancherOS

    RancherOS: RancherOS is a Linux distribution that runs the Docker daemon as process ID 1 and all services as system containers.

    RapidMiner Server

    RapidMiner Server: RapidMiner Server makes it easy to share, reuse, and operationalize models and results created in RM Studio. Its central repository, dedicated computational power, and flexible deployment options support analytic teamwork and help rapidly put results into action.

    RavenHQ

    RavenHQ: RavenHQ is the official RavenDB hosting platform. Its fully managed cloud of RavenDB servers and scalable plans mean you won’t have to worry about installation, updates, availability, performance, security, or backups again.

    Raygun

    Raygun: Raygun provides effortless error tracking for your applications, helping you find problems and fix bugs faster. With just a line or two of code, Raygun works with all major mobile and web programming languages in a matter of minutes.

    Riverbed SteelCentral Controller for SteelHead

    Riverbed SteelCentral Controller for SteelHead: Manage, configure, and monitor Riverbed products from a single web application and centralize enterprise-level policy management and reporting for greater control over global network deployments.

    Safetica DLP

    Safetica DLP: Safetica is a cost-effective, easy-to-use data loss prevention (DLP) solution. It protects sensitive data and covers all major data leak channels (email, cloud, external devices, websites, social media, clipboard, and printer media).

    Saviynt for Azure

    Saviynt for Azure: Saviynt's comprehensive cloud identity and access governance solution secures all data, infrastructure, and critical application assets under a single, unified platform.

    SendGrid Email Delivery

    SendGrid Email Delivery: SendGrid's proven platform successfully delivers over 18 billion transactional and marketing related emails each month for Internet and mobile-based customers as well as more traditional enterprises.

    Signature 365

    Signature 365: Signature 365 is a marketing tool that offers efficient central management of employee e-signatures in Microsoft Outlook, with all employee information stored and selected from Active Directory.

    Sitecore Managed Service on Azure

    Sitecore Managed Service on Azure: RDA’s Sitecore Managed Service on Azure is a comprehensive offering that keeps your digital solutions performing at the highest levels, allowing you to focus on providing exceptional experiences to your customers.

    Solix Common Data Platform

    Solix Common Data Platform: The Solix Common Data Platform is a big data management solution for modern data-driven enterprises. It leverages leading open source big data technologies to help companies better manage and process structured and unstructured data.

    SureSpot

    SureSpot: SureSpot allows motorists to easily make reservations at parking facilities without cash or waiting in long lines. Motorists can reserve a parking spot at any of SureSpot's participating facilities and rest assured that they will have a place to park when they arrive.

    System Answer

    System Answer: Targeting the education sector, System Answer features centralized management of hybrid cloud environments, problem analysis/predictive detection, and security measures including integrated log management. This application is available only in Japanese.

    TAAP Healthcare - Discharge Management System

    TAAP Healthcare - Discharge Management System: The TAAP Healthcare - Discharge Management System removes paper and spreadsheets, allows data to be captured in real time, and enables healthcare professionals to make actionable decisions based on real-time data.

    Taxcel BI App

    Taxcel BI App: The Taxcel BI web app automatically converts SPED files to dashboards without the need for integration with your ERP system, helping you gain insight into your business through Power BI visualizations. This application is available only in Portuguese.

    Teradata Server Management

    Teradata Server Management: Teradata Server Management monitors Teradata Vantage instances and generates alerts related to database and operating system errors and operational state changes, increasing efficiency in Teradata environment server management.

    Teradata Unity with IntelliSphere

    Teradata Unity with IntelliSphere: Teradata Unity is a key enabling technology for synchronization and workload routing between Teradata Vantage systems that provides active-active database availability with near-zero recovery time objectives/recovery point objectives.

    The Deep Learning Reference Stack

    The Deep Learning Reference Stack: The Deep Learning Reference Stack is an integrated, open source stack optimized for Intel Xeon Scalable platforms. This community release helps ensure AI developers have easy access to all features and functionalities of Intel platforms.

    theStore2 Retail Lab

    theStore2 Retail Lab: theStore2 is a virtual studio for fast prototyping of trade and category management concepts, consumer pretests, advanced analytics, and sales force training.

    Tibero 6 Enterprise Edition

    Tibero 6 Enterprise Edition: Tibero 6 is a high-performance, highly secure, highly scalable relational database management system for enterprises that want to fully leverage their mission-critical data.

    Tibero 6 Standard Edition

    Tibero 6 Standard Edition: Tibero 6 is a high-performance, highly secure, highly scalable relational database management system for enterprises that want to fully leverage their mission-critical data.

    Tieto Intelligent Wellbeing

    Tieto Intelligent Wellbeing: Tieto Intelligent Wellbeing is an AI-based data-driven product family for healthcare and welfare that makes decision making faster and more accurate while increasing the efficiency and reliability of health and wellbeing services.

    TmaxSoft WebtoB 5 Enterprise Edition

    TmaxSoft WebtoB 5 Enterprise Edition: The WebtoB web server delivers fast, stable processing by consuming fewer resources through optimized service request distribution (multiplexing) via a single process.

    UnderStand IoT

    UnderStand IoT: Edge UnderStand is a standalone device equipped with high quality sensors that constantly monitor and measure multiple parameters. The readings are then transmitted to a cloud-based server and are available immediately.

    UniKix Mainframe Re-hosting Software

    UniKix Mainframe Re-hosting Software: With UniKix Mainframe Re-hosting Software, you can run mission-critical online and batch workloads on cost-effective Azure platforms. The solution preserves existing application investments and extends the benefits of distributed platforms.

    UXP Security Server

    UXP Security Server: UXP Security Server acts as a gateway between an organization’s information system and the UXP infrastructure. Supporting SOAP and REST messages, it enables you to establish seamless, secure communication between organizations.

    ViewPoint Cloud

    ViewPoint Cloud: ViewPoint Cloud is cloud-based software for permitting, licensing, and code enforcement operations. The all-in-one platform offers a user-friendly experience for public applicants integrated with workflow automation for department staff.

    Vishaka - PoSH Compliance

    Vishaka - PoSH Compliance: Vishaka is a regulatory solution designed to help organizations based in India comply with India’s PoSH Act 2013 mandate. Vishaka boosts awareness through online trainings, policies, surveys, and annual compliance reports.

    VNS3 (Firewall-Router-VPN)

    VNS3 (Firewall/Router/VPN): Cohesive Networks VNS3 is a security and network routing virtual appliance built for Azure to support most Internet Protocol Security data center solutions.

    Vormetric Tokenization Server 2.3.0.400

    Vormetric Tokenization Server 2.3.0.400: Vormetric Vaultless Tokenization with Dynamic Data Masking dramatically reduces the cost and effort required to comply with security policies and regulatory mandates while making it easy to protect other sensitive data.

    WAPPLES SA

    WAPPLES SA: WAPPLES SA is a cloud-based web security solution for protecting business applications from attacks by analyzing HTTP/HTTPS traffic using the COCEP logic-based detection engine.

    WaWa Office

    WaWa Office: Targeting the education sector, WaWa Office is highly expandable groupware that uses a corporate LAN to streamline information sharing and communication. This application is available only in Japanese.

    WellLine SaaS

    WellLine SaaS: WellLine is a knowledge graph for wells that establishes and enriches asset timelines, enabling oil and gas companies to create a powerful digital knowledge layer for wells.

    whoog

    WHOOG: A management solution for staff replacement in the health sector based on employee volunteering, WHOOG is revolutionizing the management of planned or urgent replacement needs. This application is available only in French.

    Wormhole Campus

    Wormhole Campus: Distance learning and training platform Wormhole Campus is used by hundreds of companies, governments, and educational institutions to manage, develop, and market online training programs. This application is available only in Spanish.

    Xytech MediaPulse

    Xytech MediaPulse: The MediaPulse facility management system has over 30 integrated and highly configurable modules designed to support project management, rental management, asset management, and personnel scheduling.

    zaml

    ZAML: Zest Automated Machine Learning (ZAML) empowers financial institutions with the tools to build, document, and monitor machine learning credit underwriting models in production, enabling them to be compliant with federal regulations.

    Zerto

    Zerto: Zerto protects your private cloud infrastructure by deploying disaster recovery in Azure and helps create hybrid cloud environments by integrating your on-premises infrastructure with public clouds.

    Zerto Virtual Replication

    Zerto Virtual Replication: Zerto Virtual Replication protects your private cloud infrastructure by deploying disaster recovery in Azure and helps create hybrid cloud environments by integrating your on-premises infrastructure with public clouds.

    Consulting Services

    Advanced Azure Identity Management 1-Hr Briefing

    Advanced Azure Identity Management: 1-Hr Briefing: Identity management in the cloud offers a wide range of benefits. This 1-hour briefing will show you how to integrate your on-premises identity with Azure.

    Azure Cloud Roadmap + BluePrint 4-Week Assessment

    Azure Cloud Roadmap + BluePrint: 4-Week Assessment: Work with BlueSilverShift to build a customized roadmap for your journey to the Azure cloud. More than a simple cloud readiness assessment, this solution delivers a blueprint for implementation and long-term success.

    Azure Data Center Migration 1-Hr Briefing

    Azure Data Center Migration: 1-Hr Briefing: Microsoft Gold partner TWE Solutions will brief you on the pros and cons of moving infrastructure and services to the cloud, providing a high-level overview of the benefits and best practices for migration.

    Azure Disaster Recovery 1-Hr Briefing

    Azure Disaster Recovery: 1-Hr Briefing: This briefing from partner TWE Solutions provides information on the benefits of using Azure to implement a disaster recovery plan for your data and IT services, touching on cloud security and failover testing.

    AzureReady Managed Services 1-Wk Briefing

    AzureReady Managed Services: 1-Wk Briefing: Microsoft Gold partner 3Cloud will manage your Azure services with a focus on efficiency, cost control, and support. This briefing will review your infrastructure and deliver a design of your base Azure architecture.

    Citrix on Azure 5 Day Proof of Concept

    Citrix on Azure: 5 Day Proof of Concept: Coretek Services will work with you to implement or extend your use of Azure with a proof-of-concept Citrix environment. By the end of the engagement, you will have a PoC app or desktop environment to leverage for testing.

    Data Migration to Power BI 3-Day Proof of Concept

    Data Migration to Power BI: 3-Day Proof of Concept: TwoConnect can help you migrate your data to Azure and create Power BI dashboards that streamline all your sources and enable you to easily visualize and analyze data with greater speed, efficiency, and understanding.

    Data Science & Engineering 1-Hr Briefing

    Data Science & Engineering: 1-Hr Briefing: The Data Analysis Bureau offers a discovery session to help you understand how data science can drive value to your business using your Azure architecture, helping to improve operational collaboration and business efficiency.

    Data Science & Machine Learning 1 Day Workshop

    Data Science & Machine Learning: 1 Day Workshop: The Data Analysis Bureau delivers a tailored, one-day data discovery workshop to accelerate your data science and machine learning projects with your key business stakeholders.

    Device Management System 2-Weeks Assessment

    Device Management System: 2-Weeks Assessment: This offer from Microsoft Gold partner Communication Square delivers an assessment of application requirements, user personas, and business goals to help IT directors work toward a more productive user experience.

    Device Management System 2-Weeks Proof of Concept

    Device Management System: 2-Weeks Proof of Concept: This PoC from Microsoft Gold partner Communication Square delivers a production environment demo to help IT directors support a diverse mobile ecosystem, protect data, and deliver a productive user experience.

    First Line Workers System-1 Hour Briefing

    First Line Workers System-1 Hour Briefing: This briefing from Communication Square will help you better understand how to promote first line workers' remote participation in town halls, enable employees to build on the work of others, and tap into data for new insights.

    First Line Workers System-1 Week Assessment

    First Line Workers System-1 Week Assessment: This assessment by Communication Square will deliver a production environment demo of an ideal use case of First Line Workers System, a scalable Azure solution that helps connect a workforce utilizing voice, video, and chat.

    First Line Workers System-2 Week Proof of Concept

    First Line Workers System-2 Week Proof of Concept: This PoC from Communication Square delivers a fully functional, five-user setup of First Line Workers System, a productive Azure-based solution that helps connect a workforce utilizing voice, video, and chat.

    First Line Workers System-4 Week Implementation

    First Line Workers System-4 Week Implementation: This offer from Communication Square implements First Line Workers System, an Azure- and cloud-based productivity solution that helps IT directors connect to their workforce utilizing voice, video, and chat.

    HIPAA Ready Healthcare System 2-Weeks PoC

    HIPAA Ready Healthcare System: 2-Weeks PoC: This proof of concept from Communication Square shows how businesses can securely work with remote customers and define who can access data and what they can do with it. This is a fully functional setup for up to five users.

    HIPAA Ready Healthcare System 1 Week Assessment

    HIPAA Ready Healthcare System: 1 Week Assessment: This assessment from Communication Square will help a business explore HIPAA Ready Healthcare System as a solution to ensure communication and collaboration are compliant per industry-defined standards.

    Integration Scoping Workshop - 6 Day

    Integration Scoping Workshop - 6 Day: Black Marble consultants will assess existing integration interfaces, review how to move into the cloud with Azure Logic Apps, and develop a technical document outlining an adoption plan.

    IoT - AI 80 Hour Assessment in Manufacturing

    IoT / AI: 80 Hour Assessment in Manufacturing: System One Digital will visit your plant to assess areas of opportunity, summarize the ROI of an IoT/AI project, and demonstrate how Azure technologies can be used.

    Machine Learning 1-Hr Briefing

    Machine Learning: 1-Hr Briefing: This discovery session from The Data Analysis Bureau will help you understand how to utilize your Azure architecture to accelerate your machine learning projects to return value to your business and improve your business productivity.

    Mainframe Migration 6-Wk Assessment

    Mainframe Migration: 6-Wk Assessment: This assessment from Asysco enables the analysis of mainframe systems and planning for Azure migration along with key requirements and risks. You will receive a migration plan and target architecture for required Azure components.

    Managed Azure Governance 2-Day Assessment

    Managed Azure Governance: 2-Day Assessment: Blue Silver Shift will show you in two days how to achieve a stress-free Azure governance and cost containment strategy, implementation, and monthly monitoring, with a focus on maximizing the return on your investment.

    Quick Azure Diagnostic - 3-Wk assessment

    Quick Azure Diagnostic - 3-Wk assessment: This assessment by FX Innovation will help you understand the impact of Azure, ensure the right resources are in place, and find out the security mechanisms and governance approach that can maximize your investment.

    Secure Communication System 4-Week Implementation

    Secure Communication System: 4-Week Implementation: This deep, Azure-based engagement from Communication Square leads to a robust implementation of Secure Communication System, a phone solution that utilizes a cloud-based unified communication solution.

    SQL 2008 Migration Assessment - 1 to 5 days

    SQL 2008 Migration Assessment - 1 to 5 days: Emyode’s Migration Assessment is a complete review of your context, challenges, databases, and applications, creating a high-level architecture and global action plan for a successful migration.

    SQL 2008 Migration w-real scenario (PoC) 1-5 days

    SQL 2008 Migration w/real scenario (PoC) 1-5 days: On July 9, 2019, all support for SQL 2008 will end. Emyode will demonstrate a scenario of migrating your database in Azure as a proof of concept, validating and ensuring all the steps work within your context.

    SQL 2008 Migration Workshop & Demo 1 day Briefing

    SQL 2008 Migration Workshop & Demo 1 day Briefing: Understand your options and the benefits of leveraging SQL 2008 EOS and migrating to Azure with Emyode's comprehensive one-on-one workshop and demonstration of the migration process.

    Value Discovery Approach 6 weeks assessment

    Value Discovery Approach: 6 weeks assessment: The Infosys Value Discovery Approach focuses on data collection and technical discovery to baseline the mainframe portfolio for Azure migration. The objective is evaluating the landscape from business and technical aspects.

    Website-eCommerce Azure Migration 5-hour Imp

    Website/eCommerce Azure Migration: 5-hour Imp.: Atmosera will help you evaluate website workloads, performance, and architecture to develop the optimal Azure environment. You will receive a detailed report and a roadmap that highlights investment options and tradeoffs.

    Larger, more powerful standard file shares for Azure Files now in preview


    Better scale and more power for IT pros and developers.

    Azure Files has always delivered secure, fully managed cloud file shares with a full range of data redundancy options. While customers love the simplicity of Azure Files and the hybrid capabilities of Azure File Sync, until now, scaling cloud file shares beyond 5 TiB required changing the paradigm for accessing data.

    Today, we are excited to announce the preview of a larger and higher scale standard tier for Azure Files, now available to all Azure customers. This preview significantly improves your experience by increasing standard file shares’ capacity and performance limits. In select regions, standard file shares in general purpose accounts can support the following larger limits.

    Azure Files standard storage scale limits

    Azure Files Before (standard tier) New (standard tier)
    Capacity per share 5 TiB 100 TiB (20x increase)
    Max IOPS per share 1,000 IOPS 10,000 IOPS (10x increase)
    Max throughput per share Up to 60 MiB/s Up to 300 MiB/s (5x increase)

    Performance limits for a single file remain the same at 1 TiB, 1,000 IOPS, and 60 MiB/s. Standard file shares are backed by hard disk drives. If your workload is latency sensitive, you should consider the Azure Files premium tier, which is backed by solid-state drives. Visit the Azure Files scale limits documentation for more details.
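To make the new limits concrete, here is a small illustrative Python helper (the function name is ours, not part of any Azure SDK; the numbers come from the table above) that checks whether a planned workload fits within a single standard file share:

```python
# Preview limits for a single standard file share, per the table above.
STANDARD_SHARE_LIMITS = {
    "capacity_tib": 100,      # up from 5 TiB
    "iops": 10_000,           # up from 1,000
    "throughput_mib_s": 300,  # up from 60 MiB/s
}

def fits_standard_share(capacity_tib, iops, throughput_mib_s):
    """Return True if the workload fits within one standard file share."""
    limits = STANDARD_SHARE_LIMITS
    return (capacity_tib <= limits["capacity_tib"]
            and iops <= limits["iops"]
            and throughput_mib_s <= limits["throughput_mib_s"])

# A 20 TiB share pushing 5,000 IOPS at 200 MiB/s now fits on the standard tier.
print(fits_standard_share(20, 5_000, 200))   # True
# A 400 MiB/s workload exceeds the standard throughput limit.
print(fits_standard_share(20, 5_000, 400))   # False
```

A workload that fails this check is a candidate for the premium tier or for spreading across multiple shares.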

    Pricing and availability

    There is no additional charge for enabling larger and increased scale for standard file shares on your storage accounts. Refer to the pricing page for further details.

    We are introducing larger standard file shares in preview with support for locally redundant and zone-redundant storage in select regions. We are quickly expanding the region coverage; visit the region availability documentation for the latest information.

    Getting started

    1. Sign up for the preview of larger file shares (standard tier): Follow the onboarding steps in the Azure Files documentation.
    2. Create a new storage account: In your registered subscription, create a new general purpose storage account in one of the supported regions with a supported redundancy option. See detailed steps on how to create a new storage account.
    3. Create a new standard file share: Create a new standard file share under the newly created storage account. See detailed steps on how to create a standard file share.

    Frequently asked questions

    Do larger file shares work with premium file shares?

    We recently announced general availability (GA) of premium files with 100 TiB support. All premium regions support large file shares as a GA offering.

    Does Azure File Sync support larger file shares?

    Absolutely. Azure File Sync performance is a function of the number of objects (files and folders) in the sync scope, rather than of the capacity of the file share in the cloud. Our team is constantly investing in the performance and scale of sync and cloud tiering, so keep your Azure File Sync agent updated with the latest version and visit the Azure File Sync scalability targets to stay informed of the latest scale.

    What about larger file shares support for my existing standard shares?

    We are working on upgrading existing storage accounts. After the existing accounts are upgraded, you will be able to opt-in your accounts into the larger file shares feature without any disruption to your existing workloads, including Azure File Sync. Stay up to date on the latest status through the region availability section of Azure Files documentation.

    Share your experiences

    As always, you can share your feedback and experiences on the Azure Storage forum or just send us email at LFSFeedback@microsoft.com. Post your ideas and suggestions about Azure Storage on our feedback forum.

    Happy sharing!

    Enable receipt understanding with Form Recognizer’s new capability


    One of the newest members of the Azure AI portfolio, Form Recognizer, applies advanced machine learning to accurately extract text, key-value pairs, and tables from documents. With just a few samples, it tailors its understanding to supplied documents, both on-premises and in the cloud. 

    Introducing the new pre-built receipt capability

    Form Recognizer focuses on making it simpler for companies to utilize the information lying latent in business documents such as forms. Now we are making it easier to handle one of the most commonplace documents in a business, receipts, “out of the box.” Form Recognizer’s new pre-built receipt API identifies and extracts key information on sales receipts, such as the time and date of the transaction, merchant information, the amounts of taxes, totals, and more, with no training required.

    Sample receipt with extracted information from Form Recognizer’s new prebuilt receipt feature

    Streamlining expense reporting

    Business expense reporting can be cumbersome for everyone involved in the process. Manually filling out and approving expense reports is a significant time sink for both employees and managers. Aside from productivity lost to expense reporting, there are also pain points around auditing expense reports. A solution to automatically extract merchant and transaction information from receipts can significantly reduce the manual effort of reporting and auditing expenses.

    Given the proliferation of mobile cameras, modern expense reports often contain images of receipts that are faded, crumpled up, or taken in suboptimal lighting conditions. Existing receipt solutions often target high quality scanned images and are not robust enough to handle such real-world conditions.

    Enhance your expense reporting process using the pre-built capability

    Form Recognizer eases common pain points in expense reporting, delivering real value back to business. By using the receipt API to extract merchant and transaction information from receipts, developers can unlock new experiences in the workforce. And since the pre-built model for receipts works off the shelf without training, it reduces the speed to deployment.

    For employees, expense applications leveraging Form Recognizer can pre-populate expense reports with key information extracted from receipts. This saves employees time in managing expenses and travel so they can focus on their core roles. For central teams like finance within a company, it also helps expense auditing by using the key data extracted from receipts for verification. The optical character recognition (OCR) technology behind the service can handle receipts that are captured in a wide variety of conditions, including smartphone cameras, reducing the amount of manual searching and reading of transaction documents required by auditors.
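As a sketch of the pre-population scenario, the hypothetical Python helper below maps extracted receipt fields onto a draft expense line item. The field names (`MerchantName`, `TransactionDate`, `Total`, `Tax`) are illustrative placeholders, not the actual receipt API schema:

```python
def prepopulate_expense(receipt_fields):
    """Map fields extracted from a receipt onto a draft expense line item.

    `receipt_fields` stands in for the key-value output of a receipt
    extraction service; the keys used here are illustrative only.
    """
    return {
        "merchant": receipt_fields.get("MerchantName", ""),
        "date": receipt_fields.get("TransactionDate", ""),
        "amount": receipt_fields.get("Total", 0.0),
        "tax": receipt_fields.get("Tax", 0.0),
        # Flag receipts where the total could not be extracted for manual review.
        "needs_review": "Total" not in receipt_fields,
    }

draft = prepopulate_expense(
    {"MerchantName": "Contoso Cafe", "TransactionDate": "2019-06-10",
     "Total": 12.40, "Tax": 0.90})
print(draft["merchant"], draft["amount"])  # Contoso Cafe 12.4
```

The employee then only confirms or corrects the draft rather than typing every field by hand.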

    Our customer: Microsoft’s internal finance operations

    The pre-built receipt functionality of Form Recognizer has already been deployed by Microsoft’s internal expense reporting tool, MSExpense, to help auditors identify potential anomalies. Using the data extracted, receipts are sorted into low, medium, or high risk of potential anomalies. This enables the auditing team to focus on high risk receipts and reduce the number of potential anomalies that go unchecked.

    MSExpense also plans to leverage receipt data extraction and risk scoring to modernize the expense reporting process. Instead of identifying risky expenses during auditing, such automated processing can flag potential issues earlier in the process during the reporting or approval of the expenses. This reduces the turnaround time for processing the expense and any reimbursement.
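The low/medium/high sorting described above amounts to a thresholding step over an anomaly score. A minimal sketch, assuming a per-receipt score in [0, 1] and made-up cut-offs (a real system would tune both):

```python
def risk_bucket(anomaly_score, low_cutoff=0.3, high_cutoff=0.7):
    """Bucket a receipt by anomaly score so auditors can focus on high risk.

    The cut-off values are illustrative assumptions, not MSExpense's.
    """
    if anomaly_score >= high_cutoff:
        return "high"
    if anomaly_score >= low_cutoff:
        return "medium"
    return "low"

# Bucket a batch of receipts by their (hypothetical) anomaly scores.
scores = {"r1": 0.05, "r2": 0.45, "r3": 0.92}
buckets = {receipt_id: risk_bucket(s) for receipt_id, s in scores.items()}
print(buckets)  # {'r1': 'low', 'r2': 'medium', 'r3': 'high'}
```

Auditors would then review the "high" bucket first, which is what lets sampling scale from a subset to all receipts.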

    “The pre-built receipt feature of Form Recognizer enables our application not only to scale from sampling 5 percent of receipts to 100 percent, but more importantly to streamline employee expense report experience by auto-populating/creating expense transactions, creating happy path to payment, receipt data insights to approver managers, giving employees time back to be used in value-add activities for our company. The service was simple to integrate and start seeing value.“

    —Luciana Siciliano, Microsoft FinOps (MSExpense)

    Learn more

    To learn more about Form Recognizer and the rest of the Azure AI ecosystem, please visit our website and read the documentation.

    Get started by contacting us.

    For additional questions, please reach out to us at formrecog_contact@microsoft.com.

    Use Desktop Analytics and machine learning to get current and stay current


    CGG speeds geoscience insights on Azure HPC


    A leader in geosciences, CGG has been imaging the earth’s subsurface for more than 85 years. Their products and solutions help clients locate natural resources. When a customer asked whether the cloud could speed the creation of the high-resolution reservoir models that they needed, CGG turned to Azure high performance computing (HPC).

    CGG is a team of acknowledged imaging experts in the oil and gas industry. Their subsurface imaging centers are internationally acclaimed, and their researchers are regularly recognized with prestigious international awards for their outstanding technical contributions to the industry. CGG was an early innovator in seismic inversion, the technique of converting seismic survey reflection data into a quantitative description of the rock properties in an oil and gas reservoir. From this data, highly detailed models can be made, like the following image showing the classification of rock types in a 3D grid for an unconventional reservoir interval. These images are invaluable in oil and gas exploration. Accurate models help geoscientists reduce risk, drill the best wells, and forecast production better.

    The challenge is to get the best possible look beneath the surface. The specialized software is compute-intensive, and complex models can take hours or even days to render.

    Image displaying seismic model

    Figure 1. An advanced seismic model, using CGG’s Jason RockMod.

    “Azure provides the elasticity of compute that enables our clients to rapidly generate multiple high-resolution rock property realizations with Jason RockMod for detailed reservoir models.”

    - Joe Jacquot, Strategic Marketing Manager, CGG GeoSoftware

    The challenge: prove it in the cloud

    An oil and gas company in Southeast Asia asked CGG to work with them to find the optimum way to deploy reservoir characterization technology for their international team. The customer also wondered if they could take advantage of the cloud to save on hardware costs in their datacenter, where they already ran leading-edge CGG geoscience software solutions. They knew the theory of the cloud’s elasticity and on-demand scalability and were familiar with its use in oil exploration, but they didn’t know if their applications would perform as well in the cloud as they did on-premises.

    In a rapidly moving industry, performance gains translate to speedier decision making and accelerated exploration and development timelines, so the company asked CGG to provide a demonstration. As geoscience experts, CGG wanted to show their software in the best possible light. They turned to the customer advisors at AzureCAT (led by Tony Wu) for help creating a proof of concept that demonstrated how cloud capacity could help the company get insights faster.

    Demo 1: CGG Jason RockMod

    CGG Jason RockMod overcomes the limitations of conventional reservoir characterization solutions with a geostatistical approach to seismic inversion. The outcome is reservoir models accurate and detailed enough that geoscientists can use them to predict field reserves, fluid flow patterns, and future production.

    The company wanted to better understand the scale of the RockMod simulation in the cloud, and wondered whether the cloud offered any real benefits compared to running the simulations on their own workstations.

    Technical teams from CGG and AzureCAT looked at RockMod as a big-compute workload. In this type of architecture, computationally intensive operations such as simulations are split across CPUs in multiple computers (10 to 1,000s). A common pattern is to administer a cluster of virtual machines, then schedule and monitor the HPC jobs. This do-it-yourself approach would enable the company to set up their own cluster environment in Azure virtual machine scale sets. The number of VM instances can automatically scale per demand or on a defined schedule.

    RockMod creates a family of multiple equi-probable simulation outputs, called realizations. In the proof of concept tests for their customer, CGG ran RockMod on a Microsoft Windows virtual machine on Azure. It served as the master node for job scheduling and distribution of multiple realizations across several Linux-based HPC nodes in a virtual machine scale set. Scale sets support up to 1,000 VM instances or 300 custom VM images, more than enough scale for multiple realizations.

    An HPC task generates the realizations, and the speed depends more on the number of CPU cores than on the processor speed. AzureCAT recommended non-hyperthreaded cores as the best choice for application performance. From a storage perspective, the realizations are not especially demanding: during testing, the storage IO rate went up to a few hundred megabytes per second (MB/s). Based on these considerations, CGG chose the cost-effective DS14v2 virtual machine with accelerated networking, which has 16 cores and 7 GB of memory per core.

    “Azure provided the ideal blend of HPC and storage performance and price, along with technical support that we needed to successfully demonstrate our high-end reservoir characterization technology on the cloud. It was a technical success and paved the way for full-scale commercial deployment.”

    - Joe Jacquot, Strategic Marketing Manager, CGG GeoSoftware

    Diagram image of the Azure architecture of CGG Jason RockModFigure 2. Azure architecture of CGG Jason RockMod.

    Benchmarks and benefits

    A typical project is about 15 GB in size and includes more than three million seismic data traces. Some of the in-house workstations could render a single realization within a day or two, but the goal was to run 30 realizations by taking advantage of the cloud’s scalability. The initial small-scale test ran one realization on one HPC node, which took just under 12 hours to complete. That was within the target range, and now the CGG team needed to see what would happen when they ran at scale.

    To test the linear scalability of the job, they first ran eight realizations on eight nodes. The results were nearly identical to the small-scale test run; the one-realization-to-one-node formula seemed to work. For the sake of comparison, they tried running 30 realizations on just eight nodes. That didn’t work so well: the tests were nearly four times slower.

    The final test ran 30 realizations on 30 nodes, one realization to one node. The results were similar to the small-scale test, and the job was completed in just over 12 hours. This scenario was tested several times to validate the results, which were consistent. The test was a success.
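The benchmark behavior is consistent with a simple one-realization-per-node model: wall-clock time is roughly the single-realization time multiplied by the number of sequential scheduling waves, ceil(realizations / nodes). A small Python sketch of that model (the 12-hour figure comes from the tests above; the function is ours):

```python
import math

def estimated_hours(realizations, nodes, hours_per_realization=12):
    """Estimate wall-clock time when each node runs one realization at a time."""
    waves = math.ceil(realizations / nodes)  # sequential scheduling waves
    return waves * hours_per_realization

print(estimated_hours(8, 8))    # 12 hours: one wave, matches the 8-node test
print(estimated_hours(30, 8))   # 48 hours: roughly 4x slower, as observed
print(estimated_hours(30, 30))  # 12 hours: the final 30-node run
```

The model also explains why oversubscribing 30 realizations onto 8 nodes was nearly four times slower: it forces four waves instead of one.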

    Demo 2: CGG InsightEarth

    In their exploration and development efforts, the company also used CGG InsightEarth, a software suite that accelerates 3D interpretation and visualization.

    The interpretation teams at the company were using several InsightEarth applications to locate hydrocarbon deposits, faults and fractures, and salt bodies, all of which can be difficult to interpret. They asked CGG to compare the performance of the InsightEarth suite on Azure to their workstations on premises. Their projects were typically 15 GB in size, with more than three million seismic data traces.

    InsightEarth is a powerful, memory-intensive application. To get the best performance, the application must run on a GPU-based computer, and to get the best performance from GPUs, efficient memory access is critical. To meet this requirement on Azure, the team of engineers from CGG and AzureCAT looked at the specialized, GPU-optimized VM sizes that are available. The Azure NV-series virtual machines are powered by NVIDIA Tesla M60 GPUs and the NVIDIA GRID technology with Intel Broadwell CPUs, which are suited for compute-intensive visualizations and simulations.

    The GRID license gives the company the flexibility to use an NV instance as a virtual workstation for a single user, or to give 25 users concurrent access to the VM running InsightEarth. The company wanted the collaborative benefits of the second model. Unlike a physical workstation, a cloud-based GPU virtual workstation would allow them all to view the data after running a pre-processing or interpretation step because all the data was in the cloud. This would streamline their workflow, eliminating the need to move the data back on premises for that task.

    The initial performance tests ran InsightEarth on an NV24 VM on Azure. The storage IO demand was moderate, with rates around 100 MB/s. However, the VM’s memory size proved to be a bottleneck. The amount of data that could be loaded into memory was limited, and the performance wasn’t as good as the more powerful GPU setup used on premises.

    Next, the team ran a similar test using an ND-series VM. This type is specifically designed to offer excellent performance for AI and deep learning workloads, where huge data volumes are used to train models. They chose an ND24-size VM with double the memory and the newer NVIDIA Tesla P40 GPUs. This time, the results were considerably better than with the NV24.

    From an infrastructure perspective, storage also matters when deploying high-performance applications on Azure. The team implemented SoftNAS, a type of storage that supports both the Windows and Linux VMs used in the overall solution. SoftNAS also suits the lower to mid-range storage IO demands for these simulation and interpretation workloads.

    Diagram image of the Azure Architecture for InsightEarth

    Figure 3. Azure Architecture for InsightEarth

    Summary

    GPUs and memory-optimized VMs provided the performance that the company needed for both Jason RockMod and InsightEarth. Better yet, Azure gave them the scale they needed to run multiple realizations in parallel. In addition, they deployed GPUs to both software solutions, giving the CGG engineers a standard front end to work with. They set up access to the Azure resources through an easy-to-use portal based on Citrix Cloud, which also runs on Azure.

    CGG’s customer, the oil and gas company, was delighted with the results. Based on the benchmark tests, the company decided to move all of their current CGG application workload to Azure. The cloud’s elasticity and Azure’s pay-per-use model were a compelling combination. Not only did the Azure solution perform, it also proved more cost-effective than the limited scalability they could achieve with their on-premises computing power.

    Company: CGG
    Microsoft Azure CAT Technical Lead: Tony Wu

    Previewing Azure SDKs following new Azure SDK API Standards


    Today we’re happy to share a new set of libraries for working with Azure Storage,  Azure Cosmos DB, Azure Key Vault, and Azure Event Hubs in Java, Python, JavaScript or TypeScript, and .NET. These libraries provide access to new service features, and represent the first step towards applying a new set of standards across the Azure SDKs that we believe will make the libraries easier to learn and integrate into your software. You can get these libraries today from your favorite package manager, and we would love to hear your feedback on GitHub. To get started follow the instructions linked below:

    Why are we doing this?

    Much like moving software from the client or on-premises to the cloud is a paradigm shift, we too have been going through a period of rapid innovation in Azure’s capabilities and learning about how best to expose it to developers. Now that some Azure services have matured and been adopted into business-critical enterprise applications, we have been learning what patterns and practices were critical to developer productivity around these services. In addition, we’ve been listening to your feedback and we’ve made sure that our new effort has incorporated your suggestions and requests. Finally, we understand that consistency, ease of use, and discoverability are equally important to the identified patterns when it comes to working with the Azure SDKs.

    What’s different?

The team will go into much of what I am about to outline in follow-up blog posts, but to get started, the big changes came from a set of objectives we defined based on your feedback. Those were:

• Create easy-to-use APIs with productivity on par with the best libraries of the language ecosystems.
• Provide APIs that are idiomatic to the language and ecosystem they are used in.
• Evolve over time in a very compatible fashion.
• Focus as much on documentation and samples as on APIs.
• Change how we create the libraries at their core.

    Ease of use and productivity

    Productivity is a multifaceted topic on its own, but two main elements of it are consistency and usability.

    To help reach consistency, we codified the things we learned while working with Azure developers into a set of API design guidelines. The guidelines themselves are built in the open on our GitHub repository and consist of a section of principles that goes into more detail on how we approached this space, a set of general guidelines, and language specific guidelines for Java, Python, .NET, and JavaScript. By applying the guidelines we believe that these libraries will be easier to use and easier to learn. When you learn a pattern or API shape in one library you should be able to count on it being the same in others.

To help drive usability, we tweaked how we gather user feedback. We continue many of the industry's standard practices, such as releasing previews, working directly with developers on their projects, and responding to issues in community forums, but the next step for us was to usability test the libraries. For each of the libraries we are releasing today, we brought developers into a lab and had them work through different use cases while we observed them. That feedback was instrumental in shaping both the guidelines and the API shape of the libraries.

    Idiomatic libraries

    A key piece of feedback we heard while talking with developers was that our APIs didn’t always feel ergonomic in a language. To fix that, we explicitly established as one of our core principles that the libraries we author should follow the patterns of that language. In addition, as we update each service’s libraries to follow guidelines, we are ensuring that we always release libraries for those services in each of the following languages: Java, Python, JavaScript, and .NET.

    Compatibility

Compatibility has always been a value at Microsoft. Developers put significant time and money into solutions and should be able to count on them continuing to work. There is a tension here: in some cases we have had to make breaking changes to get to a better foundation. We believe aligning on that foundation will help meet the productivity goals outlined above, and once it’s set we intend to provide a high degree of compatibility. As a final note on compatibility, we’ve examined our dependencies and minimized them as much as possible to reduce future incompatibilities and versioning complexities, which should make it easier to upgrade these libraries and to use other software alongside them.

    Documentation

    Having good documentation and samples could be considered an aspect of productivity but we wanted to call it out as its own goal because many developers rate it as the top factor in choosing what technologies to use. Much like the usability studies we’re doing on the APIs themselves, we have been doing the same on Azure Quickstarts to ensure that new developers can begin to experiment with Azure Services quickly. We have also heard feedback that both API Reference code snippets and samples get out-of-date and we have been building the tooling to build API Reference code snippets and samples regularly from GitHub and publish those into documentation.

    Change how we build

    We also found we needed to change how we work with and engage the community and what we build on top of. To that end, we’ve begun work to restructure and centralize our development effort into a few key repositories:

    Finally, we built a new core library that is helping us provide common features like identity and authentication, both synchronous and asynchronous APIs, logging, error handling, networking retries and more across all libraries.

    What should I do?

    We are very pleased to be sharing the preview release of the new Azure libraries that conform to many of the principles outlined above. We would like to encourage you to download and try the new SDKs today. To help you along the way, we are providing release notes describing what’s new in each library, how to get the packages, and how to file GitHub issues specific to the previews.

    In addition to filing GitHub issues, feel free to also follow and tweet us @AzureSDK. We look forward to receiving your feedback so we can improve the libraries, and make it easier for you to create great software and solve problems with Azure.

    Printing in the cloud—Lexmark digitizes its future with a modern workplace and Microsoft 365

    June Security Release: Patches available for Azure DevOps Server and Team Foundation Server


    July Security Release: Patches available for Azure DevOps Server and Team Foundation Server


    For the July security release, we are releasing fixes for vulnerabilities that impact Azure DevOps Server 2019, TFS 2018, TFS 2017, TFS 2015, TFS 2013, TFS 2012, and TFS 2010. Thanks to everyone who has been participating in our Azure DevOps Bounty Program.

    CVE-2019-1072: remote code execution vulnerability in work item tracking

    CVE-2019-1076: cross site scripting (XSS) vulnerability in Pull Requests

    Functional bug fix: Email notifications may have incorrect dates

    Azure DevOps Server 2019.0.1 Patch 1

    If you have Azure DevOps Server 2019, you should first update to Azure DevOps Server 2019.0.1. Once on 2019.0.1, install Azure DevOps Server 2019.0.1 Patch 1.

    Verifying Installation

To verify if you have this update installed, you can check the version of the following file: [INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Server.WebAccess.VersionControl.dll. Azure DevOps Server 2019 is installed to c:\Program Files\Azure DevOps Server 2019 by default.

    After installing Azure DevOps Server 2019.0.1 Patch 1, the version will be 17.143.29019.5.
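If you would rather script the check than inspect the file by hand, a small sketch like the following works. The path is the default install location mentioned above; substitute your own install directory if it differs.

```csharp
using System;
using System.Diagnostics;

// Print the file version of the WebAccess DLL to confirm the patch level.
// The path below is the default install location for Azure DevOps Server 2019;
// adjust it to match your [INSTALL_DIR] if you installed elsewhere.
var dll = @"C:\Program Files\Azure DevOps Server 2019\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Server.WebAccess.VersionControl.dll";
Console.WriteLine(FileVersionInfo.GetVersionInfo(dll).FileVersion);
// After Patch 1 is installed, this should print 17.143.29019.5.
```

The same approach applies to the TFS patch checks below; only the DLL name and install path change.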

    TFS 2018 Update 3.2 Patch 5

    If you have TFS 2018 Update 2 or Update 3, you should first update to TFS 2018 Update 3.2. Once on Update 3.2, install TFS 2018 Update 3.2 Patch 5.

    Verifying Installation

To verify if you have this update installed, you can check the version of the following file: [TFS_INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.WorkItemTracking.Web.dll. TFS 2018 is installed to c:\Program Files\Microsoft Team Foundation Server 2018 by default.

    After installing TFS 2018 Update 3.2 Patch 5, the version will be 16.131.29019.4.

    TFS 2018 Update 1.2 Patch 5

    If you have TFS 2018 RTW or Update 1, you should first update to TFS 2018 Update 1.2. Once on Update 1.2, install TFS 2018 Update 1.2 Patch 5.

    Verifying Installation

To verify if you have this update installed, you can check the version of the following file: [TFS_INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Server.WebAccess.Admin.dll. TFS 2018 is installed to c:\Program Files\Microsoft Team Foundation Server 2018 by default.

    After installing TFS 2018 Update 1.2 Patch 5, the version will be 16.122.29017.5.

    TFS 2017 Update 3.1 Patch 6

    If you have TFS 2017, you should first update to TFS 2017 Update 3.1. Once on Update 3.1, install TFS 2017 Update 3.1 Patch 6.

    Verifying Installation

To verify if you have a patch installed, you can check the version of the following file: [TFS_INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Server.WebAccess.Admin.dll. TFS 2017 is installed to c:\Program Files\Microsoft Team Foundation Server 15.0 by default.

    After installing TFS 2017 Update 3.1 Patch 6, the version will be 15.117.29024.0.

    TFS 2015 Update 4.2 Patch 2

    If you have TFS 2015, you should first update to TFS 2015 Update 4.2. Once on Update 4.2, install TFS 2015 Update 4.2 Patch 2.

    Verifying Installation

To verify if you have a patch installed, you can check the version of the following file: [TFS_INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Framework.Server.dll. TFS 2015 is installed to c:\Program Files\Microsoft Team Foundation Server 14.0 by default.

After installing TFS 2015 Update 4.2 Patch 2, the version will be 14.114.29025.0.

    TFS 2013 Update 5 Patch 1

    If you have TFS 2013, you should first update to TFS 2013 Update 5, which you can get at https://my.visualstudio.com. Once on Update 5, install TFS 2013 Update 5 Patch 1.

    Verifying Installation

To verify if you have a patch installed, you can check the version of the following file: [TFS_INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Framework.Server.dll. TFS 2013 is installed to c:\Program Files\Microsoft Team Foundation Server 12.0 by default.

After installing TFS 2013 Update 5 Patch 1, the version will be 12.0.40681.0.

    TFS 2012 Update 4 Patch 1

    If you have TFS 2012, you should first update to TFS 2012 Update 4, which you can get at https://my.visualstudio.com. Once on Update 4, install TFS 2012 Update 4 Patch 1.

    Verifying Installation

To verify if you have a patch installed, you can check the version of the following file: [TFS_INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Framework.Server.dll. TFS 2012 is installed to c:\Program Files\Microsoft Team Foundation Server 11.0 by default.

    After installing TFS 2012 Update 4 Patch 1, the version will be 11.0.61243.0.

    TFS 2010 SP1 Patch 1

    If you have TFS 2010, you should first update to Service Pack 1, which you can get at https://my.visualstudio.com. Once on SP1, install TFS 2010 SP1 Patch 1 64-bit or TFS 2010 SP1 Patch 1 32-bit.

    Verifying Installation

To verify if you have a patch installed, you can check the version of the following file: [TFS_INSTALL_DIR]\Application Tier\Web Services\bin\Microsoft.TeamFoundation.Framework.Server.dll. TFS 2010 is installed to c:\Program Files\Microsoft Team Foundation Server 2010 by default.

    After installing TFS 2010 SP1 Patch 1, the version will be 10.0.40219.504.

    The post July Security Release: Patches available for Azure DevOps Server and Team Foundation Server appeared first on Azure DevOps Blog.

    .NET Framework July 2019 Security and Quality Rollup


    Today, we are releasing the July 2019 Cumulative Update, Security and Quality Rollup, and Security Only Update for .NET Framework.

    Security

    CVE-2019-1006 – WCF/WIF SAML Token Authentication Bypass Vulnerability

    An authentication bypass vulnerability exists in Windows Communication Foundation (WCF) and Windows Identity Foundation (WIF), allowing signing of SAML tokens with arbitrary symmetric keys. This vulnerability allows an attacker to impersonate another user, which can lead to elevation of privileges. The vulnerability exists in WCF, WIF 3.5 and above in .NET Framework, WIF 1.0 component in Windows, WIF Nuget package, and WIF implementation in SharePoint. An unauthenticated attacker can exploit this by signing a SAML token with any arbitrary symmetric key.

    This security update addresses the issue by ensuring all versions of WCF and WIF validate the key used to sign SAML tokens correctly.

    CVE-2019-1006

     

    CVE-2019-1083 – .NET Denial of Service Vulnerability

    A denial of service vulnerability exists when Microsoft Common Object Runtime Library improperly handles web requests. An attacker who successfully exploited this vulnerability could cause a denial of service against a .NET web application. A remote unauthenticated attacker could exploit this vulnerability by issuing specially crafted requests to the .NET application.

    The update addresses the vulnerability by correcting how the .NET web application handles web requests.

    CVE-2019-1083

     

    CVE-2019-1113 – .NET Framework Remote Code Execution Vulnerability

    A remote code execution vulnerability exists in .NET software when the software fails to check the source markup of a file. An attacker who successfully exploited the vulnerability could run arbitrary code in the context of the current user. If the current user is logged on with administrative user rights, an attacker could take control of the affected system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights. Users whose accounts are configured to have fewer user rights on the system could be less impacted than users who operate with administrative user rights. Exploitation of the vulnerability requires that a user open a specially crafted file with an affected version of .NET Framework. In an email attack scenario, an attacker could exploit the vulnerability by sending the specially crafted file to the user and convincing the user to open the file.

    The security update addresses the vulnerability by correcting how .NET Framework checks the source markup of a file.

    CVE-2019-1113

     

    Getting the Update

    The Cumulative Update and Security and Quality Rollup are available via Windows Update, Windows Server Update Services, Microsoft Update Catalog, and Docker.  The Security Only Update is available via Windows Server Update Services and Microsoft Update Catalog.

     

    Microsoft Update Catalog

You can get the update via the Microsoft Update Catalog. For Windows 10, .NET Framework 4.8 updates are available via Windows Update, Windows Server Update Services, and the Microsoft Update Catalog. Updates for other versions of .NET Framework are part of the Windows 10 Monthly Cumulative Update.

     

    The following table is for Windows 10 and Windows Server 2016+ versions.

Product Version | Cumulative Update

Windows 10 1903 (May 2019 Update): Cumulative Update 4506991
• .NET Framework 3.5, 4.8: Catalog 4506991

Windows 10 1809 (October 2018 Update), Windows Server 2019: Cumulative Update 4507419
• .NET Framework 3.5, 4.7.2: Catalog 4506998
• .NET Framework 3.5, 4.8: Catalog 4506990

Windows 10 1803 (April 2018 Update): Cumulative Update 4506989
• .NET Framework 3.5, 4.7.2: Catalog 4507435
• .NET Framework 4.8: Catalog 4506989

Windows 10 1709 (Fall Creators Update): Cumulative Update 4506988
• .NET Framework 3.5, 4.7.1, 4.7.2: Catalog 4507455
• .NET Framework 4.8: Catalog 4506988

Windows 10 1703 (Creators Update): Cumulative Update 4506987
• .NET Framework 3.5, 4.7, 4.7.1, 4.7.2: Catalog 4507450
• .NET Framework 4.8: Catalog 4506987

Windows 10 1607 (Anniversary Update), Windows Server 2016: Cumulative Update 4498141
• .NET Framework 3.5, 4.6.2, 4.7, 4.7.1, 4.7.2: Catalog 4507460
• .NET Framework 4.8: Catalog 4506986

Windows 10 1507: Cumulative Update 4507458
• .NET Framework 3.5, 4.6, 4.6.1, 4.6.2: Catalog 4507458

     

    The following table is for earlier Windows and Windows Server versions.

Product Version | Security and Quality Rollup | Security Only Update

Windows 8.1, Windows RT 8.1, Windows Server 2012 R2: Rollup Catalog 4507422; Security Only Catalog 4507413
• .NET Framework 3.5: Rollup 4507005; Security Only 4506977
• .NET Framework 4.5.2: Rollup 4506999; Security Only 4506964
• .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2: Rollup 4506996; Security Only 4506962
• .NET Framework 4.8: Rollup 4506993; Security Only 4506955

Windows Server 2012: Rollup Catalog 4507421; Security Only Catalog 4507412
• .NET Framework 3.5: Rollup 4507002; Security Only 4506974
• .NET Framework 4.5.2: Rollup 4507000; Security Only 4506965
• .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2: Rollup 4506995; Security Only 4506961
• .NET Framework 4.8: Rollup 4506992; Security Only 4506954

Windows 7 SP1, Windows Server 2008 R2 SP1: Rollup Catalog 4507420; Security Only Catalog 4507411
• .NET Framework 3.5.1: Rollup 4507004; Security Only 4506976
• .NET Framework 4.5.2: Rollup 4507001; Security Only 4506966
• .NET Framework 4.6, 4.6.1, 4.6.2, 4.7, 4.7.1, 4.7.2: Rollup 4506997; Security Only 4506963
• .NET Framework 4.8: Rollup 4506994; Security Only 4506956

Windows Server 2008: Rollup Catalog 4507423; Security Only Catalog 4507414
• .NET Framework 2.0, 3.0: Rollup 4507003; Security Only 4506975
• .NET Framework 4.5.2: Rollup 4507001; Security Only 4506966
• .NET Framework 4.6: Rollup 4506997; Security Only 4506963

    Docker Images

    We will be updating the following .NET Framework container images later today:

    Note: You must re-pull base images in order to get updates. The Docker client does not pull updates automatically.

    Previous Monthly Rollups

    The last few .NET Framework Monthly updates are listed below for your convenience:

    The post .NET Framework July 2019 Security and Quality Rollup appeared first on .NET Blog.

    .NET Core July 2019 Updates – 2.1.12 and 2.2.6


    Today, we are releasing the .NET Core July 2019 Update. These updates contain security and reliability fixes. See the individual release notes for details on updated packages.

NOTE: If you are a Visual Studio user, there are MSBuild version requirements, so use only a .NET Core SDK supported for your Visual Studio version; the information needed to make this choice is on the download page. If you use other development environments, we recommend using the latest SDK release.

    Security

    CVE-2019-1075: ASP.NET Core Spoofing Vulnerability

    Microsoft is releasing this security advisory to provide information about a vulnerability in ASP.NET Core 2.1 and 2.2. This advisory also provides guidance on what developers can do to update their applications to remove this vulnerability.

    Microsoft is aware of a spoofing vulnerability that exists in ASP.NET Core that could lead to an open redirect. An attacker who successfully exploited the vulnerability could redirect a targeted user to a malicious website.

    To exploit the vulnerability, an attacker could send a link that has a specially crafted URL and convince the user to click the link. The update addresses the vulnerability by correcting how ASP.NET Core parses URLs.
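The patch itself corrects URL parsing inside the framework, but a common application-level defense against open redirects is to follow only local return URLs. The following is a sketch using ASP.NET Core MVC's Url.IsLocalUrl; the controller and action names are hypothetical, not part of the advisory.

```csharp
using Microsoft.AspNetCore.Mvc;

public class AccountController : Controller
{
    // Defensive pattern (general guidance, not the patched framework code):
    // only redirect to URLs that are local to this site; anything else
    // falls back to a safe default instead of following attacker input.
    public IActionResult AfterLogin(string returnUrl)
    {
        if (!string.IsNullOrEmpty(returnUrl) && Url.IsLocalUrl(returnUrl))
            return LocalRedirect(returnUrl);      // stays on this site
        return RedirectToAction("Index", "Home"); // safe fallback
    }
}
```

LocalRedirect throws if handed a non-local URL, so it acts as a second line of defense even if the IsLocalUrl check is accidentally removed.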

    Getting the Update

    The latest .NET Core updates are available on the .NET Core download page. This update is also included in the Visual Studio 15.9.14, 16.0.6 and 16.1.6 updates. Choose Check for Updates in the Help menu.

    See the .NET Core release notes ( 2.1.12 | 2.2.6 ) for details on the release including issues fixed and affected packages.

    Support Lifecycle Update for .NET Core 1.0 and 1.1

    .NET Core 1.0 and 1.1 reached end of support on June 27, 2019 and will no longer receive updates going forward. The final updates for .NET Core 1.0 and 1.1 are 1.0.16 and 1.1.13, respectively.

    See .NET Core Support Policy to learn more about the .NET Core support lifecycle.

    Docker Images

    The following .NET container images will be updated later today.

    Note: Look at the “Tags” view in each repository to see the updated Docker image tags.

    Note: You must re-pull base images in order to get updates. The Docker client does not pull updates automatically.

    Azure App Services deployment

    Deployment of these updates on Azure App Services has been scheduled and it is expected to complete later in July 2019.

    The post .NET Core July 2019 Updates – 2.1.12 and 2.2.6 appeared first on .NET Blog.


    Migrate to Azure HDInsight in as little as 12 weeks


Recent announcements in the open source ecosystem have led customers of prominent open source analytics technology companies to explore options with Microsoft Azure. To that end, we now have new and compelling offers aimed at helping on-premises open source analytics workloads migrate to Azure. We are always looking for ways to make it easier for our customers to move to the cloud, and with this offer they can realize even greater savings and accelerate migration.

Azure HDInsight is an easy, cost-effective, fully managed and fully supported open source cloud-scale analytics service that can process massive amounts of data securely. For customers who want to migrate their on-premises analytics workloads to the cloud, we are pleased to offer the HDInsight migration accelerator program, which provides the following benefits:

1. Mitigate risk - An engagement led by Microsoft engineering, bolstered by custom accelerators, enabling rapid migration of on-premises workloads to Azure HDInsight with minimal risk.
2. Get to production fast - An accelerated timeline of 12-17 weeks, thanks to a structured engagement that runs from assessment through go-live.


3. Discounted - A limited-time offer to take advantage of end-of-financial-year discounts!

    R 3.6.1 is now available


    On July 5, the R Core Group released the source code for the latest update to R, R 3.6.1, and binaries are now available to download for Windows, Linux and Mac from your local CRAN mirror.

    R 3.6.1 is a minor update to R that fixes a few bugs. As usual with a minor release, this version is backwards-compatible with R 3.6.0 and remains compatible with your installed packages. Nonetheless, an upgrade is recommended to address issues including:

    • A fix for "lock errors" when attempting to re-install packages on Windows
    • Efficiency improvements and other fixes when using "quasi" links with Generalized Linear Models
    • Fixed axis labels for boxplots
    • writeClipboard on Windows now supports Unicode, so you can copy text with special characters from R for pasting

    See the release notes for the complete list.

    By long tradition the code name for this release, "Action of the Toes", is likely a reference to the Peanuts comic. If you know specifically which one, let us know in the comments!

    R-announce mailing list: R 3.6.1 is released

    Dealing with Application Base URLs and Razor link generation while hosting ASP.NET web apps behind Reverse Proxies


Updating my site to run on Azure

I'm quietly moving my Website from a physical machine to a number of Cloud Services hosted in Azure. This is an attempt to not just modernize the system - no reason to change things just to change them - but to take advantage of a number of benefits that a straight web host sometimes doesn't have. I want to have multiple microsites (the main page, the podcast, the blog, etc) with regular backups, CI/CD pipeline (check in code, go straight to staging), production swaps, a global CDN for content, etc.

    I'm also moving from an ASP.NET 4 (was ASP.NET 2 until recently) site to ASP.NET Core 2.x LTS and changing my URL structure. I am aiming to save money but I'm not doing this as a "spend basically nothing" project. Yes, I could convert my site to a static HTML generated blog using any number of great static site generators, or even a Headless CMS. Yes I could host it in Azure Storage fronted by a CMS, or even as a series of Azure Functions. But I have 17 years of content in DasBlog, I like DasBlog, and it's being actively updated to .NET Core and it's a fun app. I also have custom Razor sites in the form of my podcast site and they work great with a great workflow. I want to find a balance of cost effectiveness, features, ease of use, and reliability.  What I have now is a sinking feeling like my site is gonna die tomorrow and I'm not ready to deal with it. So, there you go.

Currently my sites live on a real machine with real folders and it's fronted by IIS on a Windows Server. There's an app (an IIS Application, to be clear) living at / so that means hanselman.com/ hits / which is likely c:\inetpub\wwwroot, full stop.

For historical reasons, when you hit hanselman.com/blog/ you're hitting the /blog IIS Application which could be at d:\whatever but may be at c:\inetpub\wwwroot\blog or even at c:\blog. Who knows. The Application and ASP.NET within it knows that the site is at hanselman.com/blog.

That's important, since I may write a URL like ~/about when writing code. If I'm in the hanselman.com/blog app, then ~/about means hanselman.com/blog/about. If I write /about, that means hanselman.com/about. So the ~ is a shorthand for "starting at this App's base URL." This is great and useful and makes link generation super easy, but it only works if your app knows what its server-side base URL is.

    To be clear, we are talking about the reality of the generated URL that's sent to and from the browser, not about any physical reality on the disk or server or app.

    I've moved my world to three Azure App Services called hanselminutes, hanselman, and hanselmanblog. They have names like http://hanselman.azurewebsites.net for example.

    ASIDE: You'll note that hitting hanselman.azurewebsites.net directly will hit an app that looks stopped. I don't want that site to serve traffic from there; I want it to be served from http://hanselman.com, right? Specifically, only from Azure Front Door, which I'll talk about in another post soon. So I'll use the Access Restrictions and software-based networking in Azure to deny all traffic to that site, except traffic from Azure - in this case, from the Azure Front Door reverse proxy I'll be using.

    That looks like this in this Access Restrictions part of the Azure Portal.

    Only allowing traffic from Azure

    Since the hanselman.com app will point to hanselman.azurewebsites.net (or one of its staging slots) there's no issue with URL generation. If I say / I mean /, the root of the site. If I generate a URL like "~/about" I'll get hanselman.com/about, right?

    But with http://hanselmanblog.azurewebsites.net it's different.

    I want hanselman.com/blog/ to point to hanselmanblog.azurewebsites.net.

    That means that Azure Front Door will be receiving traffic and then forwarding it on to the Azure Web App. That means:

    • hanselman.com/blog/foo -> hanselmanblog.azurewebsites.net/foo
    • hanselman.com/blog/bar -> hanselmanblog.azurewebsites.net/bar
    • hanselman.com/blog/foo/bar/baz -> hanselmanblog.azurewebsites.net/foo/bar/baz
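    That prefix-stripping is what a path-based route on a reverse proxy does before forwarding. A minimal sketch of the idea (hypothetical helper, not Front Door's actual implementation):

    ```python
    def forward(request_path: str, route_prefix: str = "/blog") -> str:
        """Map an incoming public path to the backend path, proxy-style.

        The proxy strips the route prefix before forwarding, which is exactly
        why the backend app needs PathBase to generate correct links.
        """
        if request_path == route_prefix or request_path.startswith(route_prefix + "/"):
            stripped = request_path[len(route_prefix):]
            return stripped or "/"
        return request_path  # not under this route; pass through unchanged

    print(forward("/blog/foo"))          # /foo
    print(forward("/blog/foo/bar/baz"))  # /foo/bar/baz
    print(forward("/blog"))              # /
    ```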

    There are a few things to consider when dealing with reverse proxies like this.

    Is part of the /path being removed or is a path being added?

    In the case of DasBlog, we have a configuration setting so that the app knows where it LOOKS like it is, from the Browser URL's perspective.

    My blog is at /blog so I add that in some middleware in my Startup.cs. Certainly YOU don't need to have this in config - do whatever works for you as long as context.Request.PathBase is set as the app should see it. I set this very early in my pipeline.

    That if statement is there because most folks don't install their blog at /blog, so it doesn't add the middleware.

    //if you've configured it at /blog or /whatever, set that pathbase so ~ will generate correctly
    Uri rootUri = new Uri(dasBlogSettings.SiteConfiguration.Root);
    string path = rootUri.AbsolutePath;

    //Deal with path base and proxies that change the request path
    if (path != "/")
    {
        app.Use((context, next) =>
        {
            context.Request.PathBase = new PathString(path);
            return next.Invoke();
        });
    }

    Sometimes you want the OPPOSITE of this. That would mean that I wanted, perhaps, hanselman.com to point to hanselman.azurewebsites.net/blog/. In that case I'd do this in my Startup.cs's Configure method:

    app.UsePathBase("/blog");
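    Mechanically, UsePathBase splits a matching prefix off the incoming path and moves it into PathBase, so the rest of the pipeline sees app-relative paths. A rough Python sketch of that split (illustrative, not ASP.NET Core's actual implementation):

    ```python
    def use_path_base(path: str, base: str = "/blog"):
        """Split an incoming path into (PathBase, Path), like app.UsePathBase.

        When the request path starts with the base, the base is moved into
        PathBase and the remainder becomes the path the app sees.
        """
        if path == base or path.startswith(base + "/"):
            return base, path[len(base):] or "/"
        return "", path  # no match: PathBase stays empty

    print(use_path_base("/blog/about"))  # ('/blog', '/about')
    print(use_path_base("/blog"))        # ('/blog', '/')
    print(use_path_base("/about"))       # ('', '/about')
    ```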

    Be aware that if you're hosting ASP.NET Core apps behind Nginx or Apache or really anything, you'll also want ASP.NET Core to respect X-Forwarded-For and the other X-Forwarded-* standard headers. You'll also likely want the app to refuse to speak to anything that isn't on a configured list of proxies or allowed hosts.

    I configure these in Startup.cs's ConfigureServices from a semicolon delimited list in my config, but you can do this in a number of ways.

    services.Configure<ForwardedHeadersOptions>(options =>
    {
        options.ForwardedHeaders = ForwardedHeaders.All;
        options.AllowedHosts = Configuration.GetValue<string>("AllowedHosts")?.Split(';').ToList<string>();
    });

    Since Azure Front Door adds these headers as it forwards traffic, from my app's point of view it "just works" once I've added the code above and then this in Configure():

    app.UseForwardedHeaders();
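    What UseForwardedHeaders does is conceptually simple: when the request came from a trusted proxy, rewrite the request's apparent scheme, host, and client IP from the X-Forwarded-* headers before the rest of the pipeline runs. A rough sketch of that concept (not ASP.NET Core's actual implementation):

    ```python
    def apply_forwarded_headers(request: dict, trusted_proxy: bool) -> dict:
        """Rewrite a request's apparent origin from X-Forwarded-* headers.

        Only honored when the direct peer is a trusted proxy; otherwise any
        client could spoof its own IP address or scheme.
        """
        if not trusted_proxy:
            return request
        headers = request["headers"]
        updated = dict(request)
        if "X-Forwarded-Proto" in headers:
            updated["scheme"] = headers["X-Forwarded-Proto"]
        if "X-Forwarded-Host" in headers:
            updated["host"] = headers["X-Forwarded-Host"]
        if "X-Forwarded-For" in headers:
            # The first entry is the original client; later entries are proxies.
            updated["client_ip"] = headers["X-Forwarded-For"].split(",")[0].strip()
        return updated

    req = {"scheme": "http", "host": "hanselmanblog.azurewebsites.net",
           "client_ip": "10.0.0.4",
           "headers": {"X-Forwarded-Proto": "https",
                       "X-Forwarded-Host": "hanselman.com",
                       "X-Forwarded-For": "203.0.113.9, 10.0.0.4"}}
    print(apply_forwarded_headers(req, trusted_proxy=True)["host"])  # hanselman.com
    ```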
    

    There seems to be some confusion on hosting behind a reverse proxy in a few GitHub Issues. I'd like to see my scenario ( /foo -> / ) be a single line of code, as we see that the other scenario ( / -> /foo ) is a single line.

    Have you had any issues with URL generation when hosting your Apps behind a reverse proxy?


    Sponsor: Develop Xamarin applications without difficulty with the latest JetBrains Rider: Xcode integration, JetBrains Xamarin SDK, and manage the required SDKs for Android development, all right from the IDE. Get it today



    © 2019 Scott Hanselman. All rights reserved.
         

    What’s new in Azure Firewall


    This post was co-authored by Anitha Adusumilli, Principal Program Manager, Azure Networking. 

    Today we are happy to share several key Azure Firewall capabilities, as well as updates on recent important releases into general availability (GA) and preview.

    • Multiple public IPs soon to be generally available
    • Availability Zones now generally available
    • SQL FQDN filtering now in preview
    • Azure HDInsight (HDI) FQDN tag now in preview
    • Central management using partner solutions

    Azure Firewall is a cloud native firewall-as-a-service offering which enables customers to centrally govern and log all their traffic flows using a DevOps approach. The service supports both application and network level filtering rules and is integrated with the Microsoft Threat Intelligence feed for filtering known malicious IP addresses and domains. Azure Firewall is highly available with built-in auto scaling.

    Multiple public IPs soon to be generally available

    You can now associate up to 100 public IP addresses with your firewall. This enables the following scenarios:

    • DNAT - You can translate multiple standard port instances to your backend servers. For example, if you have two public IP addresses, you can translate TCP port 3389 (RDP) for both IP addresses.
    • SNAT - Additional ports are available for outbound SNAT connections, reducing the potential for SNAT port exhaustion.
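    The SNAT benefit of multiple public IPs is simple multiplication: each public IP brings its own pool of source ports for outbound connections. A back-of-the-envelope sketch (the ports-per-IP figure is an illustrative placeholder, not the service's documented allocation):

    ```python
    def snat_pool(public_ips: int, ports_per_ip: int = 64_000) -> int:
        """Total outbound SNAT ports available across all public IPs.

        ports_per_ip is an illustrative assumption; the real per-IP allocation
        depends on the service's configuration.
        """
        return public_ips * ports_per_ip

    print(snat_pool(1))    # one IP: one port pool
    print(snat_pool(100))  # 100 IPs: 100x the pool, far less exhaustion risk
    ```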

    An image showing sample Azure Firewall public IP configuration with multiple public IPs.

    Figure one – Sample Azure Firewall Public IP configuration with multiple public IPs.

    Currently, Azure Firewall randomly selects the source public IP address to use for a connection. If you have any downstream filtering on your network, you need to allow all public IP addresses associated with your firewall. Explicit SNAT configuration is on our roadmap. See our documentation "Deploy an Azure Firewall with multiple public IP addresses using Azure PowerShell" for more information.

    Multiple public IPs GA will be available in all public regions by July 12, 2019. It is currently supported using REST APIs, templates, PowerShell and Azure CLI. Portal support will be available shortly.

    Availability Zones now generally available

    Azure Firewall can be configured during deployment to span multiple Availability Zones for increased availability. With Availability Zones, your availability increases to 99.99 percent uptime. For more information, see the Azure Firewall Service Level Agreement (SLA). The 99.99 percent uptime SLA is offered when two or more Availability Zones are selected.

    You can also associate Azure Firewall with a specific zone just for proximity reasons, using the service's standard 99.95 percent SLA.

    There's no additional cost for a firewall deployed in an Availability Zone. However, there are additional costs for inbound and outbound data transfers associated with Availability Zones. For more information, see Bandwidth pricing details.

      Image showing the creation of an Azure Firewall with 99.99 percent SLA.
    Figure two – Creating Azure Firewall with 99.99 percent SLA

     

    SQL FQDN filtering now in preview

    You can now configure SQL FQDNs in Azure Firewall application rules. This allows you to limit access from your VNets to only the specified SQL server instances. The capability is available as a preview in all Azure regions.

    Using this capability, you can filter traffic from your virtual networks (VNets) to Azure SQL Database, Azure SQL Data Warehouse, Azure SQL Managed Instance, or SQL IaaS instances deployed in your VNets.

    During preview, SQL FQDN filtering is supported in proxy mode only (port 1433). If you are using non-default ports for SQL IaaS traffic, you can configure those ports in the firewall application rules. If you are using SQL in redirect mode, which is the default for clients connecting within Azure, you can filter access using the SQL service tag as part of Azure Firewall network rules.

    SQL FQDN filtering is currently available using REST APIs, templates, and Azure CLI. Portal support will be available shortly.

    An image showing the creation of an Azure Firewall Application rule for SQL FQDN.

    Figure three – Creating Azure Firewall Application rule for SQL FQDN

    Azure HDInsight (HDI) FQDN tag now in preview

    We recently announced the availability of a FQDN Tag for Azure HDInsight (HDI). This tag is in public preview in all Azure public regions.

    VNet-deployed Azure services like HDI have outbound infrastructure dependencies on other Azure services, for example, Azure Storage. To protect your data from exfiltration risk, you might want to use Azure Firewall to restrict outbound access for HDI clusters and allow access only to your data. In addition, you also need to allow the HDI infrastructure traffic.

    FQDN tags for Azure Firewall allow services like HDI to pre-configure their infrastructure dependencies, for example, Azure Storage account FQDNs used by HDI. Instead of using network level service tags in the Azure Firewall to allow HDI outbound dependencies, you can get much more granular control to restrict outbound traffic for HDI by using the FQDN tags.

    An image showing the creation of an Azure Firewall Application rule for HDI FQDN tag.

    Figure four – Creating Azure Firewall Application rule for HDI FQDN tag

    Central management using partner solutions

    Azure Firewall public REST APIs can be used by third party security policy management tools to provide a centralized management experience for Azure Firewalls, Network Security Groups (NSGs), and network virtual appliances (NVAs).

    An image showing the logos for AlgoSec, Barracuda, and Tufin.

    Next steps

    For more information on everything we covered above, please see the following blogs, documentation, and videos.

    Azure Firewall central management partners:

    Reducing overall storage costs with Azure Premium Blob Storage


    In this blog post, we will take a closer look at pricing for Azure Premium Blob Storage, and its potential to reduce overall storage costs for some applications.

    Premium Blob Storage is Azure Blob Storage powered by solid-state drives (SSDs) for block blobs and append blobs. For more information see, “Azure Premium Blob Storage is now generally available.” It is ideal for workloads that require very fast storage response times and/or has a high rate of operations. For more details on performance see, “Premium Block Blob Storage - a new level of performance.”

    Azure Premium Blob Storage utilizes the same ‘pay-as-you-go’ pricing model used by standard general-purpose V2 (GPv2) hot, cool, and archive. This means customers only pay for the volume of data stored per month and the quantity of operations performed.

    The current blob pricing can be found on the Azure Storage pricing page. You will see that data storage pricing per gigabyte (GB) decreases for colder tiers, while the inverse is true for operations: pricing per 10,000 operations decreases for hotter tiers. Premium data storage pricing is higher than hot data storage pricing. However, read and write operation prices for premium are lower than hot read and write operation prices. This means Premium Blob Storage is meant to store data that is transacted upon frequently and is not intended for storing infrequently or rarely accessed data.

    Given the lower operations costs, is there a point where premium, not only provides better performance, but also costs less than standard (GPv2) hot?

    To answer this question, I created the graph below, which shows the relative total monthly cost of storing 1 tebibyte (TiB) of data in standard (GPv2) hot and premium, varying the operations per second performed on this 1 TiB of data, with a 70/30 split between read and write operations.

    Graph showing that premium pricing does, at a point, become more cost effective than standard.

    As you can see in the graph above, the estimated total monthly cost for premium becomes less than standard (GPv2) hot somewhere between 40 and 50 operations per second for each 1 TiB of data. This means customers with workloads with a high rate of operations will save money by using premium, even if they do not require the better performance premium provides.
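    The crossover falls out of a simple cost model: storage cost favors hot, per-operation cost favors premium, and at some operation rate the operation savings outweigh the storage premium. A sketch of that model (all prices below are illustrative placeholders, not the actual Azure rate card; see the Azure Storage pricing page for real numbers):

    ```python
    def monthly_cost(storage_gb, ops_per_sec, price_gb, read_per_10k, write_per_10k,
                     read_frac=0.7, seconds=30 * 24 * 3600):
        """Estimate monthly blob cost: storage plus blended 70/30 read/write ops.

        Prices are illustrative assumptions for demonstrating the crossover,
        not quoted Azure pricing.
        """
        ops = ops_per_sec * seconds
        blended = read_frac * read_per_10k + (1 - read_frac) * write_per_10k
        return storage_gb * price_gb + ops / 10_000 * blended

    GB = 1024  # 1 TiB of data

    # Placeholder prices: hot has cheap storage and pricier ops; premium the reverse.
    hot = lambda rate: monthly_cost(GB, rate, 0.0184, 0.0044, 0.055)
    premium = lambda rate: monthly_cost(GB, rate, 0.15, 0.0014, 0.0175)

    for rate in (10, 40, 100):
        cheaper = "premium" if premium(rate) < hot(rate) else "hot"
        print(f"{rate} ops/sec per TiB -> {cheaper} is cheaper")
    ```

    With these placeholder numbers the crossover lands near 40 operations per second per TiB, consistent with the range the post describes.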

    Next steps

    To get started with Premium Blob Storage, provision a ‘Block Blob’ storage account in your subscription and start creating containers and blobs using the existing Blob Service REST API or tools such as AzCopy or Azure Storage Explorer.

    Conclusion

    We are very excited about Azure Premium Blob Storage, its low and consistent latency, and the potential cost savings for applications with a high rate of operations. We look forward to hearing your feedback at premiumblobfeedback@microsoft.com, or feel free to share your ideas and suggestions for Azure Storage on our feedback forum. To learn more about Azure Blob Storage, please visit our product page.
