
Visual Studio CMake Support – Clang/LLVM, CMake 3.14, Vcpkg, and Performance Improvements


We’ve introduced a bunch of improvements to our CMake support in the latest preview of Visual Studio 2019 Update 1. The latest release includes Clang/LLVM support, CMake 3.14, better vcpkg integration, and many more enhancements. If you are not familiar with Visual Studio’s CMake support, check out how to get started.

Clang/LLVM Support

Visual Studio now supports Clang/LLVM for CMake projects, so you can use Clang to target Windows and remote Linux machines. Full IntelliSense is supported along with Visual Studio’s comprehensive debugging support. Check out our Clang/LLVM post for much more information.

Install the “Clang compiler for Windows” optional component as part of the “Desktop development with C++” workload.

CMake 3.14 with Performance Improvements and Visual Studio 2019 Generator

We’ve updated the version of CMake that ships with Visual Studio to 3.14. This version includes the latest MSBuild generators that support Visual Studio 2019. You can learn more about what’s new in CMake 3.14 in CMake’s release notes.

Microsoft also worked with the CMake community to bring a new way of interfacing IDEs with CMake in 3.14 that replaces CMake server mode. What this means for you is that many CMake operations (especially configuration) will be faster from now on. If you are interested in learning more, we will be writing a whole post about this soon.

Better Vcpkg Integration

In our first release of Visual Studio 2019 we introduced basic integration with vcpkg. In the latest preview we have enhanced it with better IntelliSense integration. Now, when you add a header for a missing library, we will recommend how to install it with vcpkg if it is available. There is also now IntelliSense for CMake’s “find_package” directive:

IntelliSense suggestions for CMake’s “find_package” directive.

Simpler CMake Build and Install Roots

With Visual Studio 2019 version 16.1 Preview 2 we have updated the default build and install roots for local and remote CMake projects to be more in line with the standard set by open-source CMake projects. When you create a new CMake project in Visual Studio using the “CMake Project” template, or add a new configuration to an existing project, the following build and install roots will be used. You can change these properties at any time by modifying the “Build root” and “Install root” (or “Remote build root” and “Remote install root”) properties in the CMake Settings Editor.

Local CMake Projects:

Build root: ${projectDir}\out\build\${name}
Install root: ${projectDir}\out\install\${name}

Remote CMake Projects:

Remote build root: $HOME/.vs/${projectDirName}/${workspaceHash}/out/build/${name}
Remote install root: $HOME/.vs/${projectDirName}/${workspaceHash}/out/install/${name}

All of the macros used in CMakeSettings.json can be referenced here. In these file paths:

  • ${projectDir} is the full path to the folder of the root CMakeLists.txt file
  • ${name} is the name of the configuration
  • ${projectDirName} is the name of the root directory (similar to ${projectDir} but omits the rest of the path)
  • ${workspaceHash} is a hash of the workspace location. It is used because project names are not guaranteed to be unique across directories.
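
For reference, a minimal CMakeSettings.json configuration using these defaults might look like the following sketch (the generator and configuration type shown are illustrative, not prescribed by the post):

{
  "configurations": [
    {
      "name": "x64-Debug",
      "generator": "Ninja",
      "configurationType": "Debug",
      "buildRoot": "${projectDir}\\out\\build\\${name}",
      "installRoot": "${projectDir}\\out\\install\\${name}"
    }
  ]
}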

These changes also resolve an issue with the old default build/install roots for Linux projects where you would see a mkdir failure if there was no write access to the folder (/var/tmp/…).

Changes to the Default CMake Configuration

Previously, new CMake projects in Visual Studio would create a default configuration that was never saved to disk. Now Visual Studio will create a default “x64-Debug” or “x86-Debug” configuration that matches the bitness of your machine and persists like a normal configuration.

Enhanced Customizability for Importing Existing Caches

We’ve added a new way to customize how existing caches are imported into Visual Studio. Previously, the IDE would re-run CMake with its initial settings to refresh the cache and build tree if you made changes to the CMake projects. You can now replace that behavior with any custom script or executable that drives the recreation of the CMake cache. If your project drives CMake with external scripts or custom tools, you can now fully integrate it with the IDE. Edit, build, and debug your complex codebases without having to leave the IDE and go back to the command line.

To use this feature, just set “cacheGenerationCommand” to the command line of your script or tool in your CMakeSettings.json file:

Cache generation command in CMakeSettings.json using a Python script to invoke CMake.
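
As a rough illustration, the setting is a single property on a configuration; the script name below is hypothetical and stands in for whatever tool drives CMake for your project:

{
  "configurations": [
    {
      "name": "x64-Debug",
      "cacheGenerationCommand": "python generate_cache.py"
    }
  ]
}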

Logging for Remote Connections in Visual Studio

With Visual Studio 2019 you can develop C++ for both Linux and Windows from the comfort of a single IDE. In Visual Studio 2019 version 16.1 Preview 2 we have added logging for SSH connections to remote systems. This makes it easier to diagnose issues with remote connections and provides full transparency into what commands are being executed on your remote machine. Logs include connections, all commands sent to the remote machine (their text, exit code and execution time), and writes from Visual Studio to the shell.

Logging works for any cross-platform CMake project or MSBuild based Linux project in Visual Studio.  To enable logging, navigate to Tools > Options > Cross Platform > Logging, or search for “SSH logging” in the search bar at the top of your screen.

Remote connection logging settings under Tools > Options > Cross Platform.

You can configure logging to a file or to the “Cross Platform Logging” pane in the Output Window. Note that for MSBuild based Linux projects, commands issued to the remote machine by MSBuild will not get routed to the Output Window because they are emitted out-of-proc. They are, however, logged to a file with the “msbuild_” prefix.

Send Us Feedback

Your feedback is a critical part of ensuring that we can deliver the best CMake experience.  We would love to know how Visual Studio 2019 version 16.1 Preview 2 is working for you. If you have any questions specific to CMake Tools, please reach out to cmake@microsoft.com or leave a comment. If you find any issues or have a suggestion, the best way to reach out to us is to Report a Problem.

The post Visual Studio CMake Support – Clang/LLVM, CMake 3.14, Vcpkg, and Performance Improvements appeared first on C++ Team Blog.


Clang/LLVM Support in Visual Studio


Visual Studio 2019 version 16.1 Preview 2 comes with support for Clang/LLVM out-of-the-box. Visual Studio has had great tooling for MSVC and GCC for quite a while now. The latest preview brings Clang into the fold.

Visual Studio 2019 includes out-of-the-box support for editing, building, and debugging CMake projects with Clang/LLVM. If you use MSBuild projects, however, don’t worry. Support for MSBuild based .vcxproj projects is coming soon as well. If you are not familiar with Visual Studio’s CMake support, check out how to get started.

If you are developing on Windows and targeting Clang, we hope to make Visual Studio the best IDE for that job. Please try out the preview and let us know what is working well and what isn’t.

Installing Clang/LLVM for use with Visual Studio

The “Desktop development with C++” workload in the Visual Studio installer now includes full support for targeting Clang/LLVM based toolsets. You will also need the actual compiler and tools to build your projects.

Getting Clang on Windows

On Windows, it’s easy to install the Clang tools. Just grab the “Clang compiler for Windows,” an optional component of the “Desktop development with C++” workload. This will install everything you need to develop with Clang on Windows.

Install the “Clang compiler for Windows” optional component as part of the “Desktop development with C++” workload.

You can also install your own copy of Clang/LLVM or even build it from source. If you have already installed Clang/LLVM you don’t need to install the compiler again with the Visual Studio installer. We do recommend that you use the most recent version of Clang to get the best support in the IDE. Older versions may have some limitations. Check out “Using a Custom Version of Clang” below for more information.

Getting Clang on Linux

To use Clang/LLVM on a remote Linux machine with Visual Studio, just install it with your distribution’s package manager. If ‘which clang’ finds the compiler, you are good to go.
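
For example, on a Debian or Ubuntu based distribution (adjust the commands for your distribution’s package manager):

sudo apt install clang
which clang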

Getting Started with Clang/LLVM in Visual Studio

Once you have installed Clang, using it with CMake projects is easy. Visual Studio will use the MSVC compiler by default on Windows. On Linux, it will use the distribution’s default compiler, often GCC. To use Clang instead, add a configuration and select one of the Clang presets.

Select “Manage Configurations” from the configuration drop down on the left of the play button. Then click the “Add Configuration” button:

Clang for Windows CMake configuration presets.

Another option is to modify an existing configuration with the CMake Settings editor to use any of the “clang” toolsets with the Toolset dropdown:

These toolsets will use Clang in clang-cl mode by default on Windows and link with the Microsoft STL. If you are targeting Linux, they will use Clang in GCC compatibility mode. Toolsets to build both 32-bit and 64-bit binaries are available on Windows, while the Linux toolsets will target the architecture of the remote machine. Most of this behavior can be tweaked by customizing either the CMake command line or the CMake project itself.

Edit, Build, and Debug

Once you have set up a Clang configuration, build and debug work exactly as you would expect in the IDE. Visual Studio will automatically detect that you are using the Clang compiler and provide applicable IntelliSense, highlighting, navigation, and other editing features. Building projects should just work, provided they are compatible with Clang, with any errors or warnings being directed to the Output Window.

Whether you are targeting Windows or Linux, the debugging experience with Clang should be familiar to you. Most of Visual Studio’s debugging features also work with Clang. There are just a few exceptions for compiler-dependent features (e.g., Edit and Continue). Breakpoints, memory and data visualization, and other inner development loop debugging features are available:

Using a Custom Installation of Clang

Visual Studio will automatically look for Clang in two places:

  1. (Windows) The internally installed copy of Clang/LLVM that comes with the Visual Studio installer.
  2. (Windows and Linux) The PATH environment variable.

If it doesn’t find Clang in one of those places it will offer to install it on Windows. On Linux, you will need to install it with the distribution’s package manager. Alternatively, you can tell Visual Studio to use another installation of Clang/LLVM on the machine by setting the “CMAKE_C_COMPILER” and “CMAKE_CXX_COMPILER” CMake variables in your CMake Settings:

Set “CMAKE_C/CXX_COMPILER” variables with the CMake Settings editor to use a custom Clang installation.
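As a sketch, these variables live in the “variables” section of the relevant CMakeSettings.json configuration; the install paths below are illustrative, not a required layout:

{
  "configurations": [
    {
      "name": "Clang-Debug",
      "variables": [
        { "name": "CMAKE_C_COMPILER", "value": "C:/Program Files/LLVM/bin/clang.exe" },
        { "name": "CMAKE_CXX_COMPILER", "value": "C:/Program Files/LLVM/bin/clang++.exe" }
      ]
    }
  ]
}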

Keep in mind, however, that using old versions of Clang may come with some limitations. For instance, if you are using the Microsoft STL on Windows, support is only guaranteed for the most recent version of the compiler. As we release new versions of Visual Studio and the Microsoft STL, you may need to upgrade your custom installation.

Using Clang/LLVM with MSBuild Projects

Visual Studio 2019 version 16.1 ships with out-of-the-box support for Clang/LLVM for CMake projects. Support for MSBuild based C++ projects is on the roadmap and will be available in a future release. To use Clang/LLVM today with MSBuild based C++ projects, there are third-party extensions for Visual Studio available.

Send Us Feedback

Please try out the latest preview and let us know if you have any feedback. It is always appreciated! The best way to get in touch with us about an issue or suggestion is through Developer Community with the “Report a Problem” or “Suggest a Feature” tools. This makes it easy for us to follow up and for you to get the latest updates about our progress. Feel free to comment here or send an email to cmake@microsoft.com with questions as well.

The post Clang/LLVM Support in Visual Studio appeared first on C++ Team Blog.

Open Source Artificial Pancreases will become the new standard of care for Diabetes in 2019


Loop is an open source pancreas app for iPhone.

I've been a Type 1 diabetic for over 25 years. Diabetes sucks. They actually give you an award for staying alive for years on insulin. Diabetics don't usually die of old age, they die of heart disease or stroke, kidney failure, and while they're at it they may go blind, get nerve damage, face amputation, and a bunch of other stuff. It used to be a death sentence, but when insulin was introduced as a treatment in 1921, there was a chance for something new.

The idea is if you keep your blood sugars close to normal - if you can simulate your non-working pancreas - you'll get hit by an ice cream truck! At least, that's how I hope I go. :)

  • Early on it was boiling big gauge steel needles and pork insulin to dose, and peeing on a stick to get a sense of sugar levels.
  • Then it was a dozen finger pricks a day and a half dozen manual shots with a syringe.
  • Then it was inserted continuous glucose meters and insulin pumps that - while not automatic - mean less invasive treatment and greater control.

Today, we are closing the loop. What's the loop? It's this:

  1. Consider my glucose levels, what I'm about to eat, and what I'm about to do (and dozens of other environmental factors)
  2. Dose myself with insulin
  3. GOTO 1. Every few hours, or every few minutes, depending on the situation.

I do that. Manually. Every diabetic does, and the mental pressure - the intense background psychic weight of it all - is overwhelming. We want to lower the cognitive load of diabetes. This is a disease where you may not live as long if you're not good at math. Literally. That's unfair.

The community is "looping" by allowing an algorithm to make some of those decisions for me.

I've personally been looping with an open source artificial pancreas for over two years. It's night and day from where I started with finger sticks and a half dozen needle sticks a day. It's not perfect, and it's not automatic, but an Open Source Pancreas is "Tesla autopilot for diabetes." It doesn't always park the car right or stop at every stop light, but it works very hard to keep me in between the lines and going straight ahead, and now that I have it, I can't imagine living without it.

I sleep through the night while my Loop makes tiny adjustments every five minutes to keep my sugars as flat as possible. I don't know about you but my pancreas sits on my nightstand.

It's happening and it can't be stopped

Seven years ago I wrote about The Sad State of Diabetes Technology in 2012. Three years ago The Promising State of Diabetes Technology in 2016 and last year The Extremely Promising State of Diabetes Technology in 2018. There's a great comment from the first blog post in 2012 where Howard Loop shared his frustration with the state of things. Unlike most commenters on the Internet, amazingly Howard took action and started the Tidepool Organization! Everything in his comment from 7 years ago is happening.
Great article, Scott. You've accurately captured the frustration I've felt since my 12 year old daughter was diagnosed with T1D nine months ago. She also wears a pump and CGM and bravely performs the ritual you demonstrate in your video every three days. The technology is so retro it's embarrassing.

It's 2019 and things are really looking up. The open source DIY diabetes community is thriving. There are SEVERAL open pancreas systems to choose from and there's constant innovation happening with OpenAPS and Loop/LoopKit.

  • OpenAPS runs on devices like Raspberry Pi Zeros and is a self-contained pancreas with the communications and brain/algorithm all on the main device.
  • Loop runs on an iPhone and uses a "RileyLink" device that bridges the RF (Radio Frequency) insulin pump communications with modern Bluetooth.

The first bad part is that I am running a 15-year-old, out-of-warranty, cracked insulin pump I bought on Craigslist. Most new pumps are locked down, and my old pump is the last version that supported remote control. However, the Loop open source project announced support for a second pump this week, the OmniPod Eros. This is the first time an "in warranty" pump has been supported, and it also proves the larger point made by the diabetes community: We Are Not Waiting. We want open choice and open data that put us in control.

Read about the history of Loop by original developer Nate Racklyeft. As he points out, a thing like Loop or OpenAPS is the result of a thousand little steps and innovation by countless community members who are so generous with their time.

The first system to run it was a Raspberry Pi; the code was a series of plugins, written with the help of Chris Hannemann, to the openaps toolkit developed by Ben West in collaboration with Dana Lewis and Scott Leibrand. I’m still in awe of the elegant premise in Ben’s design: a system of repeatable, recordable, and extendable transform commands, all backed by Git. The central plugin of the toolkit is decocare: Ben’s 5-year magnum opus, a reverse-engineered protocol of the Minimed Carelink USB radio to command insulin pumps.

There's an amazing write up by Pete Schwamb, one of the core members of the community who works on Loop full time now,  on how Software Defined Radios have allowed the community to "sniff" the communication protocols of insulin pumps in the RF spectrum and reverse engineer the communications for the Medtronic and now Omnipod Eros Insulin Pumps. It's a fascinating read that really illustrates how you just need the right people and a good cause and you can do anything.

You can watch my video presentation "Solving Diabetes with an Open Source Artificial Pancreas" where I offer an overview of the problem, a number of solutions offered over the years, and two open source pancreas options in the form of LoopKit and OpenAPS.

The community members and organizations like Tidepool and the Nightscout Foundation are working with the FDA to take projects and concepts like an open source pancreas system from a threat based on years of frustration to a bright future based on mutual collaboration!

In March, 2018, the FDA announced a de novo iCGM (integrated CGM) designation. A de novo designation is the FDA process for creating new device classifications, in this case moving qualifying CGMs from Class-III, the highest FDA risk classification, to Class-II with Special Controls. The first CGM to get this designation is the Dexcom G6.

Diabetic Xbox Avatar

What does this mean? It means the FDA is willing to classify continuous glucose meters in a formal way that paves a path towards interoperable devices. Today we hack devices to build these Loops with out-of-warranty pumps. We are doing this utterly on our own. It can take months to collect the equipment needed, get ancient pumps on the gray market, and compile the software yourself - which is a huge hurdle for the non-technical.

Imagine a future where someone could buy a supported and in-warranty "iPump," download an officially supported app or package, and start looping! We could have a world of open and interoperable devices and swappable algorithms.

In October of 2018 the non-profit Tidepool organization announced its intent to deliver the Loop app as a supported and FDA-regulated mobile app in the Apple App Store! This is happening, people, but we are just getting started.

To learn more, start reading.

Also, if you're diabetic, consider buying a Nightscout Xbox Avatar accessory so you can see yourself represented while you game!




Connecting Global Azure Bootcampers with a cosmic chat app


Every year on the same day, the Global Azure Bootcamp brings together cloud enthusiasts who are eager to learn about Microsoft Azure. This year, the Bootcamp will be held on April 27, 2019 in more than 300 locations worldwide where local communities will welcome developers and IT professionals for a day of workshops, labs, and talks.

It will be a great experience for tens of thousands of Azure Bootcampers all around the world to simultaneously participate in this event. We decided to add a little “cosmic touch” to the Bootcamp by letting the attendees greet each other with a chat app that’s powered by Azure Cosmos DB. The chat app will be made available on April 27, 2019 by following a link displayed in each Bootcamp location!

Azure Cosmos DB chat bot

A global database for a global event

If you think about it, this marriage just makes sense. It’s a planet-scale bootcamp, and what better technology than a globally distributed database to serve a global event?

From the beginning, we tackled this project with the goal of a great user experience. For a chat app, this means low latency in the ingestion and delivery of messages. To achieve that, we deployed our web chat over several Azure regions worldwide and let Azure Traffic Manager route users’ requests to the nearest region where our Cosmos database was deployed to bring data close to the compute and the users being served. That was enough to yield near real-time message delivery performance as we let Azure Cosmos DB replicate new messages to each covered region.

Map displaying Azure Cosmos DB regions

To minimize ingestion latency, we also configured our Cosmos database to be deployed in multi-master mode, which makes it possible to write data to any region the database is deployed to.

The Azure Cosmos DB change feed as a publication system

You expect a chat app to display new messages in real time, so these messages have to be pushed to the clients without the need for a slow and costly polling mechanism. We used Azure Cosmos DB’s change feed to subscribe to new messages being received and deliver them to each client through the Azure SignalR Service.

We also chose to consume the change feed locally in each region to further minimize delivery latency, as having a single point of dispatch would have effectively reduced the benefits we got from being globally distributed and multi-mastered.
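
As a concrete sketch, one way to wire the change feed to connected clients is an Azure Functions app combining the Cosmos DB trigger with the SignalR Service output binding. The post doesn’t say whether the team used Functions, and the database, collection, and hub names below are hypothetical:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.SignalRService;

public static class ChatRelay
{
    [FunctionName("RelayNewMessages")]
    public static async Task Run(
        // Fires whenever new documents appear on the change feed in this region.
        [CosmosDBTrigger("chatdb", "messages",
            ConnectionStringSetting = "CosmosDBConnection",
            LeaseCollectionName = "leases",
            CreateLeaseCollectionIfNotExists = true)] IReadOnlyList<Document> changes,
        // Pushes each new message to clients through Azure SignalR Service.
        [SignalR(HubName = "chat")] IAsyncCollector<SignalRMessage> signalRMessages)
    {
        foreach (var doc in changes)
        {
            await signalRMessages.AddAsync(new SignalRMessage
            {
                Target = "newMessage", // name of the client-side handler
                Arguments = new object[] { doc }
            });
        }
    }
}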

Here’s a high-level diagram showing the Azure services in play and how they interact:

Diagram showing the Azure services used to create the chat bot and how they interact

Planet-scale, batteries included

So, what does it take to build such a distributed app? Nothing much really, as Azure Cosmos DB takes care of all the rocket science required! Since all the complexity involved with global data distribution is effectively hidden from the developer, what’s left are simple concepts that are easy to reason about:

  • You write messages to the region that’s closest to you.
  • You receive messages through the change feed, whatever region they were written from.

These concepts are straightforward to implement, and it only took around a day of development time to have the chat app up and running.

Keeping it friendly with automatic moderation

As our chat app was about to be made available for anyone to use, we also had to make sure to filter out any inappropriate message that may spoil the friendly spirit of the Global Azure Bootcamp. To achieve that, we used the content validation service powering the Content Moderator APIs from Azure Cognitive Services to scan each incoming message. This turnkey solution allows us to detect potential profanity or presence of personally identifiable information, and instantly block the related messages to ensure a positive and safe experience for everyone.
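
As an illustrative sketch of that kind of check (the region, key handling, and method shape below are placeholders, not the team’s actual code), the Content Moderator text screening endpoint can be called over HTTP:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class MessageScreener
{
    private static readonly HttpClient http = new HttpClient();

    public static async Task<string> ScreenAsync(string message)
    {
        // Screen a chat message for profanity and PII before persisting it.
        var request = new HttpRequestMessage(HttpMethod.Post,
            "https://westus.api.cognitive.microsoft.com/contentmoderator/moderate/v1.0/ProcessText/Screen?classify=True");
        request.Headers.Add("Ocp-Apim-Subscription-Key",
            Environment.GetEnvironmentVariable("MODERATOR_KEY")); // key source is a placeholder
        request.Content = new StringContent(message, Encoding.UTF8, "text/plain");

        var response = await http.SendAsync(request);
        // The JSON response flags profanity terms and detected PII; the caller
        // can block the message if either is present.
        return await response.Content.ReadAsStringAsync();
    }
}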

The database for real-time, global workloads

By ensuring the lowest latency in both the ingestion and delivery of messages, the combination of Azure Cosmos DB’s global footprint, multi-master capability, and change feed API let us achieve the best experience for anyone using the chat app, wherever they are on the planet.

Want to find out how Azure Cosmos DB’s unique characteristics can help you solve your next data challenges?


Securing Azure SQL Databases with managed identities just got easier!


We are happy to share the second preview release of the Azure Services App Authentication library, version 1.2.0. This release enables simple and seamless authentication to Azure SQL Database for existing .NET applications with no code changes – only configuration changes! Up until this release, developers who wanted their existing SQL applications to use managed identities and AAD-based authentication were required to make code changes to retrieve and set the access token used for authentication. As many developers are familiar with the App Authentication library, they opted to use it to retrieve the access token, as below:

// Acquire an AAD access token for Azure SQL and attach it to the connection.
SqlConnection connection = new SqlConnection(connectionString);
connection.AccessToken = await (new AzureServiceTokenProvider()).GetAccessTokenAsync("https://database.windows.net/");
connection.Open();

Implementing and testing these code changes was often not straightforward. This was compounded further with EntityFramework, where these code changes were more difficult and error-prone for developers, as evidenced by the many external questions we see on StackOverflow. However, .NET Framework 4.7.2 added new functionality to SqlClient that enables developers to register a SQL authentication provider via an application's configuration. We've already done the work of implementing this SQL authentication provider for developers, so now the only actions they need to take are a few simple configuration changes to register our provider, as shown below.

With this release of the App Authentication library, we hope to enable many developers to try out our new functionality in existing SQL-backed solutions and gain the security benefits that the App Authentication library and managed identities afford. For more details and to try out this new functionality, please check out our new sample.

Before, using a connection string containing credentials:

<connectionStrings>
   <!-- using connection string containing credentials -->
   <add name="MyDbConnection" connectionString="Data Source=tcp:[SQL Server Name].database.windows.net;Initial Catalog=[SQL DB name];UID=[UID];Password=[pwd]" providerName="System.Data.SqlClient" />
</connectionStrings>

After, registering App Authentication’s SqlAppAuthenticationProvider:

<configSections>
   ...
   <!-- Change #1: Register the new SqlAuthenticationProvider configuration section -->
   <section name="SqlAuthenticationProviders" type="System.Data.SqlClient.SqlAuthenticationProviderConfigurationSection, System.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
</configSections>
<connectionStrings>
   <!-- Change #2: Update the connection string to remove credentials and reference the registered SQL authentication provider from change #3 -->
   <add name="MyDbConnection" connectionString="Data Source=tcp:[SQL Server Name].database.windows.net;Initial Catalog=[SQL DB Name];UID=AnyString;Authentication=Active Directory Interactive" providerName="System.Data.SqlClient" />
</connectionStrings>
<!-- Change #3: Add the new SqlAuthenticationProvider configuration section, registering the built-in authentication provider in AppAuth library -->
<SqlAuthenticationProviders>
   <providers>
     <add name="Active Directory Interactive" type="Microsoft.Azure.Services.AppAuthentication.SqlAppAuthenticationProvider, Microsoft.Azure.Services.AppAuthentication" />
   </providers>
</SqlAuthenticationProviders>

We hope you enjoy trying out our new SQL authentication provider functionality in the App Authentication library! For any questions or concerns that you may have, please reach out to us on StackOverflow.

Get ready for Global Azure Bootcamp 2019


Early this Saturday morning—while it’s still Friday in many parts of the world—people will gather at several locations in New Zealand to kick off Global Azure Bootcamp 2019 to learn about Microsoft Azure and cloud computing.

Truly global

Throughout the day, thousands more will attend these free events to expand their knowledge about Azure using a variety of formats as chosen by each location. As Saturday makes its way around the globe, more people will gather in groups of 5 to 500 at hundreds of other locations to do the same until the final locations on the west coast of North America get underway. Chances are, there’s a location near you. Check out the map to find your local event.

Global Azure Bootcamp is a 100 percent community-driven event, organized by MVPs, regional directors, and user group leaders around the world working in collaboration to deliver the largest one-day global Azure event.

Learn more about Global Azure Bootcamp from this recent episode of Azure Friday, in which Scott Hanselman spoke with Global Azure Bootcamp co-founder and Azure MVP, Magnus Mårtensson:

Get ready for Global Azure Bootcamp 2019

Science lab

This year, the organizers teamed up once again with the Instituto de Astrofisica de Canarias in Tenerife, Spain in the Canary Islands to provide those seeking a real-world challenge with a science lab to systematically analyze data from NASA’s new Transiting Exoplanet Survey Satellite (TESS) using a machine learning (ML) algorithm to help search for new planets from millions of stellar light curves.

TESS is an MIT-led NASA mission, an all-sky survey for transiting exoplanets. Transiting planets are those that go in front of the star as seen from the telescope and, to date, is the most successful discovery technique for finding small exoplanets. - TESS

You can monitor the Science Lab Dashboard to track the progress of your deployment and see how it is performing compared with other community members around the world. For more information, see The IAC, re-elected to the Science Lab of Microsoft's Azure computer network.

Sponsors

Global Azure Bootcamp organizers work closely with Microsoft to ensure both bellies and brains are well fed with free lunches at many locations. Global Azure Bootcamp is also sponsored by CloudMonix, Serverless360, JetBrains, RevDeBug, Skill Me Up, Kemp, Progate, and Enzo Online.

See you Saturday!

Scott Hanselman, Partner Program Manager at Microsoft noted, "If you've been looking for an excuse to learn about Azure, you should take advantage of this free opportunity to grab lunch while you learn about the cloud without having to travel far from home. Heck, you may even help discover a new exoplanet. That's not a bad way to spend a Saturday."

And if you can’t join us at one of the hundreds of events on Saturday, check out the Global Online Azure Bootcamp that’ll stream throughout the day on YouTube, and be sure to follow #GlobalAzure on Twitter.


Announcing Azure Backup support to move Recovery Services vaults


A Recovery Services vault is an Azure Resource Manager resource for managing your backup and disaster recovery needs natively in the cloud. Today, we are pleased to announce the general availability of the move functionality for Recovery Services vaults. With this feature you can migrate a vault between subscriptions and resource groups in a few steps, with minimal downtime and without any data loss of old backups. You can move a Recovery Services vault and retain recovery points of protected virtual machines (VMs) to restore to any point in time later.

Key benefits

  1. Flexibility to move across subscriptions and resource groups: You can move the vault across resource groups and subscriptions. This is very helpful in scenarios like the expiry of an old subscription, a move from an Enterprise Agreement to a Cloud Solution Provider subscription, organizational and departmental changes, or separation between QA and production environments.
  2. No impact to old restore points and settings in the vault: Post migration, all the settings, backup policies, and configurations in the vault are retained, including all backup and recovery points created in the past inside the vault.
  3. Restore support independent of the VM in the old subscription: You can restore from the retained backup history in the vault regardless of whether the VM was moved to the target subscription with the vault.

For step-by-step guidance on moving your Recovery Services vaults, refer to the documentation, “Move a Recovery Services vault across Azure Subscriptions and Resource Groups.”
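
As a sketch, the generic Azure CLI resource-move command can drive the same operation; the resource ID, group, and subscription values below are placeholders, and the linked documentation covers vault-specific prerequisites:

az resource move \
    --destination-group TargetResourceGroup \
    --destination-subscription-id 00000000-0000-0000-0000-000000000000 \
    --ids /subscriptions/<source-sub>/resourceGroups/<source-rg>/providers/Microsoft.RecoveryServices/vaults/<vault-name>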

Related links and additional content

Come meet Microsoft at PyCon 2019!


Next week we (the Python team here at Microsoft) will be at the PyCon conference in Cleveland, OH on May 1-9, and are looking forward to meeting you! We are excited to support this event as Keystone sponsors of PyCon for the third time, and we will be participating in all aspects of the conference. Come join our workshops, then stop by our booth to get some great swag and check out the latest in Visual Studio Code, Azure, Azure Pipelines, and Python on Windows. Lastly, come work alongside our team at the developer sprints!

Speaking of our team, this year we prepared a short video to help you get to know us, and also to know who to look for when you drop by our booth:

Keep on reading to learn more about what we’re doing at PyCon and when/where to come find us throughout the conference!

Demos, hands-on-labs, expert help and swag at the Microsoft booth

We will be at the Microsoft booth in the expo hall during the opening reception 5:30-7:30PM Thursday May 2nd, and during the conference on Friday and Saturday. We will be handing out some fun and unique swag and will have members from across our team available to chat and show you the latest in Visual Studio Code, Azure, Windows, and Azure DevOps.

There will be a lot of exciting stuff going on at our booth, so be sure to:

  • Come by our demo stations to chat with our team and get a guided tour of our products.
  • Work through hands-on labs to quickly try out the latest capabilities for yourself, and pick up an Adafruit lunchbox kit (while supplies last).
  • Pick up your Adafruit kit by completing labs before Friday afternoon, and ask for details about a special CircuitPython mini-workshop by the Adafruit team happening at our booth later that day.
  • Come to the Azure Starting Zone for expert help with setting up Visual Studio Code, Azure, and Azure Pipelines on your own machine.
  • Fill out a survey to get a one-of-a-kind t-shirt.

We will also be announcing some new features at the opening reception, so don’t miss it!

Check out our Conference Talks

We also have some of our team members giving talks during the conference, all on Sunday May 5th:

  • Python on Windows is Okay, Actually by Steve Dower at 1:50PM on Sunday in Room 26A/B/C,
  • Brett Cannon from Microsoft and Barry Warsaw from LinkedIn will be part of the Python Steering Council’s keynote presentation at 9:20PM
  • Nina Zakharenko’s keynote presentation on the joys of using Python on embedded devices and how it impacts and grows our community at 3:30PM

These talks are not affiliated in any way with Microsoft’s sponsorship of PyCon.

Scaling out Postgres with Citus Data

Microsoft recently acquired Citus Data, who is also a sponsor of PyCon this year. If you use PostgreSQL (or are considering using Postgres), be sure to talk to our new teammates at Citus Data at booth #516. They’ll have elicorn socks, stickers, and more importantly, can give you a deep dive on running distributed PostgreSQL at scale—whether as open source, as enterprise software, or as a fully-managed Database-as-a-Service.

Sponsored Workshop: Dog Classification using PyTorch, Azure, and Visual Studio Code

Attend our first sponsored workshop, Dog Classification using PyTorch, Azure, and Visual Studio Code, on Thursday May 2nd from 11AM-12:30PM in Room 13. Rong Lu will show you how to train and deploy a machine learning model to classify dog and cat breeds.

Sponsored Workshop: An Introduction to Building Scalable Web Apps with Visual Studio Code and Azure

Attend our second workshop, An Introduction to Building Scalable Web Apps with Visual Studio Code and Azure, on Thursday May 2nd from 1:30-3PM in Room 23 to learn from Nina Zakharenko how to build and deploy a Django + PostgreSQL web application.

Code beside us at the Developer Sprints

A number of Python core developers work at Microsoft and will be attending the developer sprints to work on CPython. Come join in on the fun if you are around!

Can’t make it to PyCon in Cleveland?

If you cannot make it to the conference, we cannot make that up to you, but we can share some resources that include a subset of what we will be showing at PyCon:

The post Come meet Microsoft at PyCon 2019! appeared first on Python.

Azure DevOps Labs now includes Azure DevOps Server 2019 VM and labs



Today, I am excited to announce the availability of Azure DevOps Server 2019 Virtual Machine and the corresponding self-paced labs on Azure DevOps Labs.

For the past decade, we have regularly maintained and enhanced the Hyper-V virtual machine with the latest bits, along with a growing set of hands-on labs. This combination provides an ideal set of self-guided instructional materials that anyone can pick up and use to learn all about the current state of our DevOps products and features.

And the best part is that everything is already set up and configured for you. Simply download the virtual machine and follow the illustrated steps for the scenario you want to learn about. Plus, we’re always adding and improving labs, so give us your feedback and check back often to see how our DevOps stack can improve and enhance the productivity of your software team.

Microsoft Hands-on Labs

The VM is also available on Microsoft Hands-On Labs for those who want to evaluate or try the labs without the hassle of downloading or setting up the virtual machine.

New and Updated Labs in this version:

  • Agile Planning and Portfolio Management with Azure DevOps Server 2019
  • Managing Delivery Plans with Azure DevOps Server 2019
  • Collaboration Experiences for Development Teams using Azure DevOps Server 2019
  • Getting Started with Git using Azure DevOps Server 2019
  • Getting Started with GitHub using Visual Studio 2019
  • Introduction to Build Pipelines in Azure DevOps Server 2019
  • Embracing Continuous Delivery with Release Management for Azure DevOps Server 2019
  • Package Management in Azure DevOps Server 2019
  • Making Developers More Productive with Azure DevOps Server 2019
  • Test Planning and Management with Azure DevOps Server 2019
  • Debugging with IntelliTrace in Visual Studio Enterprise 2019
  • Debugging with Snapshot Debugger
  • Diagnosing Issues in Production with IntelliTrace and Visual Studio 2019
  • Generating Unit Tests with IntelliTest using Visual Studio Enterprise 2019
  • Live Dependency Validation in Visual Studio 2019
  • Live Unit Testing, Code Coverage and Code Clone Analysis with Visual Studio 2019
  • Using Code Analysis with Visual Studio 2019 to Improve Code Quality
  • Exploratory Testing and Feedback Management using Azure DevOps Server 2019
  • Authoring ARM Templates with Visual Studio
  • Building ASP.NET apps in Azure with SQL Database
  • Working with EditorConfig in Visual Studio 2019

To find out more, including links to download the VM and the Hands-on-Labs, please check out the DevOps Server page on the Azure DevOps Labs site.

The post Azure DevOps Labs now includes Azure DevOps Server 2019 VM and labs appeared first on Azure DevOps Blog.

AI, Machine Learning and Data Science Roundup: April 2019


A monthly roundup of news about Artificial Intelligence, Machine Learning and Data Science. This is an eclectic collection of interesting blog posts, software announcements and data applications from Microsoft and elsewhere that I've noted over the past month or so.

Open Source AI, ML & Data Science News

Facebook open-sources PyTorch-BigGraph for producing embeddings for graphs where the model is too large to fit in memory. Also published: an embeddings graph for 50 million Wikipedia concepts.

Visual Studio Code expands Python support, including a new variable explorer and data viewer, improved debugging capabilities, and real-time collaboration via Live Share.

Pyright, a static type-checker for Python, available as a command-line tool and a VS Code extension.

Frequently Asked Questions about Tensorflow 2.0, now in alpha testing.

VoTT, an open source annotation and labeling tool for image and video assets, has been updated to version 2.

Industry News

Google announces AI Platform (beta), an integrated platform for developers to create AI capabilities and deploy them to GCP or on premises.

New enterprise AI solutions from Google (in beta): Document Understanding AI, Contact Center AI, and Recommendations AI for retail.

Google made a number of other AI-related announcements at Google Next, including new ML capabilities in BigQuery, AutoML Tables to generate an ML model predicting a data column with one click, updated pre-trained models for vision and natural language services, general availability of Cloud TPU v3, and more.

NVIDIA Tesla T4 GPUs are now available in Google Colab for faster training with larger models.

Microsoft News

Microsoft commits to host the world's leading environmental data sets on Azure as part of a broader sustainability initiative.

Microsoft's research in Machine Teaching: the process of augmenting models with human input rather than using data alone.

Anomaly Detector, a new Azure Cognitive Service to detect unusual values in time series data, is now in preview.

Azure Custom Vision, the transfer learning service for image classification with user-provided labels, is now generally available.

Azure Video Indexer can now be trained to recognize specific people in video from user-provided photographs.

Azure Cognitive Services are now accessible in Apache Spark workloads, via Spark ML pipelines.

You can now deploy Tensorflow models to Azure Data Box Edge with Azure ML Service and ONNX, for local inference on high-performance FPGA chips.

Azure HDInsight now supports Apache Hadoop 3.0, and updated versions of Hive, HBase, and Phoenix.

Managed MLflow is now generally available in Azure Databricks, to track experiments and manage models and projects. Microsoft has also joined the open-source MLflow project.

Power BI now offers AutoML, to enable business analysts to build machine learning models from data without coding.

Learning resources

Advanced Natural Language Processing with spaCy, a free online course from Ines Montani, a core developer of the Python package. These tutorial notebooks on sentiment analysis by Ben Trevett also make use of spaCy.

The Economist reworks some of their worst published charts, providing a lesson in better data visualization.

Discriminating Systems: Gender, Race and Power in AI: Results from a year-long study on diversity in the AI sector from the AI Now Institute.

A tutorial on creating a serverless HTTP endpoint with Python and Azure Functions using Visual Studio Code.

Tutorial: incorporating predictions from an R model in a Power BI report.

AWS Machine Learning University: free access to the curriculum used to teach Amazon staff about ML on AWS.

An introduction to reinforcement learning, with AWS RoboMaker.

Applications

Ganvatar: an impressive demonstration of synthesizing faces along semantic vector axes (age, gender, happiness).

Using the time series and outlier detection features of the Kusto Query Language to detect cyber threats with Azure Sentinel.

Forecasting orange juice sales in a grocery chain (Jupyter Notebook), using automated machine learning in Azure ML Service.

Seek, a smartphone app that uses computer vision to identify plant and animal species in real time.

Find previous editions of the monthly AI roundup here.

Software Defined Radio is a great way to bridge the physical and the digital and teach STEM


Software Defined Radio Adapter

One of the magical technologies that makes an Open Source Artificial Pancreas possible is "Software-defined Radio" or SDR. I have found that SDR is one of those technologies that you've either heard of and agree it's amazing or you've literally never heard of it. Well, buckle up, friends!

There's an amazing write up by Pete Schwamb, one of the core members of the community who works on Loop full time now, on how Software Defined Radios have allowed the community to "sniff" the communication protocols of insulin pumps in the RF spectrum and reverse engineer the communications for the Medtronic and now Omnipod Eros Insulin Pumps. It's a fascinating read that really illustrates how you just need the right people and a good cause and you can do anything.

In his post, Pete explains how he configured the SDR attached to his computer to listen in on the 433MHz range and capture the RF (radio frequencies) coming to and from an insulin pump. He shows how shifts between a slightly higher and a slightly lower frequency are used to express 1s and 0s, just like a high voltage is a 1 and a low or no voltage is a 0.

Radio Frequency to 1s and 0s

Then he gets a whole "packet," plucks it out of the thin air, and manipulates it from Python. Insert Major Motion Picture Programmer Montage and an open source pancreas pops out the other side.

1s and 0s from RF into a string in Python
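
If you want to play with the same idea yourself, here is a minimal sketch using the pyrtlsdr package and one of the cheap USB dongles mentioned below; the frequency, rates, and bit slicing are illustrative, not the values or code Pete used:

import numpy as np
from rtlsdr import RtlSdr

# Tune a cheap RTL-SDR dongle and grab a chunk of raw complex samples.
sdr = RtlSdr()
sdr.sample_rate = 2.048e6      # samples per second
sdr.center_freq = 433.92e6     # an ISM band used by many RF devices
sdr.gain = 'auto'
samples = sdr.read_samples(256 * 1024)
sdr.close()

# FM discriminator: the phase difference between consecutive samples tracks
# the instantaneous frequency, so shifts above/below center become +/- values.
freq = np.angle(samples[1:] * np.conj(samples[:-1]))

# Crude slicer: above-center frequency is a 1, below-center is a 0. Real
# decoding also needs symbol timing recovery and packet framing.
bits = (freq > 0).astype(int)
print(''.join(map(str, bits[:64])))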

Lemme tell you, Dear Reader, Hello World is nice, but pulling binary data out of electromagnetic radiation with wavelengths longer than infrared light is THE HOTNESS.

From a STEM perspective, SDR is more fun than Console Apps when educating kids about the world and it's a great way to make the abstract REAL while teaching programming and science.

You can get an SDR kit for as little as US$20 as a USB device. They are so simple and small it's hard to believe they work at all.

Just plug it in and download Airspy's SDR# (formerly SDRSharp; there are many choices in the SDR space), then run install-rtlsdr.bat to set up a few drivers.

You'll want to run zadig.exe and change the default driver for listening to radio (FM, TV) over to something more low-level. Run it, select "List All Interfaces," and then select "Bulk Interface 0."

Updating SDR with Zadig

After you hit Replace Driver with WinUSB, you can close this and run SDRSharp.exe.

I've set my SDRSharp to WFM (FM Radio) and turned the Gain up and OMG it's the radio.

Listening to the Radio with SDR

In this pic I'm listening to 91.5 FM in Portland, Oregon which is National Public Radio. The news is the center red line moving down, while the far right is 92.3, a rock station, and 90.7 on the far left is more jazz. You can almost see it!

Adafruit has a great SDR tutorial and I'll use it to find the local station for National Weather Radio. This is the weather alert that is available anywhere here in America. Mine was narrowband FM (NFM) at 162.550 MHz! It was harder to hear but it was there when I turned up the gain.

The weather report

But wait, it's more than radio, it's the whole spectrum!

Here I am sending a "Get Pump Model" command to my insulin pump in the 900MHz range! The meaty part is in the red.

Talking to an Insulin Pump

Here's the heartbeat and requests that are sent to my Insulin Pump from my Loop app through a RileyLink (BT to RF Bridge). I'm seeing the Looping communications of my Open Source Artificial Pancreas here, live.

Watching RF Pump Communications

Next post or two I'll try to get the raw bits off of the RF signal of something interesting. If you haven't messed with SDR you should really give it a try! As I said before, you can get an SDR kit for as little as US$20 as a USB device.



Azure Tips and Tricks – Become more productive with Azure


Today, we’re pleased to re-introduce a web resource called “Azure Tips and Tricks” that helps developers using Azure learn something new within a couple of minutes. Since its inception in 2017, the collection has grown to more than 200 tips, as well as videos, conference talks, and several eBooks spanning the entire universe of the Azure platform, from App Services to containers and more. Featuring a new tip and video each week, it is designed to help you boost your productivity with Azure, and all tips are based on practical real-world scenarios.

 


Figure 1: The Azure Tips and Tricks homepage.

 

With the new site, we’ve included the much-needed ability to navigate between Azure services, so that you can quickly browse your favorite categories.

 


Figure 2: The new Azure Tips and Tricks navigation capabilities.

 

There is also search functionality to help you quickly find what you are looking for.

 

 


Figure 3: The new Azure Tips and Tricks search function.

 

The site is also open-source on GitHub, so anyone can help contribute to the site, ask questions, and jump in wherever they want! While you are on the page, go ahead and star the repo to keep up to date.

 


Figure 4: The Azure Tips and Tricks GitHub repo.

 

What are you waiting for? Visit the site and star the repo so that you don’t miss future updates to the site, and to ensure you make the most of the Azure platform’s constantly evolving services and features. I’ll also be presenting a session on the series at Microsoft Build on Monday, May 6th from 2:30-2:50pm. I hope to meet you in person.

Thanks for reading and keep in mind that you can learn more about Azure by following our official blog and Twitter account. You can also reach the author of this post on Twitter.

Developer Tools UI updates for Microsoft Edge Insiders

$
0
0

Earlier this month, we released the first preview builds of the next version of Microsoft Edge. Developers who are using our Dev and Canary channel builds may notice one significant and intentional change when you open up the browser DevTools: dark theme is enabled by default.

If it looks familiar, that’s no accident! We’ve begun work to align our dark theme with VS Code to provide a consistent experience across Microsoft’s web developer tools.

Screen capture of the Microsoft Edge DevTools in the default "Dark Theme"

As we began planning the visual updates to DevTools for our first Canary and Dev previews, we were on the lookout for opportunities to provide a consistent experience across products at Microsoft.

Consistency is a major theme of our goals with the new DevTools, as we set out to provide one streamlined experience for Microsoft Edge’s tools across platforms, whether you’re inspecting a browser window or a web app. With VS Code’s growing popularity (check out the results from the 2018 State of JS survey below), it only made sense to look for common inspiration and areas to align the experience.

Screen capture of the State of JS Developer Survey showing VS Code as the leading text editor of choice.

Over 14,000 respondents said VS Code is their text editor of choice

Our data shows that more than 85% of users are using some variation of a dark theme in VS Code, so we decided to flip the browser DevTools to dark theme by default. If dark theme isn’t your preferred theme to develop in, we’ve also added a new light theme – just press F1 or navigate to Settings using the ... button in the top right corner of the DevTools.

If you prefer the way the DevTools look out of the box in Chromium, you can switch over to that theme by selecting the Dark (Chromium) or Light (Chromium) themes respectively.

We Want Your Feedback

We’re early in our journey of refreshing the DevTools and are exploring a number of UI surfaces for further updates and improved consistency across products. Your feedback is important to us so we can build better developer experiences in and outside of the browser.

If you haven’t already, be sure to download the Canary and Dev preview builds on the Microsoft Edge Insider site. If you have feedback or suggestions on the DevTools in the new Microsoft Edge Insider builds, use the “Send Feedback” tool in Microsoft Edge (clicking the smiley face next to the Menu button) and let us know what you think!

Stephanie Drescher, Designer, Microsoft Edge DevTools

The post Developer Tools UI updates for Microsoft Edge Insiders appeared first on Microsoft Edge Blog.


Azure SignalR Service now supports ASP.NET!


We’ve just shipped the official version of the SignalR Service SDK for ASP.NET support:

Azure SignalR Service is a fully managed Azure service for real-time messaging. It is the preferred way to scale an ASP.NET Core SignalR application. However, SignalR Service is based on SignalR for ASP.NET Core 2.0, which is not 100% compatible with ASP.NET SignalR. Some code changes and the proper versions of dependent libraries are needed to make an ASP.NET SignalR application work with SignalR Service.

We have received a lot of usage feedback from customers since we announced preview support for ASP.NET at Microsoft Ignite 2018. Today, we are excited to announce that we have released the generally available version 1.0.0 of the ASP.NET support SDK for Azure SignalR Service!

Typical architecture for ASP.NET support

This diagram shows the typical architecture for using Azure SignalR Service with an application server written either in ASP.NET Core or, now, in ASP.NET.


For a self-hosted SignalR application, the application server listens to and serves client connections directly. With SignalR Service, the application server only responds to clients’ negotiate requests and redirects clients to SignalR Service to establish the persistent client-server connections.

Using the ASP.NET support for Azure SignalR Service you will be able to:

  • Continue to keep SignalR application using ASP.NET, but work with fully managed ASP.NET Core based SignalR Service.
  • Change a few lines of SignalR API code to switch to SignalR Service instead of self-hosted SignalR Hubs (see the sketch after this list).
  • Leverage Azure SignalR Service’s built-in features and tools to help operate the SignalR application, with guaranteed SLA.
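
As a minimal sketch of that change, assuming the service connection string is stored in the Azure:SignalR:ConnectionString app setting and using a hypothetical OWIN startup class:

using Microsoft.Owin;
using Owin;

[assembly: OwinStartup(typeof(ChatStartup))]

public class ChatStartup
{
    public void Configuration(IAppBuilder app)
    {
        // Before: app.MapSignalR();
        // After: route SignalR traffic through Azure SignalR Service instead.
        app.MapAzureSignalR(GetType().FullName);
    }
}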

Supported SDK versions

To receive the full benefit from the new ASP.NET support feature, please download and upgrade your SDKs to the latest supported versions:

  • .NET: 4.6.1+
  • Microsoft.AspNet.SignalR.*: 2.4.1
  • Microsoft.Azure.SignalR.AspNet: 1.0.0

What features are not supported?

Many factors, including non-technical ones, make migrating a web application from ASP.NET to ASP.NET Core difficult.

The goal of ASP.NET support for Azure SignalR Service is to enable ASP.NET SignalR application developers to easily switch to SignalR Service with minimal code change.

Some APIs and features are no longer supported:

  • Automatic reconnects
  • Forever Frame transport
  • HubState
  • PersistentConnection class
  • GlobalHost object
  • HubPipeline module
  • Client side Internet Explorer support before Microsoft Internet Explorer 11

ASP.NET support is focused on compatibility, so not all new ASP.NET Core SignalR features are supported. To name a few: MessagePack, streaming, etc. are only available for ASP.NET Core SignalR applications.

SignalR Service can be configured for different service modes: Classic/Default/Serverless. For ASP.NET support, the Serverless mode is not supported.

For a complete list of feature comparisons between ASP.NET SignalR and ASP.NET Core SignalR, the proper versions of SDKs to use in each case, and the recommended alternatives for features discontinued in ASP.NET Core SignalR, please refer to the doc here.

Next steps

We’d like to hear your feedback and comments. You can reach the product team at the GitHub repo, or by email.

The post Azure SignalR Service now supports ASP.NET! appeared first on ASP.NET Blog.

Supercharging ASP.NET Core API with OData


Summary

In this article, I’m going to show you how you can supercharge your existing ASP.NET Core APIs with OData to provide a better experience for your API consumers, with only 4 lines of code.

For the purpose of this tutorial, please clone our demo project WashingtonSchools so you can follow along and try the different features we are going to talk about in this article.

You can clone the demo project from here:

https://github.com/hassanhabib/ODataDemo


What is OData?

Let’s talk about OData …

OData is an open-source, open-protocol technology that gives API developers the ability to build queryable APIs. One of the most common things API developers need, for instance, is pagination.

With OData you can enable pagination, ordering of data, reshaping and restructuring of the data and much more with only 4 lines of code.

One of the most common scenarios today is developers calling an endpoint and pulling out data that they then filter, reshape, and order on the client side.
This is a bit wasteful, especially if the data is large enough to cause latency and require further optimization for a better user experience.


Getting Started

To get started, this tutorial assumes you already have an API that serves a list of objects to your end users. For our purposes, we are going to use the sample WashingtonSchools API project that we built to demonstrate these features.

Our API endpoint api/students returns all the students available in our database.

To enable OData on that endpoint, we need to install a NuGet package with all the OData binaries that will let us turn our endpoint into a queryable endpoint.

The NuGet package we are targeting here is:

https://www.nuget.org/packages/Microsoft.AspNetCore.OData


Once that’s installed, let’s go ahead and add OData services to our API.

To do that, open the Startup.cs file and add this line of code to your ConfigureServices method:

services.AddOData();

Now we need to enable the dependency injection support for ALL HTTP routes.

To do that, in the same Startup.cs file, go to the Configure method and add these two lines of code:

app.UseMvc(routeBuilder => {
    routeBuilder.EnableDependencyInjection();
    routeBuilder.Expand().Select().OrderBy().Filter();
});


The first line of code enables dependency injection so that OData services can be injected into your existing API controllers.

The second line of code determines which OData functionality you would like your API consumers to use; we will talk about Expand, Select, and OrderBy in detail shortly, after we complete the OData setup on your project.

The last piece of code we want to add is an attribute on top of your API controller method. In our example, we have a StudentsController class with a GetStudents method that serves all the available students in our database.

Now all we need to do is add this attribute on top of the GetStudents endpoint, as follows:

[EnableQuery]


So your controller method code should look like this:

[HttpGet]
[EnableQuery]
public IEnumerable<Student> GetStudents()
{
    return this.context.Students;
}

Now that the setup is done, let’s test the OData Select, OrderBy, Expand, and Filter functionality.


Select

Using your favorite API testing tool, or simply in the browser since this is a GET endpoint, let’s try to select only the properties we care about from the student objects.

If the student object consists of an ID and a Name, and we care only about the Name, we can call our endpoint like this:

api/students?$select=Name

The result will be a JSON list of students with only the Name property displayed. Try different combinations, such as:

api/students?$select=ID,Name

The select functionality enables you to control and reshape the data to fit just your needs. It makes a big difference when your objects hold large amounts of data you don’t need for a specific API call, such as image data or rich text contained within the same object; using select can be a great optimization technique to speed up API calls and response times.


OrderBy

The next functionality here is OrderBy, which allows you to order students alphabetically by name or by their scores. Try hitting your endpoint as follows:

api/students?$orderby=Name

You can also do:

api/students?$orderby=Score desc


Expand

One other functionality we want to test here is Expand. Expand comes with a great advantage: it allows navigation across multiple entities.

In our example, Students and Schools are in a many-to-one relationship; we can pull the school each student belongs to by making an expand call like this:

api/students?$expand=School

Now we get to see both students and their schools nested within the same list of objects.


Filter

The last functionality here is Filter. Filtering enables you to query for specific data with a specific value. For instance, if I’m looking for a student with the name Todd, all I have to do is make an API call as follows:

api/students?$filter=Name eq 'Todd'

The call will return all students that have the name Todd. You can be more specific with certain properties, such as scores; for instance, if I want all students with scores greater than 100, I can make the following API call:

api/students?$filter=Score gt 100

There are more OData features that enable even more powerful API functionality, such as MaxTop for pagination, which I urge you to explore.
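
Because these query options compose, they can be combined in a single call. As an illustration (assuming paging has also been enabled on the route builder, for example with routeBuilder.MaxTop(100)), a filtered, ordered, paginated request might look like this:

api/students?$filter=Score gt 100&$orderby=Score desc&$select=ID,Name&$top=10&$skip=10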


Final Notes

Here are few things to understand about OData:

  1. OData has no dependency whatsoever on Entity Framework. It can come in handy with EF, but it doesn’t depend on it; here’s a project that runs an OData API without EF: https://github.com/hassanhabib/ODataWithoutEF
  2. OData can give you more optimization with EF if you use IQueryable as the return type instead of IEnumerable, since an IQueryable is only evaluated after the query is built and ready to be executed (see the sketch after this list).
  3. OData is much older than other similar technologies; it was officially released in 2007 and has been used ever since in large-scale applications such as Microsoft Graph, which feeds most Microsoft Office products and Xbox, in addition to Microsoft Windows.
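
To illustrate point 2, here is a minimal sketch (assuming an EF Core DbContext exposed as this.context, as in the earlier controller); only the return type changes, and OData composes the query options into the database query instead of applying them in memory:

[HttpGet]
[EnableQuery]
public IQueryable<Student> GetStudents()
{
    // $filter, $orderby, and $select are applied to the IQueryable before it
    // executes, so the database does the work rather than the web server.
    return this.context.Students;
}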


The Future of OData

The OData team continues to improve the feature and make it even easier and simpler to use and consume on existing and new APIs, with both ASP.NET Web API (classic) and ASP.NET Core, and we are working on adding even more powerful features in the near future.

The post Supercharging ASP.NET Core API with OData appeared first on OData.

Microsoft container registry unaffected by the recent Docker Hub data exposure


Docker recently announced Docker Hub had a brief security exposure that enabled unauthorized access to a Docker Hub database, exposing 190k Hub accounts and their associated GitHub tokens for automated builds. While initial information led people to believe the hashes of the accounts could lead to image:tags being updated with vulnerabilities, including official and microsoft/ org images, this was not the case. Microsoft has confirmed that the official Microsoft images hosted in Docker Hub have not been compromised.

Consuming Microsoft images from the Microsoft Container Registry (MCR)

As a cloud and software company, Microsoft has been transitioning official Microsoft images from being served from Docker Hub to being served directly by Microsoft, as of May 2018. To avoid breaking existing customers, image:tags previously available on Docker Hub continue to be made available. However, newer Microsoft images and tags are available directly from the Microsoft Container Registry (MCR) at mcr.microsoft.com. Search and discoverability of the images remain available through Docker Hub; however, docker pull, run, and build statements should reference mcr.microsoft.com. For example, pulling the windows-servercore image:
docker pull mcr.microsoft.com/windows/servercore

Official microsoft/ org images follow the same format.
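
Likewise, a Dockerfile can reference MCR directly in its FROM line (the ltsc2019 tag here is only illustrative):

FROM mcr.microsoft.com/windows/servercore:ltsc2019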

Microsoft recommends pulling Microsoft official images from mcr.microsoft.com.

Recommended best practices

Leveraging community and official images from Docker Hub and Microsoft is a critical part of today’s cloud native development. At the same time, it’s always important to create a buffer between these public images and your production workloads. These buffers account for availability, performance, reliability, and the risk of vulnerabilities. Regardless of which cloud you use, or if you are working on-prem, importing production images to a private registry is a best practice that puts you in control of the authentication, availability, reliability, and performance of image pulls. For more information, see Choosing A Docker Container Registry.

Automated container builds

In addition to using a private registry for your images, we also recommend using a cloud container build system that incorporates your company’s integrated authentication. For example, Azure offers Azure Pipelines and ACR Tasks for automating container builds, including OS and .NET Framework patching. ACR also offers az acr import for importing images from Docker Hub and other registries, enabling this buffer.
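
As a sketch (the registry name here is hypothetical), importing a public Docker Hub image into your own Azure Container Registry looks like this:

az acr import --name myregistry --source docker.io/library/nginx:latest --image nginx:latest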

Microsoft remains committed to the security and reliability of your software and workloads.

Serverless automation using PowerShell preview in Azure Functions


As companies of all sizes move their assets and workloads to the cloud, there’s a clear need to provide more powerful ways to manage, govern, and automate their cloud resources. Such automation scenarios require custom logic best expressed in PowerShell. They are also typically executed either on a schedule or when an event happens like an alert on an application, a new resource getting created, or when an approval happens in an external system. 

Azure Functions is a perfect match to address such scenarios as it provides an application development model based on triggers and bindings for accelerated development and serverless hosting of applications. PowerShell support in Functions has been a common request from customers, given its event-based capabilities.

Today, we are pleased to announce that we have brought the benefits of this model to automating operational tasks across Azure and on-premises systems with the preview release of PowerShell support in Azure Functions.

Companies all over the world have been using PowerShell to automate their cloud resources in their organization, as well as on-premises, for years. Most of these scenarios are based on events that happen on the infrastructure or application that must be immediately acted upon in order to meet service level agreements and time to recovery.

With the release of PowerShell support in Azure Functions, it is now possible to automate these operational tasks and take advantage of native Azure integration to modernize the delivery and maintenance of services.

PowerShell support in Azure Functions is built on the 2.x runtime and uses PowerShell Core 6, so your automation can be developed on Windows, macOS, and Linux. It also integrates natively with Azure Application Insights to give full visibility into each function execution. Previously, Azure Functions had experimental PowerShell support in 1.x, and it is highly recommended that customers move their 1.x PowerShell functions to the latest runtime.

Quickstart for Azure Functions for PowerShell

PowerShell in Azure Functions has all the benefits of other languages including:

  • Native bindings to respond to Azure monitoring alerts, resource changes through Event Grid, HTTP or Timer triggers, and more.
  • Portal and Visual Studio Code integration for authoring and testing of the scripts.
  • Integrated security to protect HTTP triggered functions.
  • Support for hybrid connections and VNet to help manage hybrid environments.
  • Run in an isolated local environment.

Additionally, functions written with PowerShell have the following capabilities to make it easier to manage Azure resources through automation.

Automatic management of Azure modules

Azure modules are natively available for your scripts so you can manage services available in Azure without having to include these modules with each function created. Critical and security updates in these Az modules will be applied automatically by the service when new minor versions are released.

You can enable this feature through the host.json file by setting “Enabled” to true for managedDependency and updating Requirements.psd1 to include Az. These are automatically set when you create a new function app using PowerShell.

host.json
{
    "version": "2.0",
    "managedDependency": {
        "Enabled": "true"
    }
}

Requirements.psd1
@{
    Az = '1.*'
}

Authenticating against Azure services

When you enable a managed identity for the function app, the PowerShell host can automatically authenticate using this identity, giving functions permission to take actions on services that the managed identity has been granted access to. The profile.ps1 is processed when a function app starts and enables common commands to be executed. By default, if managed identity is enabled, the function app will authenticate with Connect-AzAccount -Identity.

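For reference, a minimal profile.ps1 along these lines might look like the following (the MSI_SECRET environment check mirrors the default template; treat the details as illustrative):

# profile.ps1 runs when the function app starts (cold start).
if ($env:MSI_SECRET) {
    # A managed identity is available; sign in with it so Az cmdlets can act
    # on any resources the identity has been granted access to.
    Connect-AzAccount -Identity
}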

Common automation scenarios in Azure

PowerShell is a great language for automating tasks, and with its availability in Azure Functions, customers can now seamlessly author event-based actions across all services and applications running in Azure. Below are some common scenarios:

  • Integration with Azure Monitor to process alerts generated by Azure services.
  • React to Azure events captured by Event Grid and apply operational requirements on resources.
  • Leverage Logic Apps to connect to external systems like IT service management, DevOps, or monitoring systems while processing the payload with a PowerShell function.
  • Perform scheduled operational tasks on virtual machines, SQL Server, Web Apps, and other Azure resources (a minimal sketch follows this list).
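
As an illustrative sketch of the last scenario (the resource group name and schedule are hypothetical, and the function app’s managed identity is assumed to have permission to stop these VMs), a timer-triggered PowerShell function could look like this:

# run.ps1 for a timer-triggered function; the schedule lives in function.json.
param($Timer)

# Stop every VM in a sandbox resource group, for example outside business hours.
Get-AzVM -ResourceGroupName 'dev-sandbox-rg' | ForEach-Object {
    Stop-AzVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name -Force -NoWait
}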

Next steps

PowerShell support in Azure Functions is available in preview today. Check out the following resources and start trying it out:

  1. Learn more about using PowerShell in Azure Functions in the documentation, including quick starts and common samples to help get started.
  2. Sign up for an Azure free account if you don’t have one yet, and build your first function using PowerShell.
  3. You can reach the Azure Functions team on Twitter and on GitHub. For specific feedback on the PowerShell language, please review its Azure Functions GitHub repository.
  4. We also actively monitor StackOverflow and UserVoice, so feel free to ask questions or leave your suggestions. We look forward to hearing from you!
  5. Learn more about automation and PowerShell in Functions on Azure Friday and Microsoft Mechanics.

Azure.Source – Volume 80


Spark + AI Summit | Preview | GA | News & updates | Technical content | Azure shows | Events | Customers, partners, and industries

Spark + AI Summit 2019

Spark + AI Summit - Developing for the intelligent cloud and intelligent edge

Last week at Spark + AI Summit 2019, Microsoft announced joining the open-source MLflow project as an active contributor. Developers can use the standard MLflow tracking API to track runs and deploy models directly into Azure Machine Learning service. We also announced that managed MLflow is generally available on Azure Databricks and will use Azure Machine Learning to track the full ML lifecycle. The combination of Azure Databricks and Azure Machine Learning makes Azure the best cloud for machine learning. Databricks open-sourced Databricks Delta, from which Azure Databricks customers get greater reliability, improved performance, and the ability to simplify their data pipelines. Lastly, .NET for Apache Spark is available in preview; it is a free, open-source, .NET Standard-compliant, and cross-platform big data analytics framework.

Diagram showing the relationship of Azure IoT Edge and Azure services

Dear Spark developers: Welcome to Azure Cognitive Services

With only a few lines of code you can start integrating the power of Azure Cognitive Services into your big data workflows on Apache Spark™. The Spark bindings offer high throughput and run anywhere you run Spark. The Cognitive Services on Spark fully integrate with containers for high-performance, on-premises, or low-connectivity scenarios. Finally, we have provided a general framework for working with any web service on Spark. You can start leveraging the Cognitive Services for your project with our open-source initiative MMLSpark on Azure Databricks.

Now in preview

Securing Azure SQL Databases with managed identities just got easier

Announcing the second preview release of the Azure Services App Authentication library, version 1.2.0, which enables simple and seamless authentication to Azure SQL Database for existing .NET applications with no code changes, only configuration changes. Try out the new functionality in existing SQL-backed solutions and gain the security benefits that the App Authentication library and managed identities afford.

Now generally available

Announcing Azure Backup support to move Recovery Services vaults

Announcing the general availability of the move functionality for Recovery Services vaults, an Azure Resource Manager resource to manage your backup and disaster recovery needs natively in the cloud. Migrate a vault between subscriptions and resource groups in a few steps, with minimal downtime and without any data loss for old backups. Move a Recovery Services vault and retain recovery points of protected virtual machines (VMs) to restore to any point in time later.

Azure SQL Data Warehouse reserved capacity and software plans now generally available

Announcing the general availability of Azure SQL Data Warehouse reserved capacity and software plans for Red Hat Enterprise Linux and SUSE. Purchase reserved capacity for Azure SQL Data Warehouse and get up to a 65 percent discount over pay-as-you-go rates, selecting from 1-year or 3-year pre-commit options. Purchase plans for Red Hat Enterprise Linux and save up to 18 percent; plans are only available for Red Hat Enterprise Linux virtual machines, and the discount does not apply to Red Hat Enterprise Linux SAP HANA VMs or Red Hat Enterprise Linux SAP Business Apps VMs. Save up to 64 percent on your SUSE software costs; SUSE plans get the auto-fit benefit, so you can scale your SUSE VM sizes up or down and the reservations will continue to apply. In addition, there is a new experience to purchase reservations and software plans, including REST APIs to purchase Azure reservations and software plans.

Animated GIF showing the new experience to purchase reservations and software plans

Azure Cost Management now generally available for Pay-As-You-Go customers

Announcing the general availability of Azure Cost Management features for all Pay-As-You-Go and Azure Government customers, which will greatly enhance your ability to analyze and proactively manage your cloud costs. These features enable you to analyze your cost data, configure budgets to drive accountability for cloud costs, and export pre-configured reports on a schedule to support deeper data analysis within your own systems. This release for Pay-As-You-Go customers also provides invoice reconciliation support in the Azure portal via a usage CSV download of all charges applicable to your invoices.

News and updates

Microsoft container registry unaffected by the recent Docker Hub data exposure

Docker recently announced Docker Hub had a brief security exposure that enabled unauthorized access to a Docker Hub database, exposing 190k Hub accounts and their associated GitHub tokens for automated builds. While initial information led people to believe the hashes of the accounts could lead to image:tags being updated with vulnerabilities, including official and microsoft/ org images, this was not the case. Microsoft has confirmed that the official Microsoft images hosted in Docker Hub have not been compromised. Regardless of which cloud you use, or if you are working on-premises, importing production images to a private registry is a best practice that puts you in control of the authentication, availability, reliability and performance of image pulls.

AI for Good: Developer challenge

Do you have an idea that could improve and empower the lives of everyone in a more accessible way? Or perhaps you have an idea that would help create a sustainable balance between modern society and the environment? Even if it’s just the kernel of an idea, it’s a concept worth exploring with the AI for Good Idea Challenge. If you’re a developer, a data scientist, a student of AI, or even just passionate about AI and machine learning, we encourage you to take part in the AI for Good: Developer challenge and improve the world by sharing your ideas.

Promotional graphic for the AI for Good developer challenge

Azure Notification Hubs and Google’s Firebase Cloud Messaging Migration

When Google announced its migration from Google Cloud Messaging (GCM) to Firebase Cloud Messaging (FCM), push services like Azure Notification Hubs had to adjust how notifications are sent to Android devices to accommodate the change. If your app uses the GCM library, follow Google’s instructions to upgrade to the FCM library in your app. Our SDK is compatible with either. As long as you’re up to date with our SDK version, you won’t have to update anything in your app on our side.

Governance setting for cache refreshes from Azure Analysis Services

Data visualization and consumption tools over Azure Analysis Services (Azure AS) sometimes store data caches to enhance report interactivity for users. The Power BI service, for example, caches dashboard tile data and report data for initial load for Live Connect reports. This post introduces the new governance setting called ClientCacheRefreshPolicy to disable automatic cache refreshes.

Azure Updates

Learn about important Azure product updates, roadmap, and announcements. Subscribe to notifications to stay informed.

Technical content

Best practices in migrating SAP applications to Azure – part 1

This post touches upon the principles outlined in Pillars of a great Azure architecture as they pertain to building your SAP on Azure architecture in readiness for your migration.

Solution diagram showing ExpressRoute or VPN facilitating connectivity

Best practices in migrating SAP applications to Azure – part 2

Part 2 covers a common scenario in which SAP customers can experience the speed and agility of the Azure platform: migrating from SAP Business Suite running on-premises to SAP S/4HANA in the cloud.

Use Artificial Intelligence to Suggest 1-5 Star Ratings

When customers are impressed or dissatisfied with a product, they come back to where it was purchased looking for a way to leave feedback. See how to use artificial intelligence (Cognitive Services) to suggest star ratings based on sentiment detected as customers write positive or negative words in their product reviews. Learn about Cognitive Services, sentiment analysis, and Azure Functions through a full tutorial, as well as where to go to learn more and set up a database to store and manage submissions.

You should never ever run directly against Node in production. Maybe.

Running directly against Node might cause your app to crash. To prevent this, you can run Node with a monitoring tool, or you can monitor your applications themselves.

Configure Azure Site Recovery from Windows Admin Center

Learn how to use Windows Admin Center to configure Azure Site Recovery to replicate virtual machines to Azure, which you can use for protection, disaster recovery, or even migration.

Connecting Twitter and Twilio with Logic Apps to solve a parking problem

See how Tim Heuer refactored a solution to a common problem NYC residents face—trying to figure out street-side parking rules—using Azure Logic Apps and the provided connectors for Twitter and Twilio to accomplish the same thing.

Creating an Image Recognition Solution with Azure IoT Edge and Azure Cognitive Services

Dave Glover demonstrates how to use Azure Custom Vision and Azure IoT Edge to build a self-service checkout experience for visually impaired people, all without needing to be a data scientist. The solution is extended with a Python Azure Function, SignalR, and a Static Website Single Page App.

Get Azure Pipeline Build Status with the Azure CLI

For those who prefer the command line, it's possible to interact with Azure DevOps using the Azure CLI. Neil Peterson takes a quick look at the configuration and basic functionality of the CLI extension as related to Azure Pipelines.

dotnet-azure : A .NET Core global tool to deploy an application to Azure in one command

The options for pushing your .NET Core application to the cloud are not lacking depending on what IDE or editor you have in front of you. But what if you just wanted to deploy your application to Azure with a single command? Shayne Boyer shows you how to do just that with the dotnet-azure global tool.

Detecting threats targeting containers with Azure Security Center

More and more services are moving to the cloud, and they bring their security challenges with them. This post focuses on the security concerns of containerized environments, from the Docker level to the Kubernetes cluster level, and shows how Azure Security Center can help you detect and mitigate threats in the environment as they occur in real time.

Illustration showing a vulnerable web application container accessing the API Server

Customize your Azure best practice recommendations in Azure Advisor

Cloud optimization is critical to ensuring you get the most out of your Azure investment, especially in complex environments with many Azure subscriptions and resource groups. Learn how Azure Advisor helps you optimize your Azure resources for high availability, security, performance, and cost by providing free, personalized recommendations based on your Azure usage and configurations.

5 tips to get more out of Azure Stream Analytics Visual Studio Tools

Azure Stream Analytics is an on-demand real-time analytics service to power intelligent action. Azure Stream Analytics tools for Visual Studio make it easier for you to develop, manage, and test Stream Analytics jobs. This post introduces capabilities and features that help you improve productivity, included in two major updates earlier this year: testing partial scripts locally; sharing inputs, outputs, and functions across multiple scripts; duplicating a job to other regions; local input schema auto-completion; and testing queries against a SQL database as reference data.

Azure Tips and Tricks - Become more productive with Azure

Since its inception in 2017, the Azure Tips & Tricks collection has grown to more than 200 tips, as well as videos, conference talks, and several eBooks spanning the entire breadth of the Azure platform. Featuring a new weekly tip and video, it is designed to help you boost your productivity with Azure, and all tips are based on practical real-world scenarios. This post re-introduces the Azure Tips and Tricks web resource that helps developers using Azure learn something new within a couple of minutes.

Screenshot of the new Azure Tips and Tricks resource site

Optimize performance using Azure Database for PostgreSQL Recommendations

You no longer have to be a database expert to optimize your database. Make your job easier and start taking advantage of recommendations for Azure Database for PostgreSQL today. By analyzing the workloads on your server, the recommendations feature gives you daily insights about the Azure Database for PostgreSQL resources that you can optimize for performance. These recommendations are tightly integrated with Azure Advisor to provide you with best practices directly within the Azure portal.

Azure shows

Episode 275 - Azure Foundations | The Azure Podcast

Derek Martin, a Technology Solutions Principal (TSP) at Microsoft, talks about his approach to ensuring that customers get the foundational elements of Azure in place before deploying anything else. He discusses why Microsoft is getting more opinionated, as a company, when advocating for best practices.

Code-free modern data warehouse using Azure SQL DW and Data Factory | Azure Friday

Gaurav Malhotra joins Scott Hanselman to show how to build a modern data warehouse solution from ingress of structured, unstructured, semi-structured data to code-free data transformation at scale and finally to extracting business insights into your Azure SQL Data Warehouse.

Serverless automation using PowerShell in Azure Functions | Azure Friday

Eamon O'Reilly joins Scott Hanselman to show how PowerShell in Azure Functions makes it possible for you to automate operational tasks and take advantage of the native Azure integration to deliver and maintain services.

Meet our Azure IoT partners: Accenture | Internet of Things Show

Mukund Ghangurde is part of the Industry x.0 practice at Accenture focused on driving digital transformation and digital reinvention with industry customers. Mukund joined us on the IoT Show to discuss the scenarios he is seeing in the industry where IoT is really transforming businesses and how (and why) Accenture is partnering with Azure IoT to accelerate and scale their IoT solutions.

Doing more with Logic Apps | Block Talk

Integration with smart contracts is a common topic with developers. Whether with apps, data, messaging, or services, there is a desire to connect the functions and events of smart contracts in an end-to-end scenario. In this episode, we look at the different types of scenarios and at the most common use case: how to quickly expose your smart contract functions as microservices with the Ethereum Blockchain Connector for Logic Apps or Flow.

How to get started with Azure API Management | Azure Tips and Tricks

Learn how to get started with Azure API Management, a service that helps protect and manage your APIs.

Thumbnail from How to get started with Azure API Management | Azure Tips and Tricks

How to create a load balancer | Azure Portal Series

Learn how to configure load balancers and how to add virtual machines to them in the Azure Portal.

Thumbnail from How to create a load balancer | Azure Portal Series

Rockford Lhotka on Software Architecture | The Azure DevOps Podcast

This week, Jeffrey Palermo and Rocky Lhotka are discussing software architecture. They discuss what Rocky is seeing transformation-wise on both the client side and server side, compare and visit the spectrum of Containers vs. virtual machines vs. PaaS vs. Azure Functions, and take a look at microservice architecture. Rocky also gives his tips and recommendations for companies who identify as .NET shops, and whether you should go with Containers or PaaS.

Episode 8 - Partners Help The Azure World Go Round | AzureABILITY Podcast

Microsoft's vast partner ecosystem is a big part of the Azure value proposition. Listen in as Microsoft Enterprise Channel Manager Christine Schanne and Louis Berman delve into the partner experience with Neudesic, a top Gold Microsoft Partner.

Events

Get ready for Global Azure Bootcamp 2019

Global Azure Bootcamp is a free, one-day, local event that takes place globally. This annual event, which is run by the Azure community, took place this past Saturday, April 27, 2019. Each year, thousands attend these free events to expand their knowledge about Azure using a variety of formats as chosen by each location. Did you attend?

Connecting Global Azure Bootcampers with a cosmic chat app

We added a little “cosmic touch” to the Global Azure Bootcamp this past weekend by enabling attendees to greet each other with a chat app powered by Azure Cosmos DB. For a chat app, this means low latency in the ingestion and delivery of messages. To achieve that, we deployed our web chat over several Azure regions worldwide and let Azure Traffic Manager route users’ requests to the nearest region where our Cosmos database was deployed to bring data close to the compute and the users being served. That was enough to yield near real-time message delivery performance as we let Azure Cosmos DB replicate new messages to each covered region.

Customers, partners, and industries

Connect IIoT data from disparate systems to unlock manufacturing insights

Extracting insights from multiple data sources is a new goal for manufacturers. Industrial IoT (IIoT) data is the starting point for new solutions, with the potential for giving manufacturers a competitive edge. These systems contain vast and vital kinds of information, but they run in silos. This data is rarely correlated and exchanged. To help solve this problem, Altizon created the Datonis Suite, which is a complete industrial IoT solution for manufacturers to leverage their existing data sources.

Illustration showing the Datonis Suite from Altizon

Migrating SAP applications to Azure: Introduction and our partnership with SAP

Just over 25 years ago, Bill Gates and Hasso Plattner met to form an alliance between Microsoft and SAP that has become one of our industry’s longest lasting alliances. At the time their conversation was focused on how Windows could be the leading operating system for SAP’s SAPGUI desktop client and when released a few years later, how Windows NT could be a server operating system of choice for running SAP R/3. Ninety percent of today’s Fortune 500 customers use Microsoft Azure and an estimated 80 percent of Fortune customers run SAP solutions, so it makes sense why SAP running on Azure is a key joint initiative between Microsoft and SAP. Over the next three weeks leading up to this year’s SAPPHIRENOW conference in Orlando, we’re publishing an SAP on Azure technical blog series (see Parts 1 & 2 in Technical content above).

Azure Marketplace new offers - Volume 35

The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. You can also connect with Gold and Silver Microsoft Cloud Competency partners to help your adoption of Azure. In the first half of March, we published 68 new offers.


Azure Government Secret Regions, Azure Batch updates & Service Fabric Mesh new additions | Azure This Week - A Cloud Guru

This time on Azure This Week, Lars covers new Azure Government Secret regions and the new updates to Azure Batch. He also talks about new additions to Service Fabric Mesh. Check it out!

Thumbnail from Azure Government Secret Regions, Azure Batch updates & Service Fabric Mesh new additions | Azure This Week - A Cloud Guru
