Thursday, November 23, 2017

Presented a session at E2EVC 2017 in Barcelona!

Last week was Experts 2 Experts Virtualization (E2EVC) conference time again! This edition was held in Barcelona, a great location for the 36th edition of the event, with attendees coming from 19 different countries!

I presented a session together with Benny Tritsch, and in this blog post I want to elaborate some more on what we presented. Our session was titled “Microsoft RDS, current state and future vision”.

clip_image002

When we created the abstract for our session last summer, we came up with the idea of showing the latest evolution of the Remote Desktop platform. At that point we weren’t really sure if we would be able to show it, or even be allowed to talk about it, because everything was still at a very early stage. Fortunately, three weeks before the E2EVC event in Barcelona we got the bits of the very first RDmi Private Preview and successfully installed it in our labs. We contacted the RDS Product Team and they gave us permission to demo both RDmi and HTML5 at E2EVC last week!

So, during the session we covered these four major topics:

clip_image004

Remote End User Experience benchmarking
We started the session by covering remote end user experience benchmarking and talked about the importance of benchmarking the perceived end user experience. We showed demos of REX Analytics, a framework and toolset that allows you to build, measure and analyze the user experience. Using the REX Analyzer, we’re able to see and compare the end user’s experience and correlate it with telemetry data showing the resources consumed by the sessions. The REX Analyzer presents all of this in a single 4-up player view that allows easy playback and zooming.

clip_image006

We continued the session by covering Project Honolulu, the codename of the web-based interface for managing Windows Server going forward, replacing the many different tools we use today. Project Honolulu was introduced and talked about a lot at Microsoft Ignite. I tweeted the pictures below while attending Jeff Woolsey’s session. They show the before and after: on the left-hand side all kinds of different tools, MMC snap-ins and so on; on the right-hand side a single web interface that lets you do the same type of management remotely!

clip_image008

In our session we did a live demo of the Project Honolulu Preview and showed a couple of examples.

clip_image010

Next, we covered Remote Desktop modern infrastructure (RDmi). RDmi is the latest evolution of the Remote Desktop Services platform. It basically means that the RDS infrastructure roles like RD Connection Broker, RD Web Access and RD Gateway are being transformed into .NET services, which means you no longer need to host, manage and maintain the virtual machines that hold these roles. You can purchase these roles directly from the Azure Marketplace, or host the .NET services yourself.

We started by explaining the RDmi architecture, covering the .NET services, the way the RD Session Host servers connect to the environment, how the platform allows for multitenancy, and how it is based on Azure AD. It comes down to being:

- More secure, because of the out-of-the-box integration with AAD Conditional Access and MFA

- More cloud ready, because it leverages PaaS components that automatically scale

- Open to all clients, including a new HTML5 web client

clip_image012

After covering the architecture, the rest of the session was all based on live demos!

We covered the Azure Portal, showing how zero virtual machines are now needed for the RDS backend and how the entire backend is based on Azure App Services.

clip_image014

We showed an example of what the Azure dashboard could look like when integrating telemetry data from the RDmi environment, simply by dragging and dropping in these Azure App Services.

clip_image016

We concluded our session with a live demo of the HTML5 Web Client which is available in RDmi as well as for RDS based on Windows Server 2016.

clip_image018

Thanks to everyone who attended the session, it was great to see so much interest. Also, a huge thanks to Alex Cooper and crew for creating yet another awesome edition of E2EVC! If you are interested in attending the next edition, it will be held in Amsterdam June 8-10, 2018! Visit www.e2evc.com for more information!

Friday, October 6, 2017

Microsoft Ignite 2017 Benchmarking demo with Outlook Search performance and FSLogix

I presented a session at Microsoft Ignite 2017; for more details on the session check out this blog post: Recap of the session I presented at Microsoft Ignite 2017. Part of that session was a demo on benchmarking remote end user experience. After the session I received a lot of questions about the specific test I used as the example, so I decided to elaborate on it some more in this blog post.

The example test I showed was performed to benchmark Outlook search performance when using the FSLogix Office 365 Container. If you're not familiar with the product, check out this page: https://fslogix.com/products/office-365-container



Here are some of the results from that benchmarking test:

"...When users roam from RD session host to another RD Session host, their search experience is far better with FSLogix. This is because it roams the search index on per user basis inside the user’s profile container; there is no need to re-index a user’s Outlook OST no matter which RD Session Host server they end up on..."

"...During our test runs we could observe time and time again that secondary FSLogix users that had Outlook indexed had a far better user experience in than secondary UPD users who had to have their OSTs indexed. The FSLogix secondary users got consistent returned results for searches that used the Windows Local Search service while the UPD secondary users did not..."

If you want more information on all the tests we (RDS Gurus) performed check out this article: http://www.rdsgurus.com/outlook-performance-in-non-persistent-environments-using-fslogixs-office-365-containers/

Wednesday, September 27, 2017

Recap of the session I presented at Microsoft Ignite 2017

I’m at Microsoft Ignite in Orlando this week and yesterday I presented a session on hosting RDS and VDI in the Cloud. This blog post is a recap of the session including links to demos I showed.

I started my session with the question of whether RDS or VDI is still relevant today. I pointed out that the application landscape of an average organization already contains a lot of SaaS and web-based applications, and that apps from app stores are becoming more common in the corporate world. These types of applications are already enabled for the modern workplace and can be accessed from any device at any time. I then turned to traditional Windows applications: how these applications are not enabled for the modern workplace, how they demand a Windows desktop to run, and how in many cases they rely heavily on an application backend. Many might argue that the Windows application will disappear in the future, transformed into other form factors like web apps and store apps. I agree with that statement, but the diagram below shows that the number of Windows applications still out there today runs into the millions, and that these applications have different requirements. The key takeaway is that as long as traditional Windows applications exist, RDS or VDI can help you provide them on top of the modern workplace.



After the introduction, I covered what it takes to run RDS or VDI on top of Azure. I talked about ways to optimize for the cloud by leveraging PaaS services like Azure SQL. I shared ways to autoscale an RDS or VDI environment on Azure IaaS by using scaling scripts; a simplified sketch of that idea is shown below. I also talked about ways to integrate Azure MFA into an RDS environment.
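To give you an idea of what such a scaling script can look like, here is a simplified sketch (not the actual script from the session). It assumes the RemoteDesktop and AzureRM PowerShell modules and an existing Azure login; the broker FQDN, resource group, spare host name and threshold are placeholders.

# Simplified autoscale sketch (illustration only, not the script from the session).
# Assumes the RemoteDesktop and AzureRM modules and an existing login (Login-AzureRmAccount).
Import-Module RemoteDesktop

$connectionBroker = "rdcb-01.contoso.local"   # placeholder connection broker FQDN
$resourceGroup    = "RG-RDS"                  # placeholder resource group
$spareHostVm      = "rdsh-03"                 # placeholder session host kept deallocated as spare
$sessionsPerHost  = 15                        # placeholder threshold

$sessions  = Get-RDUserSession -ConnectionBroker $connectionBroker
$hostCount = ($sessions | Select-Object -ExpandProperty HostServer -Unique).Count

# If the average number of sessions per host exceeds the threshold, start the spare host
if ($hostCount -gt 0 -and ($sessions.Count / $hostCount) -gt $sessionsPerHost) {
    Start-AzureRmVM -ResourceGroupName $resourceGroup -Name $spareHostVm
}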

During the second half of the session I covered two demos. In the first demo, I showed how to leverage Azure Resource Manager and JSON templates to perform a fully automated deployment of RDS running on top of Azure IaaS. The ARM template I showed creates an entire HA deployment in 30 minutes, including things like SQL, SSL certificates, branding and customization. I uploaded the 5-minute video that I shared in the session; it’s available on my YouTube channel here: https://youtu.be/Y7Gaa2URhdE
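For context, kicking off such a deployment comes down to a single AzureRM cmdlet run against a resource group. The snippet below is a generic, hedged example using the 2017-era AzureRM cmdlets; the resource group, location, template and parameter file names are placeholders and not the actual files used in the demo.

# Generic example of starting an ARM template deployment (AzureRM module).
# Resource group, location and file names are placeholders, not the demo files.
Login-AzureRmAccount
New-AzureRmResourceGroup -Name "RG-RDS-Demo" -Location "West Europe"
New-AzureRmResourceGroupDeployment `
    -Name "rds-ha-deployment" `
    -ResourceGroupName "RG-RDS-Demo" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterFile ".\azuredeploy.parameters.json" `
    -Verbose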

The second demo was related to benchmarking the end user experience in a remoting session. I started by explaining that benchmarking performance counters alone does not always tell the full story, and that being able to see the actual end user experience can be extremely helpful. The demo I showed was based on a framework called REX Analytics. Amongst other things, this framework provides an analyzer tool that allows you to see and compare the end user experience of various remoting sessions. For a detailed overview of all the capabilities of this framework, check out the following links, which also contain the demo I shared during the session:

Outlook Performance in Non-Persistent Environments Using FSLogix’s Office 365 Containers
OneDrive for Business Performance in Non-Persistent Environments Using FSLogix’s Office 365 Containers

I got a good turnout for my session and a lot of positive feedback! I want to thank everyone who attended my session and hope to be back next year!

If you have additional questions about the topics I covered, feel free to contact me!




Thursday, September 21, 2017

First look at updates coming to Remote Desktop Services

A new Microsoft Mechanics video has been published that provides a great overview of the recently announced changes to the Remote Desktop Services platform, RD modern infrastructure.

"...The RDS team has innovated in three key areas:

Security: RDS-hosted environments can use authentication with Azure Active Directory – see how you get advantages like Conditional Access policies, Multifactor Authentication, Integrated authentication with other SaaS Apps using Azure AD, and the ability to get security signals from the Intelligent Security Graph. Moreover, by isolating the infrastructure roles (Gateway, Web, connection broker and others) from the desktop and app deployment hosts, we add another layer to separation for higher security of your virtualized environments.

Cloud readiness: There are updates coming to infrastructure roles with innovations in the existing RD infrastructure roles – Web, Gateway, Connection Broker, Licensing – see how to take advantage of the elasticity and scale capabilities of Azure. Get a first look at the new Diagnostics role that helps you monitor your deployment effectively.

Windows apps on ANY device:  RDS has long had the flexibility to run on cross-platform desktop and mobile operating systems using apps, but we are now building support for HTML5 browser-delivered experiences. Of course, RDS works with Windows – even Windows 10 S – offering even more flexibility for how your apps and desktops are accessed..."

There will also be several sessions at Microsoft Ignite next week that talk about this new infrastructure in more detail.

Source & video: https://blogs.technet.microsoft.com/enterprisemobility/2017/09/20/first-look-at-updates-coming-to-remote-desktop-services/

Friday, September 15, 2017

Remoting Graphics and GPUs in End User Computing

Remoting graphics and GPUs in end user computing are becoming a commodity. The days when a GPU was beneficial only to very specific applications like Autodesk, AutoCAD, SolidWorks et cetera are definitely over. Today, almost every remote Windows environment can benefit from remoted graphics; even applications like Office and browsers can leverage a GPU in a remote Windows environment. With the N-series on Azure having been available for some time now, Remote Desktop Services or Virtual Desktop Infrastructure hosted on Azure IaaS can also use a GPU. Companies like Frame offer options to leverage that same GPU in their solutions, even with a fully web-based client; I recently attended a webinar where they showed this in an impressive demo.
 
With GPUs becoming a commodity and several remoting protocols available that can use them, the question arises which one to use. As always, it can be answered with the usual IT answer: “it depends!” Comparisons are useful for matching use cases to remoting protocols, and this is exactly what Benny Tritsch and Kristin Griffin recently did in a test where they compared the user experience of RDP versus PCoIP.
 
The primary focus of the test was on benchmarking the performance of graphics workloads in Hyper-V virtual machines accelerated by NVIDIA M60 GPUs attached through Discrete Device Assignment (DDA).
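For those who haven’t worked with DDA before, assigning a GPU to a VM boils down to a few PowerShell steps on the Hyper-V host. The sketch below is based on Microsoft’s public DDA guidance, not on the exact commands used in this test; the VM name, location path and MMIO sizes are placeholders that differ per GPU.

# Sketch of assigning a GPU to a Hyper-V VM with Discrete Device Assignment (DDA),
# based on Microsoft's public DDA guidance. VM name, location path and MMIO sizes are placeholders.
$vmName       = "gpu-vm-01"
$locationPath = "PCIROOT(0)#PCI(0300)#PCI(0000)"   # find the real location path in Device Manager

# DDA requires guest-controlled cache types and enough memory-mapped IO space on the VM
Set-VM -Name $vmName -GuestControlledCacheTypes $true `
       -LowMemoryMappedIoSpace 3GB -HighMemoryMappedIoSpace 33280MB

# Disable the GPU on the host first (Device Manager or Disable-PnpDevice), then dismount and assign it
Dismount-VMHostAssignableDevice -LocationPath $locationPath -Force
Add-VMAssignableDevice -LocationPath $locationPath -VMName $vmName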
 
“In our test environment, we used the REX Analytics framework to benchmark remote end-user experience (REX) by simulating a range of user interaction workloads. The REX Analytics framework includes fully automated (synthetic) test sequences, control services, management consoles, agents, screen and telemetry data recorders, analysis tools and a unique visualization component. The framework works on-premises and in cloud environments.”
 
The results of these tests contain some interesting comparison videos. Read the full article, including links to the various videos, here: http://www.rdsgurus.com/rdp10-versus-pcoip-on-hyper-v-with-dda/


 

Wednesday, July 12, 2017

RDS modern infrastructure and HTML5 for RDS announced!

Within the RDS MVP group we had already discussed this, and the information is now public! I’m super excited about this step!

RDS modern infrastructure is announced!

“…The RDS modern infrastructure components provide functionality that extends the current RD Web Access, RD Gateway, and RD Connection Broker services, as well as adding a new RD Diagnostics service. The RDS modern infrastructure components are implemented as .NET Web Services enabling a wide variety of deployment options. For example:

Both single and multi-tenant deployments, making smaller deployments (less than 100 users) much more economically viable, while providing the necessary security of tenant isolation

Deployments on Microsoft Azure, on-premises equipment, and hybrid configurations

Virtual machines or Azure App Services can be used for deployment

Azure App Services, part of Azures Platform-as-a-Service environment, simplifies the deployment and management of the RDS modern infrastructure because it abstracts the details of the virtual machines, networking, and storage. This simplifies administrative tasks like configuring scale out/in of a service to dynamically and automatically handle fluctuating usage patterns…”

And also, HTML5 for Remote Desktop Services is coming!

“…The web client, combined with the other RDS modern infrastructure features, allows many Windows applications to be easily transformed into a Web-based Software-as-a-Service (SaaS) application without having to rewrite a line of code…”

More details will be shared in today’s session at Microsoft Inspire.

Source: https://blogs.technet.microsoft.com/enterprisemobility/2017/07/12/today-at-microsoft-inspire-next-generation-architecture-for-rds-hosting/

Friday, July 7, 2017

Real time logging your Microsoft RDS environment using PowerShell

All Remote Desktop Services event logs in a single pane? Every RDS event from machines A and B written in the last 10 minutes? Listening to events from RDS event logs in real time from all RDS-related servers in your deployment?

Jason Gilbertson, a Technical Advisor at Microsoft who works closely with the RDS product team, wrote a single PowerShell script that does all of the above, and much more!

Some of the features:

- Export logs locally or remotely to .csv format on local machine grouped by machine name

- Convert *.evt* files to .csv

- View and manage 'debug and analytic' event logs

- Listen to event logs real-time from local or remote machines displaying color coded messages in console

Although the script is very multifunctional, it has specific parameters for RDS that allow you to collect RDS-related event logs from all servers that are running RDS roles. So, for example, you can combine all event logs from your RD Connection Broker, RD Web Access, RD Gateway and RD Session Host servers in a single view.

The script also exports to CSV, which allows you to feed the exports into Excel graphs or Power BI for further analysis (see the short example after the command samples below).

A couple of examples:

Query RDS event logs for the last 10 minutes on a remote RD Connection Broker server:
PS C:\>.\event-log-manager.ps1 -rds -minutes 10 -Machines rdcb-01

clip_image002

Below is what the command outputs to CSV:

clip_image004

Example command to enable ‘debug and analytic’ event logs for 'rds' event logs and 'dns' event logs:
PS C:\>.\event-log-manager.ps1 –enableDebugLogs -eventLogNamePattern dns -rds -machines rdcb-01

clip_image006

Below is what the command outputs to CSV:

clip_image008

Example command to listen to multiple RD Gateway servers for all event logs related to Remote Desktop Services and get live results:
PS C:\> .\event-log-manager.ps1 -listen -rds -machines RDGW-01, RDGW-02

Below is a sample output

clip_image010
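Coming back to the CSV exports mentioned earlier: once you have an export, a few lines of PowerShell are enough to summarize it before pulling it into Excel or Power BI. This is my own example, not part of Jason's script; the file name and column names are assumptions, so check the header of your own export.

# Example of summarizing one of the exported CSV files (not part of Jason's script).
# File name and column names (Provider, Level) are assumptions; check your export's header.
$events = Import-Csv -Path ".\rdcb-01-events.csv"

# Quick breakdown of event count per provider and level (Error/Warning/Information)
$events | Group-Object -Property Provider, Level |
    Sort-Object -Property Count -Descending |
    Select-Object Count, Name |
    Format-Table -AutoSize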

These were only a few RDS related examples, but the script Jason created has awesome capabilities! It’s available on TechNet Gallery here: https://gallery.technet.microsoft.com/Windows-Event-Log-ad958986

Tuesday, June 6, 2017

Presented a session at E2EVC in Orlando, Florida on RDS, ARM, JSON and stroopwafels!

Last week, right after Citrix Synergy, Alex Cooper hosted yet another awesome edition of the Experts 2 Experts Virtualization conference (E2EVC) in Orlando, Florida! If you’re not familiar with the event, please check out e2evc.com. It’s a vendor-neutral virtualization conference focusing on everything related to end user computing, covering topics like RDS, VDI, RemoteApp, application virtualization and much more. Sessions are presented by the community, and because of the vendor-neutral approach, you’ll see a good mix of sessions related to Microsoft, Citrix, VMware, Parallels and many other products as well.

This is what Alex says about E2EVC:

"...E2EVC Virtualization Conference Events is a series of worldwide non-commercial, virtualization community Events. Our main goal is to bring the best virtualization experts together to exchange knowledge and to establish new connections. E2EVC is crammed with presentations, Master Classes and discussions delivered by both virtualization vendors product teams and independent experts. Over 50 of the best virtualization community experts present their topics..."

Last week the Orlando edition was on the agenda. I presented a session on Azure Resource Manager, JSON templates and a fully automated deployment of RDS running on Azure IaaS. The session was entitled:

Grab a Stroopwafel while we watch ARM do an automated RDS deployment in Azure IaaS

The idea behind the session was to perform the ARM deployment live on stage while enjoying a stroopwafel :). And so, I actually brought stroopwafels for the entire audience. The deployment finished successfully within 31 minutes. After the deployment completed I did a demo of the end result: an entire HA RDS deployment running on Azure IaaS, including things like load balancing, SQL Server, SSL certificates, published RemoteApps, RD Web Access branding, RD Gateway policies and much more!

Thanks, everyone who attended my session! Thanks, Alex for hosting an awesome community event, and thanks to the all the sponsors including Nvidia, Citrix and ControlUp!

Start of the session while all attendees enjoyed a stroopwafel :)
clip_image002

Two slides from the deck I presented, to give you an idea of what the JSON template creates.
clip_image004
clip_image006

E2EVC will publish the recording of the session on their YouTube channel. If you have questions on the content or need help with creating JSON templates, feel free to reach out!

Thursday, April 20, 2017

Dude, where’s my OneDrive for Business Sync in RDS? FSLogix to the rescue!

image_thumb
OneDrive for Business sync and cache inside RDS & VDI environments: what are the options? What’s supported? Here’s what you need to know!



Introduction
Office 365 is a widely adopted SaaS offering these days, and Skype for Business and OneDrive for Business are very commonly used products in the suite. In many architected solutions, Office 365 ProPlus is published as one of the application suites using Remote Desktop Services. These solutions can vary from shared sessions on an RD Session Host to dedicated virtual machines in a Virtual Desktop Infrastructure, and both are capable of publishing either a full desktop or separate RemoteApps. This allows users to access the full Office 365 suite, including Visio, Project et cetera, on any device and from any location.

The Challenge
Publishing Office 365 as described above, however, results in an interesting challenge: what about OneDrive for Business access? For many years, users have been given access to a home drive, a space where they can store personal data and files. Typically this was a drive mapping (most of the time mapped as the H: drive) pointing to a share on a file server. With Office 365, users have access to OneDrive for Business: the same kind of space where they can store personal data and files, this time hosted in the cloud and accessible on any device at any time. A great solution offering a lot of flexibility to the end user.

Where a classic home drive is typically hosted on a file server close to the client, OneDrive for Business is hosted in the public cloud. This means that the network connection (bandwidth, latency, packet loss) suddenly plays an important role in the overall performance of working with these files. To overcome this issue, and to allow offline access, OneDrive for Business comes with synchronization. This basically means that data stored in OneDrive for Business is also cached on the local client and synced to the cloud, a process that runs in the background and is transparent to the end user.

For a user’s personal device, this sync process is great! But what about an RDS or VDI hosted solution? How do we make sure users have access to OneDrive for Business from their hosted Remote Desktop or RemoteApps? Can we provide the home-drive-like experience that users are used to? And more importantly, can we have the OneDrive for Business cache inside those environments? What are the options?

Discussing the options

- OneDrive for Business and Roaming Profiles
Although Roaming Profiles is an ancient solution for allowing users to roam their profile and data across multiple RD Session Host servers, I still see it being used in older environments. With a roaming profile, a user has a profile stored centrally on a file server, and during logon and logoff the delta of that profile is synced with the locally cached copy. If we were to configure OneDrive for Business in such an environment, the OneDrive for Business cache would be synced back and forth for each user at logon and logoff. Depending on the amount of data, this kills the user experience. Redirecting AppData? Don’t even go there :)

- OneDrive for Business and User Profile Disk
User Profile Disk (UPD) was introduced in Windows Server 2012. UPD also allows users to roam their profile across multiple RD Session Host servers, this time not by syncing the delta to and from a file server, but by mounting a dedicated .VHDX file per user, stored on a file server, that contains the entire user’s profile. To accomplish this, UPD creates a mount path under C:\users\<username>. Because of this, any application that writes to and reads from the user’s profile ‘thinks’ it is accessing a locally stored profile, while it is in fact a mounted file. Although this technology is fully transparent for most applications, there are important exceptions. UPD mounts the .VHDX with a symbolic link, and the catch is that Windows won’t send file change notifications. As a result, the OneDrive for Business cache won’t know that files have changed and will get out of sync.

- OneDrive for Business drive mapping
A workaround used in some environments is to create a drive mapping pointing directly to OneDrive for Business in the cloud. This simulates a home-drive-like experience because the end user is presented with an H: mapping. However, since this is a WebDAV connection, the experience is not great: users will see delays when opening and saving files.

- Direct access to OneDrive for Business from within applications
Office 365 and ProPlus have obviously also been improving over time, now allowing users to access OneDrive for Business directly from the Open and Save menus inside, for example, Word or Excel. Outlook also allows direct interaction with attachments coming from OneDrive for Business. Although this user experience is quite good, it does require a good user adoption strategy: users need to be guided into this new way of working with their personal data. The other catch is that non-Office applications generally don’t support opening from and saving to OneDrive directly.

- OneDrive for Business and FSLogix™ Office 365 Container
FSLogix is a unique company that focuses on extending RDS and VDI solutions with small-footprint yet very powerful tools that make the life of end users and administrators much easier. They have several tools in their portfolio, but for this use case we’ll focus on their Office 365 Container product (which I also talked about before in this article). That tool does exactly what the name implies and provides three main solutions:

* True Cached Exchange Mode
Providing what they call OST containerization, allowing Outlook to function and perform as if locally installed on a high performance virtual workspace session

* Outlook Real-Time Search
This enables inbox and personal folder search to work as designed with maximum performance

* OneDrive and OneDrive for Business
This roams OneDrive (for Business) user data seamlessly without the need to resync at each logon.

The latter is of course exactly what we are looking for in the use case discussed in this article! How does it work? The FSLogix agent, which runs on the RD Session Host or VDI machine and is configured using GPO, makes sure that the OneDrive (for Business) data is captured in an isolated container. The technology seems similar to Microsoft’s User Profile Disk, however FSLogix operates at a lower level of the operating system to ensure that file changes are noticed and processed. Also, the FSLogix streaming technology can cache OneDrive files in situations where network connectivity to the file server temporarily goes offline, which is important to ensure OneDrive files do not get corrupted by a network interruption.
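To give you an idea of what that GPO configuration translates to on the session host, below is a hedged sketch of the core Office 365 Container registry values (enable the container and point it to a file share). The share path is a placeholder, and the value names should be verified against the FSLogix documentation for the version you deploy.

# Sketch of the core FSLogix Office 365 Container settings (HKLM\SOFTWARE\Policies\FSLogix\ODFC).
# Share path is a placeholder; verify value names against the FSLogix documentation for your version.
$odfcKey = "HKLM:\SOFTWARE\Policies\FSLogix\ODFC"
New-Item -Path $odfcKey -Force | Out-Null

Set-ItemProperty -Path $odfcKey -Name "Enabled"      -Value 1 -Type DWord
Set-ItemProperty -Path $odfcKey -Name "VHDLocations" -Value @("\\fileserver\FSLogixContainers") -Type MultiString
# Per-component Include* values (for example IncludeOneDrive) are documented as well; set those as needed.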

clip_image002[4]_thumb[1]

What’s also interesting about the Office 365 Container is that it can be layered on top of any existing profile solution you might have. It’s fully transparent and profile solution agnostic, and even the OneDrive for Business application is roamed in the O365 Container.

Conclusion
It’s clear that both the Roaming Profile and the User Profile Disk options are a no-go when it comes to OneDrive for Business caching. The drive mapping might work in some scenarios, but must clearly be considered a workaround. Guiding users to access OneDrive for Business directly from within their applications is a viable solution, but it does not address the users’ wish to easily navigate their files using File Explorer, and this approach mostly only works inside Office applications. FSLogix Office 365 Container is clearly the winner, providing the full OneDrive for Business sync experience users expect without the need for a complex software suite or application backend.

Got excited about FSLogix Office 365 Container? Get more info or request a trial here: https://fslogix.com/products/office-365-container

Thursday, April 13, 2017

Securing RD Gateway with MFA using the new NPS Extension for Azure MFA!

Introduction
Back in 2014 I co-authored an article together with Kristin Griffin on how to secure RD Gateway with Azure MFA. That article was based on putting an Azure MFA Server (previously PhoneFactor) in place in your on-premises environment (or on Azure IaaS) to act as the MFA server and enforce multi-factor authentication for all sessions coming through RD Gateway. You can get the article here: Step By Step – Using Windows Server 2012 R2 RD Gateway with Azure Multifactor Authentication. Although this is a great solution and I have successfully implemented it for various customers, the big downside has always been the mandatory MFA Server. As part of the setup, a server running the Azure MFA Server component had to be installed in your Active Directory to inject Azure MFA into the login sequence. Not only did you have to install and maintain this MFA Server and sync and manage its users (and in most cases you would set up two servers for HA), it was also yet another MFA provider for your end users. MFA Server comes with a self-service portal that lets users do their own enrollment, and it can leverage the same Azure Authenticator app. However, if your end users already used Azure MFA to secure e.g. Office 365 or other SaaS services, that would be a different MFA provider, with a different self-service signup sequence, et cetera.

Introducing the NPS Extension for Azure MFA
So what has changed? A few days ago Microsoft announced the availability of the NPS Extension for Azure MFA (preview)! Read the announcement, in which Alex Simons, Director of Program Management of the Microsoft Identity Division, and Yossi Banai, Program Manager on the Azure Active Directory team, talk about this new preview feature here:

Azure AD News: Azure MFA cloud based protection for on-premises VPNs is now in public preview!

Although the article specifically talks about securing a VPN, I figured the same would apply to securing Remote Desktop Gateway. And it turns out it does! In my lab I was able to successfully secure RD Gateway with Azure MFA using this new NPS extension. In this article I want to take you through the setup process and show the end result.

Prerequisites
There are a few prerequisites for using the NPS extension for Azure MFA:

- License
For this to work you obviously need a license for Azure MFA. This is included with Azure AD Premium or EM+S, or it can be based on an Azure MFA subscription.

- NPS Server
A server with the NPS role installed is needed. This needs to be at least Windows Server 2008 R2 SP1 and can be combined with other roles; however, it cannot be combined with the RD Gateway role itself.

- Libraries
The two libraries below are needed on the NPS server. Although Microsoft’s guidance says the NPS Extension installer performs those installations if they are not in place, it doesn’t, so be sure to download and install these components prior to installing the NPS Extension.

1. Microsoft Visual Studio 2013 C++ Redistributable (X64)
2. Microsoft Azure Active Directory Module for Windows PowerShell version 1.1.166

- Azure Active Directory

Obviously, Azure Active Directory has to be in place, and users who need access must be enabled for MFA.

Installing
As mentioned in the introduction, I have written an article on securing RD Gateway with Azure MFA Server before. As you read through the installation and configuration process, you’ll see similarities with that article. That is not a coincidence: the same basic principles of RD Gateway, RD CAP, RADIUS clients, remote RADIUS servers et cetera also apply to this setup.

Installing and configuring AAD & AAD Sync
Note: if you already have AAD & AAD Sync in place, you can obviously skip this paragraph.
First things first, you need Azure Active Directory as a prerequisite. I won’t go over the entire process of setting up ADDS and AAD, because there are many guides out there that explain it very well. Basically, you create a new AAD using the Azure Classic portal (or PowerShell), similar to below.
clip_image002[4]

With AAD in place, you can start to sync your users from an on-premises ADDS (or, as in my case, one that is running on Azure IaaS). To manage the AAD you can already use the new Azure Portal as shown below, although be aware that this feature is still in preview in that portal. You can also use the portal to get a link to the most recent version of Microsoft Azure Active Directory Connect, which you need to sync users from ADDS to AAD.
clip_image004[4]

Again, I won’t go into great detail on the installation and best practices of AAD Connect; if you need detailed guidance on that part, check Connect Active Directory with Azure Active Directory. Basically, you run the installer on a server that is part of your ADDS domain, and the only things you have to provide are the credentials of an AAD account and an ADDS account with the appropriate permissions to access both directories.
clip_image006[4]

Once you have successfully finished the setup of AAD Connect and the initial synchronization has taken place, the portal will reflect this as shown below.
clip_image008[4]

With the initial synchronization complete, you can now start to assign Azure MFA to your users. To do this, open the All Users section in the Azure Portal and click on the Multi-Factor Authentication link.
clip_image010[4]

That will take you to the Azure MFA Management Portal. In the screenshot below you can see the steps to enable and enforce Azure MFA for my test user called rdstestmfa.
clip_image012[4]
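If you prefer scripting over clicking through the MFA management portal, the same can be done with the MSOnline PowerShell module (the same Azure Active Directory PowerShell module listed under the prerequisites). A hedged example for the test user used in this article:

# Enable and enforce Azure MFA for a user via the MSOnline module instead of the portal.
# The UPN below is the test account from this article; adjust it for your own users.
Import-Module MSOnline
Connect-MsolService

$mfa = New-Object -TypeName Microsoft.Online.Administration.StrongAuthenticationRequirement
$mfa.RelyingParty = "*"
$mfa.State = "Enforced"   # use "Enabled" if you want the user to complete enrollment first

Set-MsolUser -UserPrincipalName "rdstestmfa@rdsgurus.com" -StrongAuthenticationRequirements @($mfa)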


Installing and configuring the NPS Extension for Azure MFA
Now that we have AAD and AAD Sync in place, let’s drill down into the actual installation of the NPS Extension for Azure MFA! The first step is to download the latest version of the installer, which can be found here: NPS Extension for Azure MFA.

The NPS Extension needs to be installed on a (virtual) server that is part of the ADDS domain and is able to reach the RD Gateway. In my case I used an out-of-the-box Windows Server 2016 VM on Azure IaaS, but it can be anything running Windows Server 2008 R2 SP1 or above. Before installing the extension, three other requirements need to be in place.

1. The NPS Server role needs to be installed. Open Server Manager and add the role called Network Policy and Access Services.
clip_image014[4]
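If you prefer PowerShell over Server Manager for this step, the same role can be added with a single command (NPAS is the Windows feature name for Network Policy and Access Services):

# Install the Network Policy and Access Services role via PowerShell instead of Server Manager
Install-WindowsFeature -Name NPAS -IncludeManagementTools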

2. The library Microsoft Visual Studio 2013 C++ Redistributable (X64) needs to be installed. Microsoft documentation says this library is installed automatically as part of the NPS Extension installer, but the current preview version 0.9.0.1 does not do this yet. You can get the download here.

3. The Microsoft Azure Active Directory Module for Windows PowerShell version 1.1.166 needs to be installed. Again, Microsoft documentation says this module is installed automatically as part of the NPS Extension installer, but the current preview version 0.9.0.1 does not do this yet. You can get that download here.

Now that we have the prerequisites in place, we can start the NPS Extension installer. The setup is very straightforward: just hit Install and wait for the process to finish.
clip_image015[4]


After the installation is finished, the Extension components are placed in the folder C:\Program Files\Microsoft\AzureMfa\

Now open a new PowerShell prompt (with elevated permissions), change the directory to C:\Program Files\Microsoft\AzureMfa\Config and run the PowerShell script called AzureMfaNpsExtnConfigSetup.ps1. The output should look similar to the screenshot below.
clip_image017[4]
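For reference, those two steps from the elevated PowerShell prompt look like this:

# Run from an elevated PowerShell prompt on the NPS server
cd "C:\Program Files\Microsoft\AzureMfa\Config"
.\AzureMfaNpsExtnConfigSetup.ps1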


While the PowerShell script runs, it will prompt you for the ID of your Azure AD tenant; you can find that in the Azure Portal, in the properties of your AAD domain.
clip_image018[4]


The PowerShell script will prompt you to authenticate to AAD with appropriate permissions, and then performs the following actions (source):

- Create a self-signed certificate.
- Associate the public key of the certificate to the service principal on Azure AD.
- Store the cert in the local machine cert store.
- Grant access to the certificate’s private key to Network User.
- Restart the NPS.

This completes the installation of the NPS Extension. The final step is to connect RD Gateway to this NPS Extension to get Azure MFA into the authentication process.

It’s important to realize that installing the NPS Extension causes all authentications processed by this NPS server to go through Azure MFA. There is no way to make exceptions for specific users.

Configuring RD Gateway
With the installation of the NPS Extension complete, it’s now time to configure RD Gateway. As mentioned before, this process is very similar to what Kristin Griffin and I explained here. The first step is to configure RD Gateway to use a central server running NPS. To do so, open RD Gateway Manager, right-click the server name, and select Properties. Now select the RD CAP Store tab, select the Central Server running NPS option and enter the IP address of the NPS Server where you previously installed the NPS Extension. Also provide a shared secret and store this somewhere safe.
clip_image019[4]


Now open NPS on the RD Gateway Server (not on the NPS Server that contains the NPS Extension; we’ll do that later).

Open the Remote RADIUS Server Groups and open the TS GATEWAY SERVER GROUP. Enter the IP Address of the NPS Server running the extension as a RADIUS Server, edit it and make sure the timeout settings match what is shown below.
clip_image021[4]


Now go to the RADIUS Clients tab and create a new RADIUS client with a friendly name and the IP address of the NPS Server running the extension, and enter the same shared secret you used before.
clip_image023[4]

Next, we need to configure two Connection Request Policies in NPS: one to forward requests to the Remote RADIUS Server Group (which is set to forward to the NPS server running the extension), and one to receive the requests coming back from that NPS server (to be handled locally).

The easiest way to do this is to use the existing policy that was created when you created an RD CAP in RD Gateway. In NPS, expand the Policies section on the left side of the screen and then select Connection Request Policies. You should see a policy already created there, called TS GATEWAY AUTHORIZATION POLICY. Copy that policy twice and rename the copies to “MFA Server Request No Forward” and “MFA Server Request Forward”.

Now edit the MFA Server Request No Forward and set the following settings, where Client IPv4 Address is the IP Address of the NPS Server running the NPS Extension. Make sure you also enable this policy.
clip_image025[4]

Now edit the MFA Server Request Forward policy and set the following settings, so that this rule forwards to the TS GATEWAY SERVER GROUP. Again, make sure you also enable this policy.
clip_image027[4]

And lastly, disable the existing TS GATEWAY AUTHORIZATION POLICY and set the processing order of the policies as shown below.
clip_image029[4]

Configuring the NPS Server
It’s now time to configure the NPS Server running the extension to make sure it can send and receive RADIUS requests too. Open NPS on the NPS Server (not on the RD Gateway Server; we did that before).

Open the Remote RADIUS Server Groups and create a new group called RDGW. Enter the IP Address of the RD Gateway as a RADIUS Server, edit it and make sure the timeout settings match what is shown below.
clip_image031[4]

Now go to the RADIUS Clients tab and create a new RADIUS client with a friendly name and the IP address of the RD Gateway server, and enter the shared secret you used before.
clip_image032[4]


Go to the Connection Request Policies tab and create a new policy called To RDGW, using the source Remote Desktop Gateway. Set the condition to Virtual (VPN) and configure it to forward requests to the Remote RADIUS Server Group called RDGW that we created before. Make sure the policy is enabled. Below is what the policy should look like.
clip_image034[4]


Create another policy called From RDGW, again using the source Remote Desktop Gateway. Set the condition to Client IPv4 Address and enter the IP address of the RD Gateway server. Configure it to handle requests locally. Make sure the policy is enabled. Below is what the policy should look like.
clip_image036[4]

Preparing the user account for Azure MFA
Since our test user rdstestmfa@rdsgurus.com is new to the organization, we first need to make sure the account is successfully configured to use Azure MFA. If your users are already configured for Azure MFA, you can obviously skip this step.

An easy way to do this is to go to portal.office.com and sign in with the account. Since our test account is enforced to use Azure MFA, the portal will prompt us to configure MFA before we can continue. Click Set it up now to start that process.
clip_image038[4]

In this case I chose Mobile App as the authentication method, downloaded the Azure Authenticator app for iOS and used it to scan the QR code on the portal. The Azure Authenticator app is available for Android, iOS and Windows Phone.
clip_image040[4]

Click Done. To complete the verification, Azure MFA will now send an MFA request to the configured phone number of the user account.
clip_image042[4]

The user account is now ready to use for our RD Gateway setup! If you want more detailed information on the Azure MFA Service, you can find that here: What is Azure Multi-Factor Authentication?

Testing the functionality
It’s now finally time to take a look at the end result!

You can basically use any RDP client that supports RD Gateway. For this scenario we’ll use the RD Web Access page. We log on to RD Web Access with our rdstestmfa@rdsgurus.com account and open the desktop. In this case we used a full desktop scenario, but this could also have been a RemoteApp. The RDP client launches, showing the state Initiating Remote Connections.
clip_image044[4]

A few seconds later, the NPS Extension will be triggered to send Azure MFA a request to prompt our user for two-factor authentication.
clip_image046[4]

After pressing Verify on the Phone and without the user having to interact with the computer, the status changes to Loading the virtual machine.
clip_image048[4]


And the desktop is then launched.
clip_image050[4]

The end result is a great and seamless experience for the end user, similar to using Azure MFA Server, but this time with NPS contacting Azure MFA directly. This is a great improvement!

Eventlogs
When troubleshooting this setup, there are several event logs that can come in handy.

The RD Gateway role logs events in the following location:
Application and Services Logs > Microsoft > Windows > Terminal Services Gateway
Below is an example of the event that shows that the end user met the requirements of the RD CAP.
clip_image051[4]
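If you prefer to query these channels from PowerShell instead of Event Viewer, Get-WinEvent works fine against the RD Gateway operational log; the same approach applies to the NPS and AzureMfa channels (adjust the log name accordingly).

# Query the 20 most recent events from the RD Gateway operational log
Get-WinEvent -LogName "Microsoft-Windows-TerminalServices-Gateway/Operational" -MaxEvents 20 |
    Select-Object TimeCreated, Id, LevelDisplayName, Message |
    Format-Table -Wrap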

The NPS service logs events in the following location:
Custom Views > Server Roles > Network Policy and Access Services
Below is an example of NPS granting access to a user. You can also check the Windows Security log for auditing events.
clip_image052[4]

And finally, the NPS Extension logs events in the following location:
Application and Services Logs > Microsoft > AzureMfa
clip_image054[4]

Additionally, you can also use the Azure MFA Portal to create reports on the usage of Azure MFA.
clip_image056[4]

Conclusion
This article ended up becoming more than 2,500 words, but I hope you find it valuable. To reiterate what was explained in the introduction: MFA Server is a great solution, but the big downside has always been the mandatory MFA Server itself (and in most cases you would set up two of them for HA). The other downside is that it was yet another MFA provider for your end users. With the introduction of the NPS Extension for Azure MFA these downsides are now gone! You can secure your RDS environment with Azure MFA without the need for MFA Server and a separate MFA provider. I really believe this is a game changer, not only for this scenario but also for all other scenarios like VPNs, websites et cetera where Azure MFA Server is currently in place. Great job by Microsoft, and I'm looking forward to this extension becoming GA!