
Integrating Active Directory into Windows Azure Virtual Machines

Active Directory Deployment Considerations and Implementation Steps in Windows Azure

Migrating traditional client/server applications to Windows Azure Virtual Machines is what Don Noonan does every day. The majority of these workloads use Active Directory Domain Services as their authentication provider, or, in other words, classic Windows authentication. In this post Don walks us through the best-practice, high-level architecture and the basic building blocks of creating a private forest within Windows Azure.

If Active Directory is not available, you better be
As we all know, if AD is down, so is your app. Imagine setting up a single domain controller responsible for both name resolution (DNS) and authentication. You've just created another synonym for single point of failure. At a minimum you should deploy two (2) domain controllers, and they should be created as part of an Availability Set. This ensures that at least one (1) domain controller is always available to answer authentication and name resolution requests. If you're considering saving a few bucks by deploying a single domain controller in non-production environments, let me save you a few more. The first call you get from development or QA will cost you at least six months of compute. Telling a dozen upset people on a conference call that you wanted to save the company $50/month will sound pretty bad…

A private forest for me? Oh, you shouldn't have
There are currently two major scenarios for providing Windows authentication in Windows Azure Virtual Machines:

  • Deploy a new private forest
  • Extend an existing on-premises forest

In this blog we'll cover deploying a new private forest. Here is a quick Visio diagram of a classic 3-tier application (using Windows Azure features) to get us started:

As you can see, we have a management subnet that contains our domain controllers, as well as separate database and application “tiers”.

Stop Talking and Start Deploying
As with any new deployment to Windows Azure Virtual Machines, you will perform the following high-level steps:

  1. Create an affinity group (See Bob Hunt’s Article in the Series)
  2. Create a virtual network (See Bob Hunt’s Article in the Series)
  3. Create a storage account (See Kevin Remde’s Article in the Series)
  4. Create virtual machines (See Tommy Patterson’s Article in the Series)

While creating the virtual network, you will need to specify that the domain controllers will also be providing name resolution for all of the servers in your deployment. You can do this in the Windows Azure management portal as well as through the management web service. Here is how you do this via PowerShell:

Specifying custom DNS servers using PowerShell

Example command line:

Set-AzureVNetConfig -ConfigurationPath "C:\networkConfiguration.xml"

Contents of C:\networkConfiguration.xml:

<NetworkConfiguration>
  <VirtualNetworkConfiguration>
    <Dns>
      <DnsServers>
        <DnsServer name="skydc01" IPAddress="10.1.1.4" />
        <DnsServer name="skydc02" IPAddress="10.1.1.5" />
      </DnsServers>
    </Dns>
    <VirtualNetworkSites>
      <VirtualNetworkSite name="skyvn" AffinityGroup="skyag">
        <AddressSpace>
          <AddressPrefix>10.1.0.0/16</AddressPrefix>
        </AddressSpace>
        <Subnets>
          <Subnet name="Management">
            <AddressPrefix>10.1.1.0/24</AddressPrefix>
          </Subnet>
          <Subnet name="Database">
            <AddressPrefix>10.1.2.0/24</AddressPrefix>
          </Subnet>
          <Subnet name="Middleware">
            <AddressPrefix>10.1.3.0/24</AddressPrefix>
          </Subnet>
          <Subnet name="Application">
            <AddressPrefix>10.1.4.0/24</AddressPrefix>
          </Subnet>
        </Subnets>
        <DnsServersRef>
          <DnsServerRef name="skydc01" />
          <DnsServerRef name="skydc02" />
        </DnsServersRef>
      </VirtualNetworkSite>
    </VirtualNetworkSites>
  </VirtualNetworkConfiguration>
</NetworkConfiguration>

In the example above, the IP addresses used assume the domain controllers are the first virtual machines created on the Management subnet. Let’s make sure that’s true by creating them now:

Creating Highly Available Domain Controllers using PowerShell

Relevant excerpts from createService.ps1:

# Shared settings for both domain controllers
$instanceSize = 'Small'
$imageName = 'MSFT__Win2K8R2SP1-Datacenter-201210.01-en.us-30GB.vhd'
$subnetName = 'Management'
$availabilitySetName = 'skydc'

# First domain controller
$password = '@skyDc01'
$vmName = 'skydc01'
$skydc01 = New-AzureVMConfig -Name $vmName -AvailabilitySetName $availabilitySetName -ImageName $imageName -InstanceSize $instanceSize |
    Add-AzureProvisioningConfig -Windows -Password $password |
    Set-AzureSubnet $subnetName

# Second domain controller, in the same availability set and subnet
$password = '@skyDc02'
$vmName = 'skydc02'
$skydc02 = New-AzureVMConfig -Name $vmName -AvailabilitySetName $availabilitySetName -ImageName $imageName -InstanceSize $instanceSize |
    Add-AzureProvisioningConfig -Windows -Password $password |
    Set-AzureSubnet $subnetName
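
The excerpt above only builds the two VM configurations; they still have to be handed to New-AzureVM to be deployed into the virtual network. A minimal sketch of that final step, assuming a cloud service name of 'skysvc' (the service name is not shown in the original excerpt; 'skyvn' and 'skyag' match the network configuration file above):

```powershell
# Deploy both domain controller configurations into the virtual network.
# 'skysvc' is an assumed cloud service name; adjust to suit your deployment.
New-AzureVM -ServiceName 'skysvc' `
    -AffinityGroup 'skyag' `
    -VNetName 'skyvn' `
    -VMs $skydc01, $skydc02
```

Because both VMs share an availability set, Windows Azure places them in separate fault and update domains, which is what makes the two-domain-controller design actually highly available.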

Once you've created the servers, you will need to make them domain controllers, a process known as promotion.

Promoting a Server to a Domain Controller using DCPROMO or PowerShell

Depending on which operating system you have chosen, you can automate forest creation from the command line. In the following examples, be sure to replace DOMAIN_HERE with the desired domain name, and replace the password placeholders with the temporary password you assigned to the local administrator account on the first (primary) server.

Windows Server 2008 R2 – Create a new forest using DCPROMO

dcpromo.exe /unattend:C:\primaryDomainController.txt

Contents of C:\primaryDomainController.txt:

[DCInstall]
; New forest promotion
ReplicaOrNewDomain=Domain
NewDomain=Forest
NewDomainDNSName=[DOMAIN_HERE].com
ForestLevel=4
DomainNetbiosName=DOMAIN_HERE
DomainLevel=4
InstallDNS=Yes
ConfirmGc=Yes
CreateDNSDelegation=No
DatabasePath="C:\Windows\NTDS"
LogPath="C:\Windows\NTDS"
SYSVOLPath="C:\Windows\SYSVOL"
[email protected]
RebootOnCompletion=Yes

Windows Server 2012 – Create a new forest using PowerShell

C:\primaryDomainController.ps1

Contents of C:\primaryDomainController.ps1:

Import-Module ADDSDeployment
Install-ADDSForest `
    -CreateDnsDelegation:$false `
    -DatabasePath "C:\Windows\NTDS" `
    -DomainMode "Win2012" `
    -DomainName "[DOMAIN_HERE].com" `
    -DomainNetbiosName "DOMAIN_HERE" `
    -ForestMode "Win2012" `
    -InstallDns:$true `
    -LogPath "C:\Windows\NTDS" `
    -NoRebootOnCompletion:$false `
    -SafeModeAdministratorPassword (ConvertTo-SecureString 'PASSWORD_HERE' -AsPlainText -Force) `
    -SysvolPath "C:\Windows\SYSVOL" `
    -Force:$true

Part of your homework will be to create the second domain controller in the new forest; you will need to make slight changes to the answer files above.
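
To get you started on that homework, here is a hedged sketch of what the Windows Server 2012 version looks like: the second server joins the existing domain as a replica rather than creating a new forest. The domain name and passwords are placeholders, and skydc02 must already be resolving DNS via skydc01 (10.1.1.4):

```powershell
# Run on skydc02 to promote it as an additional (replica) domain controller.
Import-Module ADDSDeployment
Install-ADDSDomainController `
    -DomainName "[DOMAIN_HERE].com" `
    -InstallDns:$true `
    -DatabasePath "C:\Windows\NTDS" `
    -LogPath "C:\Windows\NTDS" `
    -SysvolPath "C:\Windows\SYSVOL" `
    -SafeModeAdministratorPassword (ConvertTo-SecureString 'PASSWORD_HERE' -AsPlainText -Force) `
    -Credential (Get-Credential "[DOMAIN_HERE]\Administrator") `
    -Force:$true
```

Note the key difference from the forest creation script: Install-ADDSDomainController instead of Install-ADDSForest, plus a -Credential for an account that is already a domain administrator.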

What's Next?

Creating the rest of the servers required by your application seems like the logical next step. However, there are a handful of important tasks I like to complete before creating ANY additional virtual machines:

  1. Create domain user accounts that will be used for future system administration.
  2. Create containers for major objects such as server computer accounts.
  3. Create core group policies for significant items such as:

  • Remote Desktop Services – Enable Keep-Alives (article posted previously at Skylera)
  • User Account Control
  • Windows Firewall
  • Windows Update
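
The account, container, and group-policy tasks above can all be scripted with the ActiveDirectory and GroupPolicy modules once the forest is up. A minimal sketch (the OU, account, and GPO names here are illustrative, not from the original post):

```powershell
Import-Module ActiveDirectory

# Container (organizational unit) for server computer accounts.
New-ADOrganizationalUnit -Name "Servers" -Path "DC=DOMAIN_HERE,DC=com"

# Domain user account for future system administration.
New-ADUser -Name "svc-admin" `
    -Path "OU=Servers,DC=DOMAIN_HERE,DC=com" `
    -AccountPassword (Read-Host -AsSecureString "Password") `
    -Enabled $true

# Empty core group policy object; linking it to an OU is a separate step.
Import-Module GroupPolicy
New-GPO -Name "Core - Windows Update"
```

Doing this before building application servers means every new VM lands in the right OU and picks up the core policies on its first group policy refresh.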

Important Considerations

When creating a private forest, consider the amount of administrative overhead involved vs. level of isolation. For example, you may want to have a single forest for all pre-production environments so that you only need to perform user account tasks in one place. This is easy to do in Windows Azure.

Written by Don Noonan (Don's Blog at Skylera)

Edited by Tommy Patterson (Tommy's Blog on Virtuallycloud9.com)

Be Sure to Read Up on the Rest of the Series for 31 Days of Servers in the Cloud!

http://virtuallycloud9.com/index.php/31-days-of-servers-in-the-cloud-series/

