After 20 years of Windows Server, Microsoft looks to agile future

Windows Server 2016 still comes with a traditional, graphical server interface, but that’s not the future for the server operating system, according to its chief architect.

Windows Server hasn’t quite reached its silver anniversary (the first version, Windows NT 3.1, shipped in July 1993, although it first saw the light of day in a demo in 1991). But 23 or 25 years later, depending on how you want to count, Windows Server is becoming a very different server operating system. Part of that is due to a new focus at Microsoft on supporting the platforms that customers want to use.

The change in focus at Microsoft reflects a move in the industry towards new development models that take advantage of cloud services, containers and microservices, a trend that Jeffrey Snover, lead architect for the Enterprise Cloud Group and the Microsoft Azure Stack, calls “born in the cloud.” It’s a long way from the first version of Windows NT that ran on a 486 processor, but it’s not the first time that Windows Server has changed so fundamentally.

In fact, Snover breaks down the history of Windows Server into four eras, based on the problems Microsoft was trying to solve at the time: “the server for the masses era, the enterprise era, the data center era and now the cloud era.”

Four eras, three architects

The three architects who designed Windows Server over the years all came to Microsoft from Digital Equipment Corp. “Dave Cutler was the first. He came in and gave us the great kernel that led us through the ‘server for the masses’ era. Then Bill Laing took over as chief architect; he was a big enterprise guy and he really took that enterprise approach to the server.”

[Photo: David Cutler at work on Windows Azure. Credit: Majorconfusion/Wikimedia Commons]

Both Cutler and Laing moved on to work on Azure, Snover says. “Then I took over as chief architect and focused on the management aspects of it and the cloud aspects of it.”

“At the heart of Windows Server is Dave Cutler’s great kernel, the object-based kernel and the separation of responsibilities. That's been the enduring thing, the heart and soul of Windows,” Snover says. But to be successful, it needed to be both manageable and affordable.

“The thing that made Windows so successful was matching that kernel with a great desktop experience and then running it on PC class hardware,” Snover says. “That combination meant that … now anybody could buy their own server, they could deploy their own server and they could run it. That really was the magic.”

Windows Server also succeeded, release after release, by adding features that you had to pay extra for on other operating systems, from transaction management tools and web servers, to software-defined storage and networking, and virtualization management.

“That is the heart of Microsoft,” Snover says. “First we take what used to be only available to princes and high priests and we make those things available to everyone because they can be run by everyone; we simplify things so they can be run by everyone. And then we run them at economics that mean they can be afforded by everyone.”

The rise and fall of the GUI

Alongside the ever-increasing list of features, the management options of Windows Server gradually changed, to the point where Nano Server, a minimal version of Windows Server 2016, has no graphical interface at all. Snover calls this the future of Windows Server and says it’s “by far the most important, most significant change in Windows Server since we came out with Windows NT.” Nano Server is a ‘major refactoring’ of Windows Server into a much smaller footprint that is more secure and needs far less patching, and it has no local interface at all; you can manage it only remotely.
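
That refactoring shows up in how you deploy it: in Windows Server 2016, Nano Server isn’t an option in the setup wizard but an image you build from the installation media with the NanoServerImageGenerator PowerShell module that ships alongside it. A minimal sketch, where the drive letter, paths and computer name are illustrative rather than prescribed:

```powershell
# Import the image builder from the \NanoServer folder of the
# Windows Server 2016 media (D:\ is an assumption for this example)
Import-Module D:\NanoServer\NanoServerImageGenerator\NanoServerImageGenerator.psd1

# Build a Standard-edition Nano Server VHD intended to run as a Hyper-V guest
New-NanoServerImage -MediaPath D:\ `
    -BasePath .\NanoBase `
    -TargetPath .\NanoServer.vhd `
    -ComputerName NANO01 `
    -DeploymentType Guest `
    -Edition Standard
```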

If you look back, you can see the trend that led to Nano Server, even if it wasn’t obvious at the time. “In the enterprise era of Windows Server, we still had the local GUI,” Snover notes. “You just did it remotely with RDP. You still had a local management interface, we just remoted the GUI stack. But that wasn’t going to work for large systems in data centers, so we had to switch to automation.”

That automation was based on PowerShell, a command-line scripting and configuration management system built on .NET that Snover helped to design. If you’re already comfortable with PowerShell, moving to Nano Server will be less painful, but even so, it will take some effort to get there. Snover says Microsoft realized during the development of Windows Server 2016 that automation isn’t the right answer for every business.
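
In practice, that means administering a server you never log on to directly. A minimal sketch of what day-to-day PowerShell remoting against a headless machine looks like; the computer names are placeholders, and the TrustedHosts step is only needed outside a domain:

```powershell
# Trust the headless box from the admin workstation
# (workgroup scenarios only; in a domain Kerberos handles this)
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'NANO01' -Force

# Interactive session: the rough equivalent of RDPing into a GUI server
Enter-PSSession -ComputerName NANO01 -Credential (Get-Credential)

# Or run one command against many servers at once, which is where
# automation beats a remoted GUI
Invoke-Command -ComputerName NANO01, NANO02 -ScriptBlock {
    Get-Service | Where-Object Status -eq 'Running'
}
```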

Snover jokes that “architecture is the art of deciding when one thing should be two and two things should be one,” but behind the joke is a serious point about the different roles that Windows Server needs to perform.

[Photo: Jeffrey Snover participates in a PDC panel discussion in 2009. Credit: Flickr/Microsoft PDC]

“Back in the Windows NT days, we had one thing. We had this great combined kernel and desktop operating system. Now what we need is two things. We need that first thing for small businesses, for people who don’t have the skills to run very large systems. But then we also need a version that doesn't have that desktop operating system. That's the version that the big enterprises, the clouds and these new born-in-the-cloud applications need, something that has that great kernel and then has as many features as they need, and no more.”

“In the past, because of the way we did software, it was a benefit to have everything there. If you needed it, you used it. And if you didn’t need it, you didn’t use it. It wasn’t a big deal.” That kitchen sink approach isn’t realistic any more, he maintains.

“Today, we have so many resources, so why not continue with that model? The answer is that it's all about efficiency. It’s about speed and agility. With Nano Server, when you boot it up, you use half the kernel memory, and that means you can have more and more instances on the same hardware. But it’s also about security. Security has become even more important, so you need something that's absolutely lightweight and has only the features you need and use.”

Addressing two very different markets doesn’t mean compromising on either of them, Snover insists. “The reality is that Windows Server 2016 is great for the cloud era but it's also great as a server for the masses. In fact, it's better now, because now it's really the client experience with the server features added.”

That’s important, because it means that Windows 10 and Windows Server 2016 are consistent, which is what Snover says customers asked for. “When we did the technical previews for Windows Server 2016, which didn’t have that client option, customers gave us a very strong message; ‘no, that’s not right – we still need that, we need a great client experience for server.’ ”

There are potential drawbacks to this approach, Snover warns. “The ramification is you can no longer go from ‘server with a desktop’ to Server Core. But it turns out that the ability to do that was important when people weren't sure whether Server Core would meet their needs…. Now we're confident that Server Core has everything people need and they can be successful with it.”

Snover says that Server Core benefits from the work Microsoft has done to create the minimal Nano Server option. “Frankly, the focus on Nano Server has driven the clean-up of the long tail of manageability; because if you can't do it remotely with Nano Server, you can't do it at all.” And yes, that is a big change in what it means to work with Windows Server. “There’s a little bit of Cortés burning his ships,” he admits.

Evolving development models

Another way that Windows Server has stayed relevant for so long is by supporting new application models, from client-server to n-tier, n-tier plus web, and now cloud.

Again, Snover highlights how important it was to make application development available to more businesses. “[In] Windows Server 2003, .NET really allowed that line of business app explosion…. Before that, you could write a VBScript program, but now you could write a real mission critical application because you were freed up from the minutiae of things like memory allocation.”

Microsoft has moved quickly to support containers in Windows Server and Azure as they’ve become popular, but Snover notes that the idea of containerization isn’t as new to Windows as you might think. “Think of it less as ‘we didn’t have it and now we do’ and more as a linear scale. We used to have processes, then we had job objects, which did some resource management, then we had virtual machines, which did namespace isolation. Containers really fuse these things together and put a better user experience on top. Windows Server containers are a great improvement to our job object with better resource management and basically a namespace switch in the OS that gives you that isolation.”
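
On Windows Server 2016, those containers surface through the familiar Docker toolchain. A brief sketch from a PowerShell prompt, assuming the Containers feature and the Docker engine are already installed and using the 2016-era microsoft/nanoserver base image:

```powershell
# Process-isolated Windows Server container: shares the host kernel,
# much like a job object plus the namespace switch Snover describes
docker run --rm microsoft/nanoserver cmd /c ver

# Hyper-V isolation layers a VM-style boundary on top of the same image
docker run --rm --isolation=hyperv microsoft/nanoserver cmd /c ver
```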

Hardware improvements have made these techniques much more widely applicable, Snover says. “The increased networking bandwidth, speed and lower latency mean now I can do with protocols things that I could only ever do in the past with DLL calls — and because of that I can now separate things into their own environment, where they have their own versioning and their own lifecycle. That's the big thing of this era. In principle you could always do that, but the network was just so expensive that it made no sense — or you had to work at really high levels of abstraction.”

Then there’s the emerging idea of “serverless computing” (like Azure Functions and AWS Lambda). “Of course there's a server there, but you don’t have to worry about it. The Azure Automation service has been doing this for a while: Give us your PowerShell code and we’ll run your code. You don’t have to worry about the server and setting it up, we handle that. The way we do that, there is no server but when your code runs, we fire up a server and put your code on it and run it. And when your code's done, then we throw that server away. And by the way, in that environment, having a very small, very lightweight, very fast server [like Nano Server] is very important.”
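
From the user’s side, that model reduces to handing over a script and letting the service worry about the machine it runs on. A hypothetical runbook of the sort Azure Automation executes; everything about provisioning and discarding the worker happens outside this code:

```powershell
# Hypothetical Azure Automation runbook: nothing here creates or manages
# a server; the service supplies one per run and throws it away afterwards
param(
    [string]$Name = 'world'
)

Write-Output "Hello, $Name - run started at $(Get-Date -Format o)"
# ... the actual work would go here ...
Write-Output 'Run complete; the worker that executed this is discarded.'
```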

The question with all of this is: how ready are Microsoft’s business customers for these changes? After all, enough businesses are still hanging on to older versions of Windows Server that Microsoft attributes some of the recent growth in its enterprise services revenue to “customer demand for Windows Server 2003 end of support agreements.” That means those businesses are paying extra to get security updates for a version of the OS that fell out of support a year ago.

“It is a big difference,” Snover admits. “And with any big change, there are people who get there early, there are people who wait and hover, and there are people who hang on for dear life to where they are.” But he points out that this kind of transition is nothing new.

“As we go from one model to another, there’s always a period of chaos and confusion. There’s an old model that people try to adhere to, but it ceases to solve the problems people want to solve. If the old model solves the problem they want to solve, it works fine for them. Windows Server 2016 is a great OS for someone who wants to buy a server and attach touchscreen monitors and say ‘hey Cortana, launch IIS’.”
