by admin on March 31, 2006 03:00pm
Sam Ramji sits down with resident UNIX guru and Microsoft Architect Jason Zions to discuss the history of Windows and UNIX interoperability.
Format: WMV Duration: 40:43
Updated: Download the transcript (PDF)
by admin on March 31, 2006 02:00pm
Microsoft’s Open Source Software Lab is an ambitious research project. Located on the company’s main campus, the lab houses more than 300 servers, which collectively run more than 15 versions of UNIX and 50 Linux distributions. It boasts a team of senior-level programmers and system administrators, some of whom were architects of popular Linux distributions or authors of well-regarded books. In short, the lab is one of a few such facilities in the world dedicated to open source research.
The driving force behind the lab is Bill Hilf, General Manager of Platform Strategy at Microsoft. Hilf joined the company in 2004 after working at IBM, where he was instrumental in driving IBM's Linux technical strategy for its emerging and competitive markets organization. Prior to his stint at IBM, Hilf was VP of Engineering at E-Toys, where he helped build the company's e-commerce infrastructure.
When Hilf speaks about the lab and his involvement, the usual response he gets is, “At Microsoft? Why run Open Source in a mixed environment at Microsoft?” While theories abound—ranging from “Microsoft is working on its own Linux implementation” to “Microsoft is considering porting Windows to Linux”—the truth is far simpler. The lab provides Microsoft with deeper insight into the world of open source software, and it helps the company improve how Microsoft products work with open source software.
“Contrary to the belief that Microsoft is anti-open source, the reality is not so black-and-white,” says Hilf. “Most customers don’t live in an either/or world, nor do they choose a technology based on its development model. Instead, they choose a technology based on its ability to serve a business need or solve a particular problem. By running open source software in a Windows environment, we’re learning how those technologies can work better together so that our customers can benefit from a broader range of choices.”
One of the issues being addressed in the lab is how Microsoft management tools can do a better job in heterogeneous environments. For example, for customers who are using Microsoft Systems Management Server or Microsoft Operations Manager and need to manage a Linux or UNIX server, the lab can provide input on which third-party technologies can enable that scenario.
Another example is the testing the lab has done with Windows Server 2003 R2, which includes a variety of UNIX-based services like Network File System (NFS) and Network Information Service (NIS). The lab tested those services, collectively called the Subsystem for UNIX-based Applications, extensively to see how well they could interoperate with open source software in a data center environment.
Practicing the Art of Coopetition

Although testing interoperability between Microsoft products and open source software is one of the lab’s primary roles, it’s not the only one. The lab also helps Microsoft build better products through a deeper understanding of open source software.
“Licensing restrictions permitting, we analyze and benchmark open source software in areas where Microsoft competes or has an interest,” says Hilf. “We share those results with other teams at Microsoft, who use the data to determine how we can improve our own products.”
One recent example is the work the lab did for the Microsoft Windows Compute Cluster Server 2003, which the company announced in late 2005 as part of its entrance into the high performance computing (HPC) market. Today, that market is largely dominated by Linux.
“When the product team first began building Compute Cluster Server, they asked us to find the best HPC solution from an open source perspective,” says Hilf. “We built a large, clustered system using Linux and did extensive benchmarking, then we wiped out that installation and ran the same tests for Compute Cluster Server. The data we collected will help us to deliver a more compelling product.”
“Both Windows and open source software will continue to be around for years to come, so it’s important that we test and analyze interoperability with open source software even if we may sometimes compete with some of this software – this is the real world where mixed environments exist,” says Hilf. “Coopetition – cooperating and competing – is part of the real world. Customers exist in the real world so we focus on what they care about, not what people philosophize about.”
Although there are many different ways that Microsoft could gain that desired knowledge about open source software, Hilf believes that one of the most effective ways is through a hands-on approach in which his team must address the same challenges as customers who run open source software in real-world scenarios.
“Deeply understanding a technology without actually using it would be like trying to deeply understand a foreign country without spending any time there,” says Hilf. “Listening to Berlitz language CDs or reading travel guides might help familiarize you with a foreign culture before you visit, but you’ll remain a tourist until you’ve lived there for a while.” Rather than function as a third-party trying to understand the open source phenomenon by looking in from the outside, the Microsoft Open Source Software lab is immersing itself deep into this space, relying on hands-on experience and hiring the necessary technical expertise to generate fact-based, unbiased information.
“We’re out to find the science that proves or disproves the statements made about open source software, so that we don’t need to guess or draw abstract conclusions,” says Hilf. “By being a center of knowledge and competency, we’re able to provide hard facts to Microsoft product teams when they ask questions on the state of management for open source software or the state of a certain open source application.”
A Piece of Fiber and a Hole in the Wall

When Hilf was asked to build a lab and hire a team of researchers, he had no idea that he would literally be starting from scratch. Microsoft runs everything on Windows, yet Hilf had to make the lab resemble a real-life, open source environment, meaning that he would probably get limited help from Microsoft’s IT group.
“During one of my first days on the job, I stood in an empty room while some IT guys threaded a network cable through a hole in the ceiling,” said Hilf. “I was still standing there, staring at the piece of fiber, when one of the guys came downstairs and said, ‘That's it—you're on your own now.’ Other than that cable, we literally had to build the lab from scratch.”
Hilf’s first step was to hire the lab’s staff, which would be its most important asset. The lab employs a mix of employees and contractors, all of whom have been senior developers or systems administrators in the open source community. Some have been chief architects or technical leads for Linux distributions, such as Daniel Robbins, the founder of Gentoo Linux, who worked in the lab from June 2005 to December 2005.
Other lab staffers have deep UNIX experience and are authors of UNIX books or tools. The lab also boasts open source software security experts, embedded developers, virtualization and clustering experts, and developers with strong backgrounds in GTK+, GNOME, KDE, and Localization.
After hiring a staff, Hilf began to assemble a vast array of hardware, software, and applications. The lab contains more than 300 servers from vendors including Dell, Hewlett-Packard, IBM, Microtel, Penguin, Pogo, and Sun. The lab’s software is even more diverse, with some 15 versions of UNIX and 50 distributions of Linux, including many lesser-known systems such as Asianux, CentOS, and NetBSD.
“We run dozens of different versions of Linux to test open-source interoperability in a multitude of scenarios,” says Hilf. “And because we do everything on our own, from running our own network and security services to patching and updating, our environment mimics those of real customers. If a Microsoft product makes it through this lab, it will probably survive in 90 percent of the UNIX, Linux, and open source customer environments out there.”
An Open Source Bubble in a Sea of Microsoft

One of the more interesting and unexpected dynamics at the lab is that this very large open source and UNIX shop is surrounded by the world’s largest all-Microsoft IT environment, which includes Windows-based security services, Internet proxies, mail services, and other IT infrastructure elements.
“Customers frequently ask us how we manage open source software inside such a Microsoft-centric IT environment,” says Hilf. “They want to know how we get the platforms to work together, how we handle software deployment, and what kind of tools we use. We’ve had to figure out ways to interoperate not just within the lab, which itself is incredibly complex and diverse—but also between the lab and the rest of Microsoft.”
The lab’s breadth of management tools parallels its diverse servers, operating systems, and applications. For software management and distribution, the lab uses a combination of Microsoft Systems Management Server and mainstream open source solutions and services, such as Vintela VMX, Kickstart, Red Carpet, Portage, and Red Hat Network. To remotely manage the lab’s more than 300 systems, lab staffers use SSH, VNC, X11 tunneling, and Windows Terminal Services.
While it’s highly unlikely that any single customer would run such a diverse range of technologies, mimicking a broad range of scenarios allows lab personnel to better understand the challenges that customers face and hopefully play a role in remedying those issues.
“Running such a diverse range of technologies within a Windows-centric IT infrastructure has allowed us to test interoperability on a daily basis,” says Hilf. “In the process, we’ve learned some very interesting things. Some are simple, like how to access the Internet from Linux machines behind a Microsoft ISA proxy. Others are more complex, like how to set up highly mixed storage and backup systems.”
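As a rough illustration of the simpler case Hilf mentions, here is a minimal Python sketch of pointing a Linux client at an HTTP proxy. The proxy host name and port are invented for this example, and ISA Server deployments of that era often also required NTLM authentication, which this sketch does not attempt:

```python
import os
import urllib.request

# Hypothetical proxy address -- not the lab's actual configuration.
PROXY = "http://isaproxy.example.corp:8080"

# Most Linux command-line tools (wget, curl, apt, yum) honor these
# environment variables when making HTTP requests.
os.environ["http_proxy"] = PROXY
os.environ["https_proxy"] = PROXY

# Python code can instead wire the proxy in explicitly with a handler.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
)

print("Proxy configured:", os.environ["http_proxy"])
```

Because ISA typically authenticated clients with NTLM, plain environment variables were often not enough on their own; intermediary NTLM authorization proxies running on the Linux side were a common workaround at the time.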
Building and Testing Interoperability at the Lab

One of the lab’s more interesting discoveries came about while testing the interoperability of management tools—specifically, in extending Microsoft Systems Management Server (SMS) so that it can be used to manage UNIX, Linux, and even Apple systems.
“SMS was built to use an open protocol—WBEM—to communicate with other software that runs on non-Microsoft systems,” says Hilf. “By using that capability to extend SMS with Vintela Management Extensions (VMX), we’re able to manage all of our servers and desktops through a single interface.”
Another useful lesson in how commercial third-party software can be used to extend Microsoft products for use in a non-Windows environment involved Microsoft partners Centrify and Vintela, whose solutions the lab used to integrate its UNIX and open source systems with Microsoft’s Active Directory directory service, which provides identity, user access, and policy management services.
“The combination of Active Directory and Centrify Direct Control gives us a really powerful single authentication solution across a highly mixed environment. Although there are other ways to do this, we’ve had good success with this solution,” says Hilf.
The lab also played a role in helping test support for Linux in Microsoft Virtual Server 2005, Service Pack 1, which can virtualize Linux and Sun Solaris operating systems on servers running Windows. “We ran all 50 Linux distributions as guest operating systems on a single machine running Virtual Server,” says Hilf. “It worked out very well because we didn’t need a separate server to test each version of Linux.”
In “A Look Inside Microsoft’s Open Source Software Lab – Part 2” (available here the week of 4/10/2006), we examine some of the lab’s other areas of focus, including work to facilitate skills transfer between Windows-based and UNIX/open source environments. We also look at research the lab is doing to better understand other aspects of open source software and where it may be headed next, including a look at the community development model and other key trends.
by admin on March 31, 2006 01:00pm
Getting the Open Source Software Lab up and running presented a number of challenges – not the least of which was how we were going to manage fifty Linux distributions, fifteen versions of UNIX, and multiple Windows instances deployed across literally hundreds of physical and virtual servers. This is quite a job for any management solution. Being the pragmatists we are, we decided to use this to test the viability of SMS (Microsoft Systems Management Server) using VMX (Vintela Management Extensions) in a mixed environment.
We deployed the solution and found it to be capable of handling our environment. Currently, a large part of the lab is managed by SMS and VMX. When we describe this to people, we are often asked, "Why does Microsoft support this kind of solution? Why do we care about mixed environments?"
We asked Bill Anderson, Lead Program Manager on the Windows Management Team, and here is what he had to say:
[Photo caption: Bill Anderson. Not really, but his lab is less camera-shy.]
The first question I always get asked is, “What really was the catalyst for SMS to seek out a partner to provide extensions to OSS/Linux?” Simple – our customers demanded it. Our existing SMS customers are managing both desktops and servers, have a multitude of platforms in production in those environments, and want to extend the success they have with SMS on Windows to those additional platforms. And as we drive SMS into new customer accounts, an integrated solution to manage all their critical platforms has become one of the top requirements.
Now, the second driver was the WAY in which the market was doing cross-platform management. It’s, well, “suboptimal”. You either take 2 management systems (Windows mgmt, non-Windows mgmt) with their own array of servers, agents, and databases – and join the databases, or you try to take one agent that runs on all platforms, and you can then only join the things that are the same/similar. You either get a bunch of extra infrastructure with no leverage of skillsets, or you get a lowest common denominator management experience.
What we did was option 3 – build a single shared infrastructure that was extensible at the protocol, data, and UI layers, and then have the two leaders in the field build from that same plumbing. So we optimized our agents for work on Windows, and we worked with the Vintela team, the experts in managing OSS/Linux, to really optimize their experience for that platform.

So what does a customer get? One database, one UI, one protocol, and agents unique to each platform. Low operational cost, leveraged skillsets, and the opportunity for each vendor to really highlight the best they can do on each platform. Some of the things that Vintela can surface and manage on the Linux platform, using SMS as a pipeline, are pretty amazing! They’ve extended our UI to really expose all the remote functions available on Linux from the different vendors like Red Hat, SuSE, HP, and Sun. My challenge to them was to make Linux look BETTER in SMS than Windows does. We’ll try to make Windows more manageable by adding more, not by restricting. And the results are pretty compelling.

As Andi put it in Network World's Network/Systems Management Newsletter: “Yes, you read that correctly - Microsoft tools can make Linux management easier. To its credit, Microsoft has made this easier through partnerships and programs like its Dynamic Systems Initiative - a commitment from Microsoft and its partners to deliver self-managing dynamic systems…(snip). This allows enterprises to leverage their investment in native Windows tools to make them a very effective management platform for diverse networks.”
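The "option 3" design Anderson describes (one shared data store and pipeline, with platform-specific agents plugged in) can be illustrated with a toy model. Everything below is invented for illustration; it sketches the shape of the architecture, not how SMS or VMX is actually implemented:

```python
# Toy model: one shared inventory store, one reporting pipeline,
# and agents that each expose platform-specific detail.
inventory = {}  # the single shared "database"

class Agent:
    platform = "generic"
    def collect(self, host):
        return {"host": host, "platform": self.platform}

class WindowsAgent(Agent):
    platform = "windows"
    def collect(self, host):
        data = super().collect(host)
        data["patch_source"] = "WSUS"    # a Windows-specific attribute
        return data

class LinuxAgent(Agent):
    platform = "linux"
    def collect(self, host):
        data = super().collect(host)
        data["package_manager"] = "rpm"  # a Linux-specific attribute
        return data

def report(agent, host):
    """Every agent reports through the same pipeline into the same store."""
    inventory[host] = agent.collect(host)

report(WindowsAgent(), "redmond-web01")
report(LinuxAgent(), "lab-gentoo03")
print(sorted(inventory))  # one store, heterogeneous platforms
```

The point of the design is visible even in this toy: the store, pipeline, and UI are shared, while each agent is free to surface whatever its platform does best, avoiding both duplicated infrastructure and a lowest-common-denominator feature set.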
How the Vintela solution works is really pretty simple. They take a WBEM-based agent (they are the project maintainer for OpenWBEM) that runs on the major OSS platforms and point it at a URL that is our Management Point role. They extend our MMC-based UI and voila – instant management for Linux! No database schema changes required, no separate middle-tier infrastructure, etc. Initially, there was an ISAPI.dll “gateway” they had built to convert their agent protocol to ours at the Management Point, but we’ve worked to eliminate even that, as they are now using our native protocols. As you can see, this is a slam dunk for a customer already using SMS to manage Windows who wants to extend it to manage Linux. It’s amazing to walk up to an SMS admin, open their admin UI, have them see machine collections based on Linux versions/vendors, and be able to send software to a group of Linux machines in about 3-4 clicks. But we’re even seeing customers use THIS as a solution for managing Linux only!
Vintela has done a great job of really just using the SMS UI, database, and pipes as their engine, and leveraging all the manageability on the native Linux platform to provide a great stand-alone tool for managing Linux. Inventory, software distribution, patch management and remote tools – all in one single UI and infrastructure. The other key is really leveraging the OpenWBEM work to provide consistent management on different Linux versions. The Vintela team has done a great job of driving consistency via OpenWBEM, but still leverage all the extra tools and functions provided by each Linux vendor. If I were managing Linux systems (not a lot of that around here by the way!) I’d definitely use it!
by admin on March 31, 2006 06:59am
Who would have guessed?
A sincere thank you for all the excitement and feedback since we launched Port25 last week. We’ve had a tremendous response and the conversation has been lively to say the least.
There have been hundreds of blog posts and hundreds of emails sent – both through the feedback aliases and many that you have sent directly to me. There have been rants, demands, questions, encouragement, suspicion, affirmation, ideas, pontifications and guidance. Many of you gave us technical advice (such as on video formats) that was valuable, and we’re making those changes – thank you for this input. Many of you have asked about the signal-to-noise ratio, and some of you have commented on this to me both on the blog and privately. I was pretty adamant about keeping the blog post system wide open to start, and about introducing a registration system only if you wanted us to. We’ve heard this loud and clear, and we’re looking into it now.
Let me clarify some things and hopefully set some expectations. Our goal with Port 25 is to have a community discussion for people working with OSS and Microsoft software. Many of you who know me know that I’m a no-BS type of guy and I’ve spent many years answering the 3am pager calls when problems arise in the data center. At 3am, there’s not a lot of interest in technology dogma and rhetoric. I’m now officially a PHB (hair notwithstanding) and although I don’t carry the pager, I haven’t lost this core principle. I understand there is going to be philosophy and zealotry, and that’s why I titled this ‘Who would have guessed?’, but the work our team does in the OSS research lab is heavily oriented around trying to understand and help with real customer interoperability. So this is the type of discussion that you'll hear from us, more than trying to answer why Microsoft doesn't give away all its software for free, etc.
This does not mean we won’t discuss the issues, we will, but I wanted to explain our intent and hopefully the community that grows here will be able to focus on productive and progressive technical discussions. I’m sure there are options out there on the Web for those who want to bash Microsoft, or dream up yet another conspiracy theory, but our goal here is to evolve and to hopefully provide information that makes it easier for people using OSS and Microsoft software in the real world.
So what’s next? Sam and Kishi have a variety of topics on deck for discussion and we’re going to be diving into more analysis as well as new profiles on other folks we find interesting at Microsoft. I’ll be blogging soon on a few of my experiences in open source software over the past twelve years, particularly looking to start conversations with you about your approaches and thoughts on these subjects. Also, I’ll talk about conversations I have with customers around the world on what interoperability issues they are interested in discussing. I’m thinking my next blog post should be about the talk I gave at Linux World Boston last week, and some of the ideas on interoperability I shared there. If you'd like to keep track of when we're adding new content to the site, please subscribe to our RSS feed on the home page.
Again, a sincere thank you and I look forward to seeing this community grow. And for the curious, my Russian is very, very rusty.
by admin on March 31, 2006 01:21am
Ben Canning, Group Product Manager from the Office team talks Watson and how this unique solution is helping improve the Windows experience.
Format: WMV Duration: 30:22
Updated: Download the transcript (PDF)
by admin on March 29, 2006 03:35am
Directory specialist Jackson Shaw from Quest Software joins Sam Ramji from the Open Source Software Lab at Microsoft, to talk Active Directory and interoperability.
Format: WMV Duration: 25:20
by admin on March 28, 2006 06:00pm
Roblimo from Slashdot warned me.
Last fall, I did an interview on Slashdot and put my email address at the end of the interview, following the statement, “If you'd like to contact me directly, I can be reached at billhilf at microsoft dot com.” Roblimo told me that I might want to rethink including my email address, and suggested a link to a Web page as a way to redirect possible spambots and general bedlam.
I didn’t get the spam (or the Microsoft Exchange spam filters are really good) but I did get some feedback. Approximately two thousand emails of feedback. The interview posted a few days before I presented at Linux World San Francisco, which meant I was getting most of the email while I was on the road and preparing to discuss Interoperability and the Open Source Software labs I run here in Redmond. Although it was somewhat of a deluge of email, the feedback was extremely valuable (thank you to all of you who wrote to me) and really helped me realize the importance of this subject. Tim O’Reilly has talked about the importance of architectures for participation. The value of building an architecture to allow participation was never more clear than reading (and responding!) to thousands of these emails. Now you may be reading this saying “Amazing, it took this guy how long to learn about blogs?” and that’s a fair criticism, and Jason Matusow and Robert Scoble have been telling me to do this for a long while now. But hopefully by the time you finish reading through this site, you’ll understand why Port 25 is somewhat more than just a place to blog.
So why is it called Port 25? Some background on port numbers first. SMTP is short for Simple Mail Transfer Protocol and is the protocol for sending email messages between servers or from a mail client to a mail server. On a server, the port for SMTP is 25. When you open a port on a server, such as to allow SMTP traffic, the server is commonly referred to as ‘listening’ on that port. Port 25, therefore, is a metaphor for how we are opening the lines of communication for a discussion around Open Source Software and Microsoft. Cute, huh? As someone who has spent many hours at the command line, debugging things such as protocol states (LISTENING?) and getting software and servers working to provide some type of service, the concept of server ports and being open is well ingrained in how I and the team here in our lab think about communications – so we thought it was applicable to how we want to start the dialogue around this subject. I guess it just took a Slashdot interview and a couple thousand emails (and consistent nudging from friends) to really drive the point home that having a participative discussion around OSS and Microsoft technologies is a good thing, not – as many people may believe – something we want to ‘hide’ or shy away from.
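The ‘listening’ idea above can be sketched in a few lines of Python. This is a toy illustration, not an SMTP implementation; it binds port 0 (letting the OS pick a free, unprivileged port) since a real mail server on port 25 would need root privileges, and the greeting string is just for show:

```python
import socket

# Create a TCP socket and put it into the LISTENING state.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS choose a free port
server.listen(1)                # the socket is now listening

host, port = server.getsockname()

# A client (a mail sender, if this were port 25) can now connect.
client = socket.create_connection((host, port))
conn, _addr = server.accept()
conn.sendall(b"220 ready\r\n")  # an SMTP greeting looks roughly like this
greeting = client.recv(64).decode().strip()
print(greeting)

client.close(); conn.close(); server.close()
```

Running `netstat -an` (or `ss -ltn` on modern Linux) while a server like this is up shows the socket in the LISTENING state, which is exactly the metaphor behind the blog's name.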
What will you find here? This will be the place we not only blog, but also where we put analysis from our OSS labs and also where we discuss and show other parts of Microsoft that we think are just plain cool or interesting. I think what you’ll see here over time is how a bunch of open source guys inside Microsoft think, as well as people and technologies inside Redmond that we think other folks like us would find interesting as well. So, there will be much more to discuss, debate and learn from together – but for now, port 25 is open.
by jcannon on March 22, 2006 09:47pm
Healthy and productive discussion only occurs when two parties are listening & responding to each other – the principal element of all communication. This is the foundation that Port 25 is built on. And everything we do will be guided by a set of 10 hard and fast rules based on that principle:
Port 25 is about having a healthy conversation with customers and the industry wherein people can talk openly and honestly about their biggest interoperability challenges, whether it is on UNIX, Linux, Windows, or among other open source packages.
Be accessible. Our door is always open and the address won’t change. We’re here for the long haul.
Be approachable. No comment goes unread & every idea (common sense required) is openly discussed. No topic is off-limits.
Be smart. Think before you post – everybody is going to have different opinions based on very different experiences. Don’t reinforce stereotypes or unhealthy perceptions that don’t move the conversation forward.
No Marketing. When we invest in Port 25, we invest to provide tools, resources & opportunities to help the customer, not to sell them.
Know the solution, not the platform. Customers buy solutions that solve business problems. They rarely look at the platform alone; we will approach problems in totality and understand objectively the “right tool for the right job.”
Change takes time. Real change takes time. Change within Microsoft and within the industry will happen with time.
Teach What You Know. A lot of IT knowledge is trapped in our heads and will never be captured in a blog, research paper or study. Take the time to commit & talk to each other when you see an opportunity to help someone solve a problem.
Learn What You Don’t. Take advantage of the community & listen to what others have to say. We live in a time of unprecedented – and growing – complexity. Every bit helps.
Grow. Our learnings will grow our skills and our understanding of what technology is capable of. This is what will help us – and our customers – work better together.