by admin on August 30, 2006 01:17pm
Kishi interviews Arne Josefsberg, General Manager of Infrastructure Services in Windows Live Operations. He is responsible for the strategy, design, and operation of the online infrastructure that forms the foundation of Microsoft's online businesses. His areas of responsibility include global data centers; network, hardware, and operating system standards; and foundational shared system-level services such as caching, load balancing, DNS, Active Directory, backup/restore, remote management access, and content replication.
During Arne’s 20 years with Microsoft, he has held technical leadership roles in a number of areas. His Microsoft career started as a Technical Manager on the team that built the Nordic subsidiary region business. Since then he has held management roles in Product Support Services leading OEM and ISV customer teams and OEM hardware compatibility certification; as well as leadership roles in Information Technology managing customer systems applications and IT infrastructure. As a member of the MSN 1.0 launch team, Arne gained early knowledge of the online services business building the dial-up access network as well as data center operations.
Prior to joining Microsoft, Arne worked in systems engineering at Intel Corporation, specializing in CPU, operating system, and software consulting for Nordic customers. Arne holds a master's degree in Physics from the Lund Institute of Technology, Sweden. Outside of work he enjoys spending time with his wife and two children, as well as photography, reading, and various forms of exercise including soccer, weight training, and cycling. Arne shares some insight into the challenges faced when managing a global infrastructure and provides some tips for success.
by jcannon on August 29, 2006 01:58pm
Last week, O'Reilly Media posted portions of two conversations that took place at the O'Reilly Radar Executive Briefing: one between Tim O'Reilly and Brian Behlendorf about lessons from Apache and CollabNet, and the other between Bill Hilf and Danese Cooper of Intel about open source at Microsoft. O'Reilly was kind enough to allow us to re-post these discussions on Port 25 - we're hoping you enjoy the lively and frank discussion as much as we did. Details below...
From O'Reilly: Distributing the Future August 21, 2006: "Open Source at Microsoft" Total running time: 33:40
Production Notes The initial montage is from Tim O'Reilly, recorded at OSCON '04 in a phone interview with Doug Kaye of IT Conversations, and used with permission. "The future is here, it's just not evenly distributed yet" is a quote from author William Gibson that Tim used with attribution.
Credits include special thanks to David Battino for composing and performing the theme music. David can be found at Batmosphere.com, and he also edits O'Reilly's Digital Audio site. David provided a lot of help and feedback getting this program launched. We used Soundtrack Pro, Bias Peak, and Audio Hijack Pro to put it together.
Daniel H. Steinberg is a developer, a longtime technical writer, and currently spends most of his time podcasting for O'Reilly.
by jcannon on August 24, 2006 06:41pm
If you've subscribed to our RSS feed in the past couple of months, and there are many of you, please update to our new home at Feedburner.
Our new RSS feed is here: http://feeds.feedburner.com/Port25/
by MichaelF on August 24, 2006 05:47pm
Alfonso Fuggetta is a Professor of Software Engineering at Politecnico di Milano in Italy, CEO of CEFRIEL, and a Faculty Associate at the University of California, Irvine's Institute for Software Research. Among his many activities, Alfonso advises European policy makers on information technology issues.
A paper written by Professor Fuggetta, "Open Standards, Open Formats, and Open Source," is widely circulated at Microsoft as a helpful guide for thinking about open source, standards, and formats.
In this interview Bryan and Professor Fuggetta discuss his views on Open Source, Standards and Formats as well as why he chose to write this valuable paper.
by MichaelF on August 23, 2006 08:52pm
My first impression about LinuxWorld 2006 was – This is BIIIIG! As I walked down the escalator in the Moscone Center in San Francisco I could see the big flashy banners, the props and the mascots from the vendors. It seemed like just another tradeshow. But how could that be, wasn’t this supposed to be “Linux”-world?
There was a kind of muffled roar coming from the exhibit hall and the speaking sessions were off to one side (in fact some of them were clear across the street in Moscone Center South). I remember thinking that I would probably learn more from hanging out with the vendors in the exhibit hall than in the sessions.
I felt a strange sense of disappointment – I guess I expected LinuxWorld to be more like OSCON, a show FOR open source proponents. Here I seemed to find more vendors selling closed source products on top of the open source platforms than I found open source products (there were quite a few of those, but not as many as I expected).
Seems like the businesses have figured out how to make money on top of Linux. The overwhelming impression I got was that this show was about “Linux /Open Source Management and Monitoring” – and there were a large number of vendors both small and big focusing on that. There was also a preponderance of hardware vendors who were selling “built for Linux” servers/blades/racks/what-have-you. I didn’t know that you could differentiate commodity hardware in so many ways!
I hate to say it, but I was disappointed in the quality of some of the talks – a talk about "Desktops in Linux" turned out to be a commercial for the Walmart distribution of a desktop Linux product. Another, on "What Open Source Really Costs," turned out to be a thinly disguised commercial for a commercial (though open source) distribution of PostgreSQL. At OSCON, the talks were at least what they claimed to be.
A session that I thoroughly enjoyed was the panel with Eric Raymond, Jon "Mad Dog" Hall, Chris DiBona, and Dirk Hohndel, moderated by Larry Augustin. The panel covered the participants' reminiscences of their experiences with Linux over the "first 15 years." With these 5 guys on the panel there were bound to be 6 divergent opinions for every question! Some of the things that stood out for me were:
Some of the keynotes were tainted (in my opinion) by the commercial interests of the presenters – why did Richard Wirt from Intel have Bill Coleman of Cassatt (and ex-chairman of BEA) deliver half his keynote?
But then again, is the commercialization of Linux such a bad thing? Will the opportunity drive more innovation and provide more incentive to the community? Or is the community going to be overshadowed by the big money vendors? (I haven’t made up my mind – please feel free to comment!)
My mood definitely improved as the conference went on – how could it not? I was talking to people who love the business of developing software, commercial or open source. There were smart and committed people who cared about developing good products – and were able to do if for love or money (or both)! I was able to learn the same things I would in a session by hanging out with the vendors!
by MichaelF on August 22, 2006 03:35pm
Just returning from LinuxWorld in San Francisco, and virtualization was once again the topic du jour. A lot of you outside the technology vendor-sphere (where we like to speak in weird acronyms and corporate buzzwords) might wonder why Microsoft and many others can't stop talking about virtualization. Go to any IT conference today and it's highly likely there will be at least some sessions, if not a bevy of keynote speeches, on the topic. These are usually accompanied by marketecture diagrams of Lego-block-like pictures showing different operating systems all running in some combination on top of a single physical server. Having once worked at IBM, I'm long familiar with the idea of virtualization, often called 'logical partitioning' in IBM mainframe speak.

However, the reason there's so much discussion around virtualization today is that it is becoming much more widely available, at a much better value, than it has been in the past. Intel and AMD have improved their microprocessors to make them virtualization-aware (in the past, virtual machine managers had to do all sorts of silliness to get around the very virtualization-unaware x86 instruction set). This has allowed virtual machine software developers to build powerful technologies, often called hypervisors, that can reside in the operating system itself, allowing for much more efficient, reliable, and seamless virtualization of one or more operating systems on top of the 'host' operating system.
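As a quick aside on those "virtualization aware" processors: on Linux, Intel's extensions show up as the vmx flag in /proc/cpuinfo and AMD's as svm. Here's a small sketch of how you could check for them – my own illustration, not from any product documentation; the sample string stands in for a real read of /proc/cpuinfo:

```python
# Sketch: check whether a Linux host's CPU advertises hardware
# virtualization extensions (Intel VT-x appears as the "vmx" flag,
# AMD-V as "svm" in /proc/cpuinfo).

def virtualization_flags(cpuinfo_text):
    """Return the set of virtualization-related CPU flags found."""
    found = set()
    for line in cpuinfo_text.splitlines():
        if line.lower().startswith("flags"):
            flags = line.split(":", 1)[1].split()
            found |= {"vmx", "svm"} & set(flags)
    return found

# Sample text standing in for a real /proc/cpuinfo read:
sample = "flags\t\t: fpu vme msr pae mce cx8 sep mtrr pge cmov vmx est tm2"
print(virtualization_flags(sample))   # -> {'vmx'}

# On a real machine you would read the file instead:
# with open("/proc/cpuinfo") as f:
#     print(virtualization_flags(f.read()))
```

If the set comes back empty, the processor (or the BIOS setting for it) doesn't expose the extensions, and a hypervisor falls back to the software tricks mentioned above.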
Cool science project or is there any real use for this stuff? Let me give you a simple example of how we’ve used this here in our Open Source Labs. We provide quite a few different types of Linux distributions of various version levels and hardware architectures for testing and analysis, probably over fifty or so all told. Typically, you would use a single server (or even a single PC) for each operating system, which would mean about fifty different machines. Each of those machines requires power, cooling, new parts, maintenance, and so on. The costs add up quickly; in some data centers I’ve seen, power and cooling can be over half of the total operations costs year over year. In our lab, we can run almost all of these Linux distributions on one server, a four-way Opteron-based HP server with eight gigs of memory and a lot of disk. This is for testing, so I wouldn’t run this many virtual guest images on anything with heavy production workloads, but you get the idea. Bottom line, I save money and time (particularly in systems management).
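To make the consolidation savings concrete, here's a back-of-the-envelope sketch. The per-server wattage and electricity rate below are made-up illustrative numbers, not measurements from our lab:

```python
# Back-of-the-envelope power cost of 50 physical test machines
# versus one virtualization host. All constants are assumptions
# for illustration only.

WATTS_PER_SERVER = 400          # assumed average draw, incl. cooling overhead
DOLLARS_PER_KWH = 0.10          # assumed electricity rate
HOURS_PER_YEAR = 24 * 365

def yearly_power_cost(num_servers):
    """Estimated yearly power cost in dollars for num_servers machines."""
    kwh = num_servers * WATTS_PER_SERVER * HOURS_PER_YEAR / 1000
    return kwh * DOLLARS_PER_KWH

before = yearly_power_cost(50)   # one box per distribution
after = yearly_power_cost(1)     # one beefy virtualization host
print(f"before: ${before:,.0f}/yr, after: ${after:,.0f}/yr")
# -> before: $17,520/yr, after: $350/yr
```

Even with these rough numbers, power alone shrinks by a factor of fifty, and that's before counting parts, rack space, and administration time.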
I’ve also spoken with customers who are using virtualization for disaster recovery and backup scenarios, new deployment scenarios where a call center or branch office can be ‘installed’ with virtualized images in a fraction of the time as traditional server installs, and scenarios where testing and quality assurance groups can do large, diverse and automated testing of hardware and software across dozens of types of operating system configurations. IDC forecasted that 45 percent of new servers purchased this year will be virtualized.
Virtualization is a critical part of the Microsoft strategy, and we have been in this business for a while with our Virtual PC and Virtual Server 2005 products. Today, Virtual Server 2005 R2 is available as a free download. We’ve also opened up the specifications of our Virtual Hard Disk (VHD) Image format with Virtual Server 2005. You can use this specification to learn how to access (read & modify) the data stored in a VPC or Virtual Server virtual hard disk. The VHD format spec is available under a royalty-free license.
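Since the VHD image format is now openly specified, here's a sketch of what reading a few footer fields can look like. This is my own illustration, not code from the spec; the layout assumed here (a 512-byte big-endian footer at the end of the file, "conectix" cookie at offset 0, current size at offset 48, disk type at offset 60) should be double-checked against the downloadable specification:

```python
import struct

# Sketch: parse a few fields from a VHD footer. Field offsets are
# assumptions based on my reading of the published VHD spec.

DISK_TYPES = {2: "fixed", 3: "dynamic", 4: "differencing"}

def parse_vhd_footer(footer):
    """Extract size and disk type from a 512-byte VHD footer."""
    assert len(footer) == 512
    if footer[0:8] != b"conectix":
        raise ValueError("not a VHD footer")
    (current_size,) = struct.unpack_from(">Q", footer, 48)  # big-endian u64
    (disk_type,) = struct.unpack_from(">I", footer, 60)     # big-endian u32
    return {"size_bytes": current_size,
            "type": DISK_TYPES.get(disk_type, "unknown")}

# Synthetic footer for illustration: a 1 GiB fixed disk.
footer = bytearray(512)
footer[0:8] = b"conectix"
struct.pack_into(">Q", footer, 48, 1 << 30)
struct.pack_into(">I", footer, 60, 2)
print(parse_vhd_footer(bytes(footer)))
# -> {'size_bytes': 1073741824, 'type': 'fixed'}
```

Against a real image file you would read the last 512 bytes rather than building a synthetic footer; the point is that the open spec makes this kind of tooling straightforward.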
We are making even larger investments with our ‘Viridian’ hypervisor and System Center Virtual Machine Manager (code named ‘Carmine’) projects. These are the names for our Windows Server Longhorn virtualization hypervisor and virtualization management product, respectively. You can download the beta of System Center Virtual Machine Manager today. From what I’ve seen thus far in the development of these products, you can expect some great software from us in this area. You may want to check out Mike Neil’s post about how we announced and demoed much of this at WinHec this year – Mike also has a link to a video of the WinHec virtualization demos from Bill Gates’ keynote.
Related to this, we recently announced an important partnership between Microsoft and XenSource. XenSource is the company behind the open source Xen project – the leading virtualization technology in Linux. Peter Levine, CEO of XenSource, discussed our partnership in his LinuxWorld San Francisco keynote. Together with XenSource we will be working on enabling great virtualization between Windows and Linux, which is significant for customers running heterogeneous environments who are looking to consolidate servers and take advantage of the new deployment scenarios – like those I described above – in the future. This work will be part of our Longhorn server plans, taking advantage of our virtualization technology, Viridian. I'm personally very excited by this partnership, and it is an indicator of how we think about our long-term product roadmaps vis-à-vis interoperability.
There is a lot happening in this area of virtualization and I think it’s one of the most important change agents in our industry. Sure there will be all sorts of hype, which is typical of where we’re at in this adoption curve, but I’ve seen how this can save money/time in my own labs and I’ve talked with customers who are finding similar advantages. Exciting times indeed. -bill
by MichaelF on August 22, 2006 03:30pm
We wanted to follow up on this post from June 14th: Shared Source Development Contest and share the results of this contest.
Results can be found here: http://www.windowsfordevices.com/news/NS8278694574.html
Congratulations to Port 25 readers: Marcelo Van Kampen and his teammates: Lucas Berinotti, Evandro Rezende and Rafael Teixeira who took third place with their Street Blog project!
From left to right: Marcelo, Lucas, Evandro (Rafael is behind the camera so you'll have to use your imagination)
Nice work guys!
by MichaelF on August 21, 2006 11:49am
We've spent several weeks going through the process of freeing the content on Port 25 so that it can be freely reused. I feel that this is very important in order for this site to be truly useful to the community.
The changes we've made (which are detailed here) are basically this:
o All code posted on Port 25 (from a legal perspective, compilation and setup instructions are considered code) can be freely reused, excerpted, modified, commercialized, or just about anything else, unless otherwise noted and we’re trying hard not to make exceptions. The specific license for this is the MS-PL, or Microsoft Permissive License, located here. It's a BSD-like license that encourages broad use, granting worldwide, non-exclusive, royalty-free copyright and patent rights.
o The articles (including video interviews and podcasts) can be copied and hosted elsewhere, printed, etc., as long as they are reproduced in whole - similar to the Creative Commons by-nc-nd license (Attribution Non-commercial No Derivatives).
We decided not to standardize all Port 25 content under a standard Creative Commons license because it would make it harder to get material up to the site - for example, interviews with and articles by third parties would have to go through a whole new level of process to get their authorization to move the content to Creative Commons. Frankly, my day job running the lab takes enough time and energy as it is without taking on additional licensing negotiations for our content.
We'll be considering licensing articles & interviews under Creative Commons licenses which would allow remixing, etc. on a case-by-case basis.
I think this is a big deal. It took us quite a while because no Microsoft site has ever done this before (as with other firsts on Port 25, such as interviewing Miguel de Icaza - thanks for your time and candor, Miguel!). But it's done, I'm happy with it, and I believe you'll find this to be a great change.
by MichaelF on August 18, 2006 11:47am
Sam interviews Ryan Waite, Group Program Manager for HPC, who was recently involved in the development and release of Compute Cluster Server. Ryan and Sam discuss how Open Source influenced CCS through the inclusion of Open Source in the product and contributions back to the community.
by MichaelF on August 17, 2006 10:00am
On July 31, 2006, Microsoft and a number of other leading technology companies – including BEA, Cisco, Sun, IBM, Intel, HP, BMC, and Dell – announced: "they have published a draft of a new specification that defines a consistent way to express how computer networks, applications, servers and other IT resources are described — or modeled — in extensible markup language (XML) so businesses can more easily manage the services that are built on these resources." The specification defines a common language for communicating information about IT services and resources.
This sounds great, but we wanted some more details regarding the scope of this draft specification, its impact on the industry, and the technical details behind it. To get those answers, Sam interviewed Praerit Garg, Senior Director of the Dynamic Systems Foundation team, to discuss the announcement and what it means to the industry and IT professionals.
You can also download the specification and schema here. The team will be taking direct feedback on the schema, as well as holding a public Feedback Review Meeting on September 12th.
by MichaelF on August 16, 2006 06:12pm
I’ve been surrounded by people who want to study us like bugs—and they intend that as a compliment.
I just got back from attending sessions of the Communication and Information Technology Section of the American Sociological Association (CITASA) at the ASA conference in Montreal. This group of researchers studies:
The social aspects of computing, the Internet, new media, computer networks, and other communication and information technologies. This includes online communities, knowledge management, the digital divide, labor markets, workplaces, and how the Internet fits into everyday life [and] the design and use of technology [including] developing and analyzing new kinds of software, and thinking about the implementation of technologies for teaching, research, and the real world.
Myself and Prof. James Witte (Clemson), Chair of CITASA
The thought of bugs struck me when I sat down to recap my experience because, I thought – as a systems-minded person myself – an individual bug is not that interesting. But put a lot of bugs together, and the humble ant, bee, or termite can, by acting in concert, create spectacular feats of engineering. Sound like building software? That's interesting.
What’s particularly important is that these practitioners focus on the criticality of social dynamics to any endeavor—the relevance to distributed, voluntary open source development is obvious, but these dynamics are important to closed-source development, diffusion and use of technology, information dissemination, bridging the “digital divide”…
I know I won't do justice to everything here and now, so suffice it to say there are folks working on things like how building your reputation on Slashdot works (got your attention?); how to predict the level of documentation developers will produce given different levels of social reinforcement for contributing code versus good documentation (a challenge as applicable to proprietary as to open source development, in my experience); and how different newsgroup communities use data to measure and control their "community health" (one group dedicated to quilting (yes, with needles and fabric) seems to be particularly aggressive about this, which goes to show that traditional geek stereotypes may be becoming victims of the ubiquity of the Internet). I'll just promise that we will work hard on bringing some of the most interesting and relevant information, and interviewees, to Port 25 over the next year.
Before I go nurse my jet lag, two quick notes:
First, since it is always nice to agree with the Boss (which holds true for both Bill Hilf and Bruce Springsteen, IMHO), I will pile on Bill's blog post about closed (or open) mindedness. If you happened to click through to CITASA's website, you might have noticed something: our lab is a sponsor, Microsoft Research (MSR) is a participant (and a long-time participant and creator of some of the leading software for conducting social network analysis, I might add) – and the site is hosted by the Clemson Linux User Group. Bernie Hogan, one of the sociologists at the conference, said it best when the group turned to discussing "moral panic" (their words) or what one might also call hysteria (my word) about MySpace and "Internet child predators" right now: researchers have a critical role to play in "explaining what's really happening." I take it as a point of pride that we share across our Port 25 team, CITASA, MSR, and (I hope) you the reader a commitment to understanding what's really true – empirically measured, tested, challenged, tire-kicked. If everyone were focused on finding the demonstrably best solution to well-articulated scenarios, the world would be, IMHO, a better place – and the perceptions Bill referred to would take care of themselves expeditiously.
The second point is that my bug metaphor, above, has a basis in published research: Valverde et al., from Universitat Pompeu Fabra, Barcelona, Spain, published "Self-organization patterns in wasp and open source communities" in IEEE Intelligent Systems, March-April 2006 (Volume 21, Issue 2). In this "comparative study of how social organization takes place in a wasp colony and OSS developer communities" they found that "both systems also define interacting agent networks with similar common features that reflect limited information sharing among agents."
And if you didn’t think that I really do care about tracking down every single bit of knowledge available to understand “what’s really going on”….I read the whole thing.
by jcannon on August 15, 2006 12:37pm
Preliminary stuff

Hank Janssen and I attended OSCON on the 27th and 28th of July. We did not attend the tutorials or the executive briefing, but we were there for two of the two and a half days the sessions were in progress. We also attended the keynotes on both days (July 27th and 28th).
As a strategy, Hank and I discussed the sessions and their subject matter, splitting up to attend different sessions in order to maximize coverage. In general I attended the “business” and “strategy” sessions and Hank attended the more technical sessions.
I’ll cover the sessions I attended. Hank can be responsible for his own thoughts!
This was the first OSCON I have attended, even though I have been to other conferences with a large open source presence, so it was very exciting for me! I'll talk about some of the sessions I attended in chronological order.
Some of the sessions I attended are not covered here, because I wasn’t impressed with them. So even in Open Source software there are some, shall we say, “imperfections”!
The sessions are hyperlinked from the "index" below; that way you can jump straight to the one you want without getting carpal tunnel syndrome from blog scrolling!
The conference was very well attended. It was clear that there were not only traditional open source "hackers" and startups (though there were many of those), but also a number of established enterprise vendors (HP, Dell, AMD, VMware). It also seemed that the startups were a lot more mature than open source startups of the past – their message was clear, but not strident.
Surprisingly, IBM pulled out of the conference at the last moment, and there was nobody (visible) from IBM there. Google was present and had a number of presenters, but had a booth only for recruiting. (They did announce their portal to SourceForge at the conference.)
There was a dearth of enterprise customers – all the people I met fell into the vendor, academic, or open source organization (Mozilla, Apache) category.
The only customer (non-software related company) with an official presence was Ticketmaster, who were recruiting for Linux admins and Open Source developers.
There was a lot of talk about Open Source as a business, a number of keynote speeches and sessions addressed this. There didn't seem to be buzz about a particular technology or company that stood out. The conference was a good place for me to gauge which products/technologies/companies were gaining momentum and get opinions through face to face contact with both users and principals. It also was a place to make an assessment about what was going well, what was not going so well and concerns of the Open Source community.
Lars Thalmann was part of a team that was acquired into MySQL from Ericsson's Business Innovation division. The product they worked on was a closed source Ericsson product called "Alzato" – a clustered database system used mostly by stock markets and telcos.
The product was successful enough that there was a 60-person team at Ericsson developing and maintaining it. MySQL acquired the entire team along with the product and renamed the product MySQL Cluster. MySQL Cluster is open source just like MySQL's other offerings.
Alzato was meant to be a high availability, high performance database with five nines (99.999%) availability, and had a parallel architecture with replication for speed and scalability. In short, this was an advanced technology project.
The architecture was changed so that Alzato's clustered storage engine is now accessed through MySQL and NDB in MySQL Cluster, rather than through SQL and NDB as before.
Lars presented the talk as 10 “shocks” that the closed source team had to go through when they found out what was different between open source and closed source.
Some of the things he said (I will not try to transcribe his entire talk here) were of this nature: "It should be possible to install software in less than 15 minutes," since "The community consists of people with little patience. You surf, you find something, you try it – if it does not work right away, you move on!" The more I listened to Lars, the more I was convinced that open source had forced developers to adopt good practices, just by the nature of the development rather than by any coercion. Microsoft also followed the same practices, at least within large development teams, but had come to those processes through painful experience!
The one thing that I learned was to make all interactions explicit - by having bug databases and forums that capture every small piece of information that might be needed. Having community coaches and documentation constantly improved by user review was another highlight.
The thing that really struck me was how strong the motivation of developers could be if they were able to directly interact with the users. Lars said - “Developers work all the time (rather than 9 to 5), being inspired by the feedback and suggestions – which makes people enthusiastic”
Apache Incubator is the incubator of projects for Apache. It takes projects and project proposals and evaluates them for suitability as open source projects under the Apache umbrella. The talk was focused on the question: what makes a project a good candidate to be open sourced through the Apache Incubator?
The thing I took away from the talk was the list of danger signs that an open source project is NOT a strong project. The list that Aaron Farr presented was:
An interesting comment was made by the presenter: "Java Enterprise space may be the best analogy to Microsoft OSS activity." I am not sure I completely followed that – any reader care to comment? The other thing that stayed with me was "One of the upsides to being IN a healthy, thriving OSS project is a renewed enthusiasm for software." They must have been talking about my job here at the Open Source Software Lab!
Neelan Choksi was President of SolarMetric, which produced an object-relational mapping engine called Kodo. Kodo was a closed source product, even before it was acquired by BEA.
BEA open sourced Kodo as OpenJPA, which includes the kernel and the J2EE EJB 3 Persistence specification implementation. The decision to make the O/R mapping engine open source was taken in February 2006 and the product was released in July (it took six months to open source the project). Explaining the business reasoning behind the decision, Neelan suggested that the reasons one would want to open source a product were:
It was interesting to see how a company that primarily relied on closed source adopted an open source strategy. Hearing it from someone who had been through it himself was refreshing (rather than from pundits who theorize on the topic without real world experience!). He also said that it was hard to get a business behind an open source strategy, because as soon as there was a priority conflict, the "old business" people would try to de-prioritize the open source projects.
The one takeaway that Neelan wanted us to have was that open source is not magic dust. To develop a product, whether you have people with those titles or not, you do need product management, QA, and (yes!) marketing.
According to him a business is a business so the success of a product will be determined by overall execution not technical excellence or the closed/open nature of the code.
This was a session with significant participation from Portland State University (PSU) and Oregon State University (OSU), but with other members from as far afield as Texas and Tennessee. Some of the attendees were lecturers, some faculty, and some were system administrators from academic institutions.
When I asked whether they were aware of the Academic program that offered Windows source code for educational purposes – most seemed to be vaguely aware of it. One of them suggested that Microsoft write a textbook that included programs and instructions on how to use the source code. There are no such textbooks available for academic institutions to use as yet.
Another had anecdotal information about the cost of training students for IT Pro roles. They said they were teaching Windows because the license for setting up a lab was cheaper from Microsoft – which also had very cheap certified "train the trainer" programs. Red Hat was almost an order of magnitude more expensive – and their train-the-trainer program required expensive yearly renewals. Besides, Windows admins were more readily employable in their local regions.
There was some discussion of using open source to teach academic lessons vs. developing open source itself as part of academic training. It seemed like opinions were divided as to the utility of each approach. Bart Massey talked about the Open Source Education Lab at PSU. Among other things he runs a course (a summer long lab) on Open Source. We are talking to Bart about seeing if any of his students are interested in working at the Open Source Software Lab. Any of you readers out there interested? Drop us an e-mail!
Karl made an argument that is interesting not just because he works for Google. His contention is that copyright is not for the benefit of the creator but for the benefit of the distributor. And given that the Internet has made cheap, ubiquitous distribution really easy, copyright has lost its utility.
An interesting tidbit he mentioned is that the theory of copyright was advanced by the Royal Stationers guild to protect its interests when censorship – which had been implemented by allowing printing only through the guild – was revoked by the English parliament. He takes the radical position that copyright should be abolished and replaced by some other fair mechanism that benefits the creators rather than the distributors. This puts even the GPL in jeopardy, because the basis of the viral nature of the GPL is copyright. Even though his ideas may seem radical, the argument that the changing nature of distribution is reshaping the landscape for businesses should be taken seriously. His site is at www.questioncopyright.com
Jorg Janke is a founder of Compiere – the premier vendor of an open source ERP product. According to Jorg, Compiere has seen over 1 million downloads since 1999 and is a top 10 most downloaded product on SourceForge. They have 250+ customers and have concentrated on product, process, and distribution. They have 70 partners who play an important part in their ecosystem.
Jorg believes that there is no “one size fits all” open source development model. He said that as a case in point SugarCRM tried Compiere’s model but evolved to other models. He also said that Compiere has been evolving their model constantly since 2002.
He suggested that there were some myths about OSS development
According to Jorg “The Basic Open Source Contract” was
I was impressed by Jorg’s grasp of the software landscape – he was no radical hacker developing in his garage, but a clear thinking businessman with a great grasp of the software business.
He said Compiere was first and foremost a product that solved the major pains of all ERP projects: installation and implementation. There were no compromises in Compiere because it was open source; it had all the features its users needed. They also had a clear strategy: Compiere made the enabling product, and Compiere’s partners sold it to users.
Another thing they had thought through was supporting one (and only one) version at any point in time. But they didn’t leave their customers high and dry: they had proven tools for migrating between versions, which made it easy for their customers to upgrade.
According to Tony, some users of open source s/w, especially companies, WANT to pay and EXPECT to pay for the use of software. Tony is the person behind OpenBRR, an organization I wrote about in my blog post “What does business readiness of software really mean?”
According to him, several business models have worked for OSS (it would take too much space to explain them all, but you should get a good idea from the companies mentioned). Let me know if you think there are other business models out there as well.
An interesting tidbit about paid “volunteers” vs. “pure” unpaid volunteers in open source came from Tony, who said: “In the top 100 OSS projects, paid volunteers FAR OUTNUMBER the pure volunteers.”
My only regret is that I couldn’t clone myself and be at multiple sessions at the same time! Wish I could have come earlier and stayed longer!
by jcannon on August 14, 2006 12:00pm
A couple weeks ago I was put in the ‘hot seat’ at the O’Reilly Radar Executive Briefing at OSCON in Portland. Danese Cooper from Intel had a lineup of questions to ask, and we had fun (really) discussing many of the issues around Microsoft and Open Source. In his blog, Tim mentioned one of my quotes from this session about dealing with ‘close mindedness’ around the issues of Microsoft and OSS, as it’s something I deal with daily. Being on the hot seat answering these types of questions and dealing with close mindedness is part of my job – I’m not complaining – but I did think it was worthwhile to expand on what I meant by this comment.
In 2 ½ years in this job, I’ve learned a lot. Maybe the most interesting lesson relates to the seemingly obvious fact that everyone has an opinion about Microsoft – good or bad, but rarely indifferent. For what it’s worth, this is not true of most companies. There are many reasons why this happens, of course, but it does introduce an opinionated, subjective element into every conversation I have.
In many ways, this perspective (and bear with me here) is not unlike how many people feel about country music. Most people have an opinion about country music – some love it, some hate it, but rarely do you find people who hear country music who don’t have an opinion one way or another. Many people hear country music at some stage in their life and make a judgment call on country music forever – this happened to me with some really old recordings from Marty Robbins I heard on my Dad’s eight track player when I was six. In a similar way, I meet people who view Microsoft through their experiences with NT 4.0 or even Windows 95 and assume that the products we have today must be the same as they experienced back then. I realize people don’t think we still sell these specific older products today, but their perception is rooted in these product experiences. Of course this happens with all sorts of things, not just music and technology, but it does build a ‘mindedness’ about the subject that is often dated and stale.
This is topical for me as I just returned from a trip to Montana where I attended my first country music concert. Up to this point, I’ve been listening to Marty Robbins-era country on occasion, mostly Johnny Cash, so my perception about country music is behind the times to say the least. Sure, I’ve heard country now and again, but not really listened to anything recently. So sitting in this concert at the Montana State Fair in Great Falls, listening to a present day country star, Trace Adkins, I realized a lot has changed. Sure, there’s still the hat, boots and giant belt buckle thing, but the music has changed – lots more pop, rock and of course more contemporary lyrics.
Back to the initial ‘close mindedness’ issue. The issue I typically face is one of perception, typically a historically based perception – perceptions of both Microsoft and open source software, both external and internal to Microsoft. Sure, I realize there’s always history – good, bad and indifferent – but in some of my conversations I hear a lot of opinion based on rather old experiences. IMHO, what really helps any conversation progress is pairing that historical experience with a complete and open-minded understanding of the present day, and then making an assessment – good, bad, or indifferent. What we’re trying to do with Port 25 is bring some contemporary insight into what Microsoft is doing in OSS. I’m also hoping it allows people to take a look at our software overall, to see what we’re building and why (if you pull the ‘port25’ off the technet.com URL you can find a load of useful Microsoft technical information, including software downloads and how-tos). We’re a commercial software company and we strive to build great products – sounds like marketing, and it is, because we’re proud of what we do. So you may still see some hats and boots here and there, but you’ll probably hear some rock and roll in the music now too. And I’m not here to sell you. You may decide you don’t like our music, and that’s fine, because what I’m hoping for is a more accurate, up-to-date perspective, so that the conversations and mindedness can strive to be more open and more productive. This is my approach both externally and internally, and toward both Microsoft software and open source software.
Changing perceptions is challenging but important. And it takes time. For me, I’m attending my next country music concert this week here in Seattle, Tim McGraw and Faith Hill. Although it’s certainly not Marty Robbins or Johnny Cash, I’m starting to appreciate the changed genre. Just don’t expect to see me wearing giant belt buckles anytime soon.
by MichaelF on August 14, 2006 11:30am
We had a large presence at OSCON because we do believe that open source as a development model is here to stay. Bill Hilf was at the conference, and Port 25 has some of the interactions he had with open source luminaries Tim O’Reilly and Matt Asay. While Bill was having these interesting conversations, we at the OSSL (Open Source Software Lab) were busy attending the talks and collecting “swag” on the exhibition floor! I do have swag from HP, Google, Intel, Dell, AMD, Oracle, ActiveState, Solid and MindTouch, but interestingly IBM was missing. Anyone out there have an IBM t-shirt to exchange for our Port 25 t-shirt? (see accompanying pictures)
The buzz in the air, appeared to me to be about open source both as and in business. The talks I gravitated towards (naturally) were about open source development practices. These ranged from taking closed source products and turning them loose as open source projects to driving pure open source development to using experts in a particular domain as contributors for a project not thought suitable for open source.
There was a common thread running through all these talks – the critical nature of development practices. No, there wasn’t anything earth-shattering – these were development practices that are accepted as “goodness”. But the forces surrounding open source development make the use of these practices almost a necessity for projects to get off the ground. This is not to say that closed source companies do not follow these practices, but due to co-location, centralized management and other circumstances that go along with commercial development, some of the practices may not be rigidly enforced, and their absence may not hurt the product as much. Open source development does not have that luxury. (I refer only to successful open source development projects, not the long tail of open source projects that are fossilized on SourceForge and other repositories.)
The practices fall into the following categories:
Consider this: you browse to an open source project that is new to you and download it (from a repository such as SourceForge or CodePlex). It doesn’t install, and it takes a lot of wrassling to run. More often than not, this first impression decides your level of participation. If you can’t try it and run it easily, there are other fish in the open source sea!
The initial install is not the only thing that has to run right. For an open source developer working on a module, adding or modifying source code, building it and running it are part of the iterative process that lets developers be productive. A system that doesn’t make its dependencies transparent, and doesn’t have a build system that includes all the necessary files (and NOT the unnecessary ones), will probably not get good developer input.
The easy-build thing has been known for a while. At Microsoft, product groups have the concept of daily builds – if you as the developer “break” the build, you don’t go home until you fix it. For this to work, each developer must be able to build the system at his or her desk easily.
Quick iterative program development in the large without hassle is the name of the game. The very nature of open source development which needs to attract developers to gain momentum leads to a focus on easy builds.
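One way to capture the fail-fast spirit of an easy build is a pre-build check that tells a new contributor exactly what is missing before anything cryptic goes wrong. A minimal sketch (the tool names are illustrative, not any particular project’s requirements):

```python
import shutil

def missing_tools(required):
    """Return the subset of required build tools not found on PATH."""
    return [tool for tool in required if shutil.which(tool) is None]

# A new contributor runs this on a fresh checkout; an empty list means go.
missing = missing_tools(["gcc", "make", "pkg-config"])
if missing:
    print("install these before building:", ", ".join(missing))
else:
    print("all build dependencies present")
```

The point is not the code but the contract: one command, a clear answer, no hidden steps between checkout and a running build.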
There’s doc (or a blog or a newsgroup) for everything
Most open source developers don’t work in the same building. I am talking about open source developers in the community, not those employed full time by commercial open source companies (though most commercial open source developers have to interact with community developers on their virtual team). This means you can’t walk to the next office and ask the developer how the API call really works!
It follows that at any time of day or night, answers to questions like “how does the API call really work?” should be available through Internet-accessible means. This could be doc, a newsgroup, a wiki, a blog or any other easily accessible repository.
Documentation by developers, you ask, isn’t that a mythical being?
That’s exactly the point – open source developers do write docs, they just don’t recognize that they are doing so. For an open source project to truly take off, educating new developers is a must – both when they are viewing code and when they are looking at documents explaining interfaces, how things work and the meaning of life! OK, maybe the last one is not strictly necessary, but it does make reading documentation much more fun. Who wouldn’t want to work on Ruby after reading “why’s (poignant) guide to ruby”?!
Every lifecycle stage and artifact is important
The way you work your way up the “committer” chain in open source projects is to prove yourself useful. The path to building credibility is to write documents, find bugs, review code and make yourself useful in a pretty stiff meritocracy. Even when a developer achieves the golden “commit” privilege, they continue to participate in those activities.
Not having departments with people exclusively devoted to test, doc or reviews makes the development of a “caste system” difficult! Development managers cannot put pressure on test managers to shortcut tests – because the development managers and test managers could be the same person!
This is a little bit more subtle than the “more eyes make all bugs shallow” argument – that is only true if those eyes don’t think of looking at bugs as work that is to be done by other eyes!
This is even truer of documents and education – it isn’t some tech writer with expertise only in writing who writes the important documentation, but the luminaries of the community. When the “Gurus” (which in Sanskrit means teacher) do what they are meant to do – then nirvana is attained (I loosely paraphrase from the Bhagavad Gita!).
Sprints not marathons
Consider having developers in the US, UK, India and Australia – when is the best time for a meeting? When it’s morning in the US, it’s night in India – and who knows what time it is in Australia? Software companies whose code is all developed by their own employees can have coordination meetings on a schedule decided by someone – not so in open source. This means that coordination can’t be complex and long drawn out.
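The timezone arithmetic is easy to see in a sketch (fixed UTC offsets are used here for simplicity; real zones shift with daylight saving):

```python
from datetime import datetime, timedelta, timezone

# Illustrative fixed offsets for the four contributor regions.
zones = {
    "US Pacific": timezone(timedelta(hours=-7)),
    "UK":         timezone(timedelta(hours=1)),
    "India":      timezone(timedelta(hours=5, minutes=30)),
    "Australia":  timezone(timedelta(hours=10)),
}

# A meeting at 9am Pacific is 16:00 UTC...
meeting_utc = datetime(2006, 8, 14, 16, 0, tzinfo=timezone.utc)
# ...which lands mid-evening in India and the small hours in Australia.
for name, tz in zones.items():
    print(f"{name}: {meeting_utc.astimezone(tz):%H:%M}")
```

There is simply no hour of the day that is humane for everyone, which is why asynchronous coordination wins.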
So open source makes use of XP principles: work on a small feature that doesn’t take more than a month (OK, so it isn’t the extreme XP where a feature shouldn’t take more than two weeks!). Priorities can be decided based on community pressures. Longer-term projects are done either by a single person or by a co-located team (at commercial open source companies, for example).
That means planning horizons are small, and mistakes can be rectified without huge loss of time. Releases happen when there is a critical set of features ready. The community is able to get their hands on new features early and give early feedback, which further cuts down the time for stable development.
Of course this means that customers are running hard just to stay in one place, if they accept all the releases! But at least they have the choice…!
Visibility into EVERYTHING
Open source is not just about the source being visible. More people do look at server-side code, but even on servers the number of code readers is small compared to the number of users.
Visibility in open source is about everything – how many and what bugs are there, time to resolve bugs, prioritization of bugs, who is contributing what, what comments were made about whose code, whose code was included and whose wasn’t etc. etc. etc.
This not only acts as a great feedback mechanism for users, it provides for real and open debate about priorities and execution. As long as the project is run on a rational basis, people can predict the state of the project. They can anticipate when a feature they want will arrive, or when a bug they have will be fixed. It also allows users to submit code to fix their own bugs and watch it go transparently through the system.
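To make this concrete, here is a sketch (the bug records are invented) of the kind of metric an open tracker lets anyone compute for themselves, such as median time-to-resolve by priority:

```python
from datetime import date
from statistics import median

# Invented records of the shape a public bug tracker exposes.
bugs = [
    {"priority": "high", "opened": date(2006, 5, 1), "closed": date(2006, 5, 4)},
    {"priority": "high", "opened": date(2006, 5, 2), "closed": date(2006, 5, 12)},
    {"priority": "low",  "opened": date(2006, 4, 1), "closed": date(2006, 6, 1)},
]

def days_to_resolve(bugs, priority):
    """Median days from open to close for bugs of the given priority."""
    spans = [(b["closed"] - b["opened"]).days
             for b in bugs if b["priority"] == priority]
    return median(spans) if spans else None

print(days_to_resolve(bugs, "high"))  # median of 3 and 10 days -> 6.5
```

In a closed shop this number lives in someone’s status report; in an open project anyone can derive it, argue about it and act on it.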
At a previous position, my full-time job was mediating features for a group of customers (numbering in the tens). Gathering and then disseminating this information in a closed source environment required the full energy of a team that was not a development team. Of course there were mistakes in information gathering and communication, since the customers only saw the project through an intermediary. Building trust with the customers took the better part of a year and resulted in a development process that was not as efficient as it could have been.
So flame wars notwithstanding, visibility into everything is an advantage for open source projects.
Community, Community, Community
In order for developers to be productive, they have to communicate with whom they want, when they want, and get back what they want. This puts a burden on the open source product’s leaders to make sure that this responsiveness is part of their community. In practice that means answering 250 e-mails a day, having IM on all the time and putting systems in place that make this possible. One open source company I know of uses categorization software just so that the appropriate person can look at an e-mail, and has fast mail systems that allow sub-second previews of the e-mails!
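The categorization step can be sketched simply (the categories, keywords and team names here are invented, not that company’s actual setup): route each incoming message to the person best placed to answer it fast.

```python
# Hypothetical keyword-to-team routing table for community mail triage.
ROUTES = {
    "build": "build-team",
    "crash": "core-team",
    "docs":  "docs-team",
}

def route(subject, default="triage"):
    """Pick the first matching team for a message subject, else a catch-all."""
    lowered = subject.lower()
    for keyword, team in ROUTES.items():
        if keyword in lowered:
            return team
    return default

print(route("Crash when opening large files"))  # -> core-team
```

Real systems use far richer classification, but even this crude version shows the goal: no message waits on a human deciding who should read it.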
What this gains the company is encapsulated in a quote from an OSCON presenter – “When we had a closed source product people worked 9 to 5, but with open source there is so much interaction with the community that our developers are strongly motivated to work on finding solutions and building features and they are much more productive!”
Actually, come to think of it, there isn’t a thing I have said here that wouldn’t work where I work! In fact these processes are already at work at Microsoft. I am not talking only about the work we are doing at CodePlex – releasing source for products such as the Power Toys for Visual Studio Collection on CodePlex, or the WiX tool on SourceForge – but also about code sharing internally within Microsoft. Since Microsoft does both platform and application development, application developers often need, and have, access to the bug databases and source code of platform-level components. There is a lot of give and take between teams within Microsoft. This visibility has also been extended to academic, government and enterprise users under license agreements with Microsoft.
by MichaelF on August 11, 2006 09:30am
In this, the last of our interviews from the LANG.NET Symposium, Sam sits down with Miguel de Icaza, VP Development Platform at Novell and co-founder of Ximian. Miguel is also responsible for starting two Open Source projects you may have heard of: GNOME and Mono.
In this interview Sam and Miguel talk about the history behind Mono, the current state of the project and Miguel's thoughts on Mono as it relates to .NET.