Follow Us on Twitter
by MichaelF on February 02, 2007 06:36pm
Following up on our post yesterday, here is an interview Sam did with ASP.NET Technical Evangelist Steve Marx.
Steve discusses the three components of ASP.NET AJAX and shows us a demo of the software formerly known as "Atlas" running on top of PHP on Linux, demonstrating some of the front-end and back-end abstraction capabilities.
If you are interested in looking a bit more deeply at ASP.NET AJAX, as well as the PHP support Steve released to Codeplex, here are some links he provided:
ASP.NET AJAX: http://ajax.asp.net/
Direct link to download the Microsoft AJAX Library: http://ajax.asp.net/downloads/library/default.aspx?tabid=47
PHP for Microsoft AJAX Library: http://codeplex.com/phpmsajax
Steve's Blog: http://smarx.com/
by hjanssen on February 28, 2007 06:06pm
When I started working at Microsoft in May of 2006, I wanted to chronicle my adventures here. So my first blog post went up on June 7th, 2006, titled "What is a guy like me doing in a place like this". I had every intention of writing frequently about my experiences. As you can see, I have not been very consistent with that, something I will try to improve in the future.
If you had told me a year ago that I would work at Microsoft, I would have laughed. I still walk around looking at my badge with amazement, and when I go to other MS buildings I shake my head when I have to swipe my badge on the reader. When I talk to people I continue to refer to "them" and "us" (them being Microsoft, us being the rest of the world).
I am happy to report that I remain the department's skeptic, and I intend to stay that way.
So I wanted to take this opportunity to talk about a bunch of my experiences, since a lot has happened in the last 10 months (more about this later).
First of all, contrary to what people believe, I do not know of a grand Microsoft plot to take over the world and destroy Linux and OSS. If there is such a thing, we at our level are unaware of it. And since this department is in many ways on the front lines working with OSS and Linux, I would have expected to see some evidence.
There is no helicopter pad (not on campus anyway) where Microsoft stores its black helicopters. There are no dispensers of Microsoft Kool-Aid. (They might have some dispensers in the water coolers though.) And the articles, blogs and posts that I read on what is going on here are most of the time completely off the mark.
Is Microsoft competing with Linux and OSS? You bet it is, just like every other company competes against the companies, people, and products that create similar things.
Is Microsoft working to interface better with some of the Linux and OSS products? You bet it is! We are frequently working on those things as well.
I am not being censored or restricted in any way. I actually have access to a very wide array of things. More so than I thought I would when I started.
The department has a unique position inside of Microsoft. We get to talk to and work with a very wide swath of Microsoft product lines. Just to highlight a typical week from a few weeks ago: in the same week I spoke to the Robotics guys, people from the embedded department, people from IIS, the SQL Server department, the PowerShell developers, and the CardSpace group. And this is a typical week. I am not sure how many other places in Microsoft have the same breadth.
And more and more groups are becoming aware of what we do and contact us to work with us.
Is Microsoft changing? Yeah, I think it is. In some places it is changing very fast, in other places not so much.
Yet if I look back over the last 10 months, I have seen some great changes happen. To name a few:
But we have been touching a lot of things that people would have thought unlikely a few years ago. Getting Mozilla people on site, for one. Another that would have been considered impossible is Microsoft writing plugins for Firefox. Here is a cool example: Photosynth. You can listen to my podcast in which I interview Ian Gilman, one of the Photosynth developers. Here is a link to the blog mentioned in the podcast: http://labs.live.com/photosynth/blogs/
Just think about that for a second, Microsoft writing Firefox plugins!!!
I will leave you all with a few more observations:
There seems to be a perception that we are not moving fast enough. But I believe we have been able to move at a pretty good speed! And, there are quite a few more things that we are working on that will show up in the future.
Looking back over the past 10 months, I have come to the realization that I really enjoy this job. There certainly are frustrating times; if you are on the front lines like we seem to be, you are likely to get smacked every once in a while. If you are not, then you are not doing your job. But we are seeing noticeable change on all fronts. And it is a blast to be able to work with so many groups inside and outside of Microsoft.
So I will close with the following: I am not drinking the Kool-Aid. Quite the opposite; I continue to question everything that is going on inside of Microsoft. And I will continue to be a voice for Open Source inside of Microsoft.
by MichaelF on February 08, 2007 11:00pm
We get quite a few requests to provide more information about PowerShell, so Hank and I decided to go straight to the person who just finished writing the book on PowerShell, PowerShell in Action: Bruce Payette. Bruce was one of the founders of the PowerShell team here at Microsoft and is an expert on not only PowerShell but, according to Jeff Snover, any number of popular and obscure languages.
In this interview Hank and Bruce discuss Powershell and Bruce gives us a demo. While we couldn't show everything here, if there is interest, we can go back and have Bruce show us specific scenarios and examples. Let us know if there is anything you'd like to see...
Check out the link above to see Bruce's book and read a couple of sample chapters.
by Sam Ramji on February 14, 2007 11:06am
Based on the email I received I would say that many Port 25 readers noticed my post last week on job openings in my new lab. Thank you for your positive responses (and especially the resume submissions)! Brad Cutler, my counterpart at Novell, has been overwhelmed with responses as well, so thank you on his behalf.
I’ve called this a sneak peek because there is much work ahead of us, but it’s time to talk in a little more detail about what the lab will be doing. I and my colleagues at Novell and within Microsoft have been putting in long hours for the last several weeks – nights and weekends as well – detailing the plans for our work together. As you may have seen covered in the news this week (“Microsoft and Novell Announce Collaboration for Customers”), we’ve got a solid long-term plan that covers our cooperation in the following areas:
Why are these the most important areas for us to work on?
As part of the Interoperability Customer Executive Council, I heard from the heads of IT from Goldman Sachs, UNICEF, American Express, NATO, and 25 more global organizations that server consolidation is essential in allowing them to reduce costs. In order to fully achieve server consolidation, they need to be able to move their existing workloads – both Windows and Linux – to a common set of server hardware. Without interoperable hypervisors, IT shops would be forced to support two separate sets of hardware, software, and personnel in order to consolidate their servers: one set for Windows and another for Linux. We don’t think that’s good enough.
Hypervisor interoperability is critical, but for this scenario it isn’t enough to deliver the full benefits of virtualization for an enterprise. Once the workloads are running on the same server and the same hypervisor, access control and authorization need to work consistently across the entire environment – otherwise you’re just shifting the interop problem up the stack, only to suffer later. This is where WS-Federation is essential – implementing an open specification to federate identity between existing directory servers enables you to have consistent security policies across your heterogeneous workloads. This is a continuation of work we’ve done with IBM, Apache, Ping Identity and SXIP Identity.
Operations relies on strong management tools to provide availability and reliability across a broad server environment. Ops teams typically have training on specific toolsets to monitor, administer, and manage their infrastructure. Realistically, moving Windows and Linux workloads onto the same set of servers requires that existing management tools be extended to the new environment. We believe (as do HP, IBM, BMC, CA, and many others) that WS-Management is the solution. Implementing this open specification will enable servers, applications, and services to communicate with management consoles from multiple vendors.
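To make "implementing this open specification" slightly more concrete, here is a minimal sketch (my own illustration, not from the post) of the vendor-neutral WS-Management Identify request, the "who are you?" message any conformant endpoint can answer. The element and namespace names come from the DMTF specification; the rest of the scaffolding is assumed for illustration.

```python
import xml.etree.ElementTree as ET

SOAP = "http://www.w3.org/2003/05/soap-envelope"
WSMID = "http://schemas.dmtf.org/wbem/wsman/identify/1/wsmanidentity.xsd"


def build_identify_request() -> bytes:
    """Build a minimal WS-Management Identify request.

    Any conformant WS-Management endpoint, regardless of vendor,
    responds to this same message with its product and protocol info,
    which is what lets one management console discover many servers.
    """
    ET.register_namespace("s", SOAP)
    ET.register_namespace("wsmid", WSMID)
    envelope = ET.Element(f"{{{SOAP}}}Envelope")
    ET.SubElement(envelope, f"{{{SOAP}}}Header")
    body = ET.SubElement(envelope, f"{{{SOAP}}}Body")
    # The empty Identify element is the entire payload.
    ET.SubElement(body, f"{{{WSMID}}}Identify")
    return ET.tostring(envelope, encoding="utf-8")


request = build_identify_request()
print(request.decode())
```

In practice this envelope would be POSTed over HTTP(S) to the endpoint's WS-Management listener; the sketch stops at message construction.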
I’ve had a few people approach me about this project who pontificated “If you [Microsoft] would just implement the specifications as they’re written, you wouldn’t have to do all this work!” In fact, this is an incorrect understanding of software engineering and interoperability. Making protocols truly interoperate in every realistic circumstance is one of the great challenges in engineering. In real life, you have to implement the specification correctly – and then the work begins. Were there platform-specific assumptions in the code (as basic as big-endian vs. little-endian format)? Were there parts of the spec that were subject to interpretation? Due to the extensive development and testing embedded in technologies like TCP/IP and HTTP, it’s easy to forget that it took years of work by many parties to deliver what we now take for granted.
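As a tiny sketch of the endianness pitfall mentioned above (my own illustration, not from the post): two implementations that each "follow the spec" can still disagree on the wire if the spec leaves byte order unstated.

```python
import struct

value = 0x12345678  # a 32-bit ID to put "on the wire"

# Implementation A assumes big-endian (network byte order).
wire_a = struct.pack(">I", value)
# Implementation B assumes little-endian (the native order on x86).
wire_b = struct.pack("<I", value)

print(wire_a.hex())  # 12345678
print(wire_b.hex())  # 78563412

# B reading A's bytes under its own assumption recovers the wrong value,
# even though both sides "implemented the spec correctly".
misread = struct.unpack("<I", wire_a)[0]
print(hex(misread))  # 0x78563412
```

Both programs are internally consistent; the bug only appears at the interoperability boundary, which is exactly where the lab's testing work lives.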
This work across virtualization, identity, and management is a pretty awesome undertaking, and I expect that as we continue to progress here we’ll discover new things we need to do in order to deliver interoperable computing. I’m looking forward to reporting on it here, and have submitted a presentation abstract for OSCON ’07 to walk through the Joint Interoperability Lab’s operations in detail. Hopefully I can shed a little light on what makes interoperability so challenging, even in an age of open specifications.
by MichaelF on February 01, 2007 06:34pm
I wanted to take a moment to let the Port 25 community know that Microsoft has officially released ASP.NET AJAX to the web under the Microsoft Permissive License (Ms-PL). Under this release, developers are free to modify the Microsoft AJAX Library scripts and can distribute derivative works per the terms of the Ms-PL.
As part of the release some improvements have been made:
This week we also released the ASP.NET 2.0 AJAX Extensions source code under the Microsoft Reference License (Ms-RL). This release was intended to help the community with debugging, maintenance and interoperability challenges with the additional hope that the transparency helps establish better coding patterns and guidelines.
You can find ASP.NET AJAX here.
Check back tomorrow, when we'll post an interview with Steve Marx, Technical Evangelist for ASP.NET AJAX, who talks about the release and shows us a demonstration of the technology.
by MichaelF on February 02, 2007 08:00pm
If you live on the eastside in the greater Seattle area, you may be among the million-plus people who were out of power during the massive windstorm last month. As Bill Hilf pointed out in his previous blog “what matters” (see blog), these are rare intermissions which allow us to step back and understand what sets us apart as “intelligent” beings. After reading Bill’s blog, I started pondering the role that technology plays in our daily life, and my sense is that this role is a bit exaggerated. Why, you may ask? Well, let’s see: it took close to nine days before power was fully restored for the million-plus people spread across the eastside neighborhoods. A lot of us took the time and caught up on chores we had been putting off for a while; some of us shared childhood stories around a fireplace and talked about things that sat beneath the surface, suppressed by constant engagement with a laptop or an MP3 player. Unfortunately, there were also a handful of others who didn’t make it through this trying time and died of carbon monoxide poisoning.
We talk a lot about “technology” here on Port25, and being involved with Port25 gave me a perfect opportunity to draw parallels between what I was experiencing and the “hype” about the “advancement” of technology and how it has improved the quality of life. Making a radical shift here, let’s at least agree on a simple fact: without an “evolved” attitude, these advancements cannot replace the “self introspection” we all need to do from time to time. The power outage was a perfect time for that, a time that made us all realize that there is a larger reality out there which we should never be oblivious of. We need to acknowledge that in order to fully understand and take in the benefits that technology has already given us, rather than putting our minds to “Bigger, Better, Faster, More”.
So is technology bad? NO... HELL NO!! But just because we can send text messages to others while listening to our collection of 10,000 songs on MP3 and driving an electronically controlled, fuel-injected vehicle, let’s NOT assume that playing around with a few gadgets has made us better people. Let’s examine this more closely, shall we, and try to truly comprehend the role technology played when we were all sitting without power:
1. Abstinence: No matter how juiced up my MP3 player was or how many extra battery packs I had for my laptop, by Day 3 I was staring at a cozy fireplace wondering what my life was like before High-Definition football broadcasts and movies streaming through extenders. For many of the folks who posted their comments on local newspaper and blogging sites later, this was a “revelation”. My question at this stage was “Gee... there’s very little technology in this present moment, is there?”
2. Chaos: I remember it was our second day without power, and we were driving through a major traffic intersection in Downtown Bellevue. As we drove past a gas station, we saw police cruisers on both sides of the entrance with troopers standing guard with guns. Later someone told us that fights had broken out at the gas station as people started to cut each other off, and the next thing you know, it was chaos. I was also told that a grocery store ran out of bottled water, and tempers flared between those who had bought crates of water and those who were left without any to buy. So my question again: “Would any type of technology or device have truly helped the situations here in some way?” NO, not without us taking a more civil, educated, and more evolved approach.
3. Awareness: As hundreds of thousands of people tried to cope with the loss of power in the middle of December, all sorts of stories were emerging on how people survived. My boss, for instance, was left with no choice but to cut construction-grade wood into 2x4’s for his fireplace to keep his kids warm. Others resorted to picking up downed trees and gathering them for firewood. A lot of folks flew or drove out to other towns to friends and relatives who did have power. All through this a thought kept gnawing at me repeatedly, and does so even today: “We live in one of the most technologically aware, literate, and advanced places in the US, and it takes DAYS to restore power???” With every passing hour I kept reflecting back on my days in IT Operations, where outages were measured to three decimal places. I got a chuckle out of thinking back and wondering what “metrics” could be applied to this situation; I guess the curve is really messed up now. And guess what: there was no technology anywhere in sight unless you lived in the Seattle downtown area, which wasn’t affected.
4. Fragile Infrastructure: I was also compelled to ask myself, and those who had been living here longer than me, if this is how fragile our infrastructure really is. After using my daughter’s stash of quarters from her piggy bank (no joke), I drove by a pay phone not too far from my house twice a day and checked the status with the power company to get an estimate on when we could see the lights coming back on. This is no exaggeration: no matter how many times I called, I ALWAYS got a different answer. This led me to think about technology in the context of the power company. A few random thoughts that followed were:
So did I get the answers I was looking for? Nope. Did I learn a lot of lessons on how to be prepared for an emergency? You betcha. And as for my curiosity about the role of technology in the power restoration process, no one would answer my questions, which I can understand to some degree. But I’m still curious as to what role technology adoption would have played in this scenario: whether a specific technology, adopted pre-emptively or post-incident, could have made this less cumbersome and trying for everyone. Are they using the right technology for the right job? Are they fully aware of the potential of technology and the possibilities?
In the end, I came to a simple realization which was…it's not all technology…there’s MUCH more to life and to our existence...
by anandeep on February 05, 2007 09:25pm
I am an avid reader of Joel On Software – I find his insights great and very revealing.
I was reading a recent blog post by Joel entitled “The Big Picture” in which he has this to say about open source:
“Open source doesn’t quite work like that. It’s really good at implementing copycat features, because there’s a spec to work from: the implementation you’re copying. It’s really good at Itch Scratching features. I need a command line argument for EBCDIC, so I’ll add it and send in the code. But when you have an app that doesn’t do anything yet, nobody finds it itchy. They’re not using it. So you don’t get volunteers.”
This was in the context of the review of the book “Dreaming in Code” by Scott Rosenberg. Scott was on campus at Redmond a few days ago, talking about his book. The book is a “Soul of a New Machine” type look into an open source startup that was trying to make a PIM (Personal Information Manager) code-named Chandler – and the travails that startup went through. Scott begins his introduction by saying ”… the art of creating it (software) continues to be a dark mystery”, so you can guess the whole thing didn’t go very well!
This got me thinking – what is open source really good for? Is it only good for copycat or scratch-an-itch type of software? Could there be a limit to what open source process can achieve in terms of software artifacts?
First of all – being “copycat” or “scratch-an-itch” software is not bad at all. It can be argued that Firefox falls into the former category, being based on closed source browsers, but while gaining feature parity with other browsers it added new features. I think this was to the benefit of the general public, because not only did Firefox get better; in the spirit of competition, other browsers (including Internet Explorer) got better too. As for the latter category, what good is software if it isn’t working for its users or “scratching an itch”?
I think the point Joel was trying to make was that bootstrapping an open source project requires either a user need or the need for an alternative. A community is more easily built when there is a shared need for functionality or alternatives.
But something about that bothered me. After all, isn’t Open Source all about the “love of the game”? Why wouldn’t a community want to do something that was experimental and didn’t have any immediate payoff? Coming from a university research environment, I knew there were people out there putting experimental code into open source, covering everything from robotics to the ALICE Educational Software Authoring System (I knew the person behind ALICE – Randy Pausch from Carnegie Mellon). ALICE has a pretty vibrant open source community behind it.
That said, all the top open source projects (based on a poll by O’Reilly) fall within Joel’s characterization.
So can futuristic experimental projects be developed using the open source process?
I think that the answer is yes. But these kinds of projects cannot be developed in a pure open source community process like that of Linux. An institution like a university or a company has to bring critical mass to them. The US government paid for a lot of ALICE before it could be put out there in a true community process.
BTW, I just looked at the ALICE website, and Microsoft has also supported ALICE financially. That wasn’t the case back when I was at Carnegie Mellon; I remember thinking, “How is an educational software package that is a rage with art students going to help the US defense department?” Almost all its funding at the time came from DARPA (Defense Advanced Research Projects Agency)!
There has to be a lot of money/resources/people put into a software project to bring it to a stage so that a community sees that an itch is going to be scratched, and then gets on board.
I was chatting with Hank Janssen and Kishi Malhotra about the “top” open source projects and said that the top open source project I wanted to see was a “Cloud OS”, which wasn’t yet around. I was waiting for the day when a system call made on my laptop would kernel-trap on a machine in a data center in India, without my knowing or caring to know which data center or which machine. Ruminating on this, I postulated that some of the early components are already there with the Google File System and the Google Cluster Architecture. Then I realized that even though those were Linux-based, they were by no stretch of the imagination open source!
by MichaelF on February 09, 2007 03:41pm
In response to some requests from Port 25 visitors, this week I am going to start a weekly series of posts focused on new Codeplex releases. Because we haven't done this in the past, I'm going to look back about a month on this first post but moving forward I'll provide a weekly update. These updates are primarily going to focus on releases made by Microsoft but if you own a Codeplex project and you'd like to see it listed here, please shoot me a mail: firstname.lastname@example.org.
by billhilf on February 12, 2007 10:50pm
This morning I’ve been catching up on the status of Perl 6. Years ago, Perl was something I used a lot for all sorts of tasks, including a lot of backend data wrangling for a couple of the startups I worked at. One of the benefits and banes of Perl is its tremendous flexibility. It is powerful, but given enough rope…. When I came across one of Damian Conway’s contributions to Perl 6, ‘junctions’*, I was excited but simultaneously jealous that it didn’t exist in the early 90s when I really needed it. Junctions are single scalar ‘overlay’ values that can represent multiple values at once. So if you have a long list of things (like SKU numbers, passwords, customer IDs, etc.) you can superimpose them down into a single value to do a single test against the junction to see if it equals any of them. Junctions are interesting because you can parallelize this type of test against the single value (junctions are unordered which lets the compiler evaluate the tests in parallel). At first I thought it sounded like a fancy pointer or a tied hash/tied array trick common in Perl, but junctions allow you to write in simple English things like:
if $customerid eq any(@possible_ids)
…and get a single true or false from the test in this conditional statement. Of course, there are other, more complicated ways to do this in many languages, but junctions make it much easier to program with data in Perl, and they are something I’m looking forward to experimenting with more as Perl 6 comes to life.
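As a rough analogue of the junction test above, sketched in Python rather than Perl (my own illustration, not from the post; the IDs are made up), a membership test over candidates gives the same single true/false answer, though without Perl 6's license to evaluate the comparisons in parallel:

```python
# Hypothetical data, standing in for the post's customer-ID example.
possible_ids = ["c-1001", "c-1002", "c-1003"]


def matches_any(customer_id, candidates):
    """Return a single boolean: does customer_id equal any candidate?

    Mirrors the spirit of Perl 6's `$customerid eq any(@possible_ids)`.
    Like a junction, the individual comparisons are independent of one
    another, which is what would let a compiler run them in parallel.
    """
    return any(customer_id == candidate for candidate in candidates)


print(matches_any("c-1002", possible_ids))  # True
print(matches_any("c-9999", possible_ids))  # False
```

The difference is ergonomic rather than semantic: the junction superimposes the values into one scalar you can compare directly, instead of requiring an explicit loop or helper.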
Two other interesting finds:
Some clever folks have used XNA to create a ZX Spectrum 48K emulator for Xbox 360. I’m a classic computer collector and avid gamer, so this goes on my list of things to experiment with. It is a community project up on Codeplex: http://www.codeplex.com/zx360

This is a hard one to explain: it’s a video montage about Web 2.0, done by the Digital Ethnography folks at Kansas State University. Check it out, worth the watch: The Machine is Us/ing Us

Till next time,
* Those of you who are Perl geeks may remember Damian’s Quantum::Superpositions module in Perl 5, which was an initial cut at this idea. In classic Conway style the concept is drawn from the Schrödinger's cat thought experiment in quantum physics.
by MichaelF on February 13, 2007 11:36pm
At least four academic research papers in the last 12 months have observed that IT vendors appear to have made investments in open source software in order to combine open source assets with their proprietary software portfolios or other revenue drivers, using open source to strengthen a “value chain” that might extend across other software products, hardware, and consulting services. The largest publicly stated investment in open source, by IBM, has been observed to be potentially related to its lack of a successful x86 operating system(1), its failure to write a successful web server(2), an anti-Microsoft competitive tactic(3), and a desire to position its proprietary AIX (UNIX) operating system as “the easiest and most compatible upgrade path for Linux.”(4)
Matt Asay recently blogged that “IBM has been given a lot of love for its open source support” but he seems to agree with the idea that “like any good corporate citizen” where that support starts and stops may be explained in terms of its fiduciary obligation to its shareholders. –Or, as the author of one recent study published out of Harvard Business School states, positioning open source as a “complementary” asset to existing “proprietary” assets per “the old saw of the razor/razor blade business model.”
Matt previously commented on this same Harvard study, lamenting “I am surprised at how little creativity apparently goes into thinking through complements and substitutes, and making open source bets accordingly.”
To a point, I’ll defend publicly financed corporations in general—including Microsoft, IBM, Oracle, and Novell—since their employees do indeed have a fiduciary obligation to look out for the interests of the company’s shareholders. Thus the bounds for creativity are not wide open, and the more “unusual” some sort of “bet” appears, the harder it can be to determine whether that bet is complementary to the core assets of any company. But here’s the beauty of the fact that here at Microsoft we have a different business model from IBM or Oracle or Red Hat. It means we have different opportunities to do something that makes business sense and, in this case, supports open source community-driven development.
I won’t argue whether it means we at the OSS Lab @ MS are more creative or not, but I will trumpet that fact that we have teamed up with Paula Bach, a PhD Candidate at the Computer-Supported Collaboration and Learning Lab in the Penn State Center for Human Computer Interaction and Professor John Carroll, a giant in the human-computer interaction (HCI) field, to present a special interest group (SIG) on “Usability and Free/Libre/Open Source Software SIG: HCI Expertise and Design Rationale” at CHI 2007.
Why? Usability is perceived as a challenging area for OSS development. Usability is tough for any type of development, but if you have capital to invest and the organizational capacity to apply it, there is a rich body of knowledge about a systematic research & development (R&D) process. Microsoft follows one which is, IMO, quite impressive and rigorous, and tough from a developer perspective. But if you step back from the term “open source,” you realize that any software production endeavor that is not equipped to apply those commercial best practices—like, hint, hint:
By now you may see a pattern: increasing the ability of small groups of people to collaborate to produce better applications using platform technologies like Windows, Office, and .NET—whether they consider themselves “open source” or not (like end-user programmers trying to solve a problem, or a half-dozen folks trying to run a traditional software business)—is, I would argue, uniquely consistent with Microsoft’s business model. Paula will be joining us as a lab intern, and we hope her learnings about HCI in community-driven development will lead to enhancements for Codeplex (and maybe elsewhere as well).
There’s a saying: “Only Nixon could go to China.” It refers to the idea that only a politician perceived as a hardliner (Nixon was regarded as a staunch anti-Communist) could take on tough issues that would expose someone with a less extreme reputation to politically crippling accusations of “selling out.” It was also re-purposed in Star Trek VI to explain why Captain Kirk was chosen as a key player in the Klingon-Federation peace process, which is the primary reason I had it on my mind. (I am not sure whether I should be impressed, feel validated, or be frightened that a Wikipedia entry actually documents all this.)
If any company is perceived (rightly or wrongly) as the Nixon to OSS’s China, or the Kirk to OSS’s Klingons, as of today it would seem to be Microsoft. I’m not going to China, but I am going to CHI. And I think this type of “mission” will be one step toward a uniquely and mutually valuable relationship between community-driven development projects (of all types, including open source projects) and Microsoft.
I think the community is getting creative about this opportunity (including people who deeply study OSS development like Paula and Jack, without whom this wouldn’t have happened)—but get ready, so are we.
Warp six, Mr. Sulu.
(The four papers I referenced, most of which are available online, are: (1) Samuelson, “IBM’s Pragmatic Embrace of Open Source”; (2) Mann, “The Commercialization of Open Source Software: Do Property Rights Still Matter?”; (3) Fitzgerald, “The Transformation of Open Source Software”; (4) Iansiti, “The Business of Free Software: Enterprise Incentives, Investment, and Motivation in the Open Source Community”.)
by MichaelF on February 15, 2007 06:29pm
Two updates this week but I also have some bigger news to report:
Starting February 10th, any registered user can start their own Codeplex Project! Check out the details here: http://www.codeplex.com/Wiki/View.aspx?ProjectName=CodePlex
by anandeep on February 15, 2007 08:18pm
I am writing this from the Big Apple. The Linuxworld Open Solutions Summit is in New York, running from February 13th through February 15th. I questioned the wisdom of holding a conference on the East Coast in February rather than on the milder shores of the West Coast, since I arrived in the middle of a snowstorm after a day's delay caused by flight cancellations. But once I got here, the bright lights on Broadway dissipated all doubts!
The Open Solutions Summit is a much smaller conference than the Linuxworld conference (which I covered in a previous blog). But many more IT professionals show up than vendors. This makes for a more OSCON-like ambience, except that it is IT-focused rather than developer-focused.
Microsoft held an evening mixer in the conference hotel (the Marriott Marquis) in the cool rotating restaurant on top of the building. The NYC skyline was spectacular, but what I really enjoyed were the conversations with the people. I got to chat with people such as Jeremy White of CodeWeavers. When he asked me if I knew what WINE was, I answered "Wine Is Not an Emulator?" I think Jeremy appreciated me knowing that!
Lincoln Durey from EmperorLinux was there and told me about a new tablet laptop that uses Jarnal-based handwriting recognition. He did say it wasn't as good as the Windows Tablet PC handwriting recognition! I think OneNote is the killer app for the tablet. Is there an open source application like OneNote out there? What was cool about the tablet was that it has a Wacom film on top of a regular LCD that talks to a serial port. He even had a Toughbook with the same setup.
It was a pleasure to meet Tony Luck who is a Principal Engineer at the Open Source Technology Center at Intel. I think we could identify with each other because we worked in similar organizations (OSTC / OSSL). He is a kernel maintainer - which means he is royalty as far as Open Source goes. He works on making sure that Intel chip features work with and are fully utilized by Linux in his day job. Way cool - and he's a nice guy to boot.
Another person I met was Gianluca Brigandi, who is with a small company called Novascope in Argentina. His company is building a product based on the Java Open Single Sign-On (JOSSO) standard, besides providing open source code for JOSSO. He is very concerned with interoperability and was keen to hear about Microsoft's open source efforts. I pointed him to http://www.codeplex.com/ and will probably be interacting with him further on interoperability. We spoke about everything from APIs developers can use to interact with identity standards, to open source licenses and what they mean for companies like his.
What brought home the power of conversations to me was a chat with Rob Donath from SpikeSource. He approached us with interoperability and packaging questions about Windows and open source software in a virtualization environment. And he said that he wouldn't have known whom to talk to if we hadn't been reaching out to the Open Source community through events like this and Port 25.
I love my job! :-)
by MichaelF on February 20, 2007 02:22pm
Just a quick note to let Port 25 readers know that yesterday Microsoft released Virtual PC 2007 as a free download.
With Virtual PC one can run multiple operating systems on a single physical machine, which should be compelling for developers needing to test and debug across multiple platforms. From community feedback I know a number of readers are generally interested in virtualization, and I'd be curious to hear about your experiences after you give this a try.
Ben Armstrong's Blog is a good resource for information regarding Microsoft and Virtualization (Here is some info on Linux and Virtual PC from his blog).
On an unrelated note, Mary Jo Foley wrote about Ian Murdock's visit to campus today, and I have to say I'm really looking forward to his presentation. I mentioned it before, but this is one of the things I really enjoy about this job: I get to meet all sorts of interesting and intelligent people (both in person and virtually), many of whom I'd never have expected to meet as a result of working at Microsoft. Miguel de Icaza... Ian Murdock? Who would have guessed?
by MichaelF on February 22, 2007 06:06pm
This paper covers the installation and initial configuration of PostgreSQL 8.2 on Windows, up to the point where a database is created and plpgsql is installed in it. We assume an ability to walk through the install wizard in general, though screens that require additional information will be covered (with screenshots). Important options in postgresql.conf and pg_hba.conf will be covered, as will database creation in PgAdmin III.
Attachment: postgresql on windows_final (revised).pdf
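For readers who prefer the command line over PgAdmin III, the two end states the paper describes (a new database with plpgsql installed in it) can also be reached with the 8.2-era client tools. This is a hedged sketch, not taken from the paper; the database name "mydb" and the "postgres" superuser are illustrative, and a running 8.2 server is assumed.

```shell
# Create a new database as the postgres superuser.
createdb -U postgres mydb

# Install the PL/pgSQL language into it (createlang is the
# pre-PostgreSQL-10 tool for this; on 8.2 it is the standard way).
createlang -U postgres plpgsql mydb

# Sanity check: connect and confirm the server version.
psql -U postgres -d mydb -c "SELECT version();"
```

These commands read connection settings (host, port, password prompting) from the same pg_hba.conf rules the paper discusses, so a misconfigured pg_hba.conf shows up here as an authentication failure.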
by MichaelF on February 26, 2007 06:47pm