Posted by Ted MacLean, General Manager, Open Solutions Group
Last week, I took part in a Gartner Symposium panel discussion about the rights and responsibilities of cloud computing customers, a panel that mirrored the analyst firm’s recently published report developed with the Gartner Global IT Council. It was a great discussion about IT governance and the nuances that cloud computing brings into the client relationship. There were representatives from Accenture, Salesforce.com and a handful of enterprise companies, so it was a virtual guarantee that opinions would vary.
But one thing stood out through all the discussion: despite all the customer case studies, technical specifications and beta testing that vendors provide, there’s still a high level of uncertainty among customers. Specifically, companies want to ensure that any relationship with a cloud computing vendor is designed, first and foremost, to meet the needs of their business. They also want to make sure that any applications or data deployed on a public cloud are an extension of their IT network, rather than an “island” in the cloud. As I heard one CIO on the panel say, “The issue is to balance the compelling financial case with the risk of execution and what it means for your business.”
For all the benefits of cloud computing, business and IT executives can’t let go of the fact that they’re giving up a certain amount of control of some of their business resources. To help address these concerns, Gartner convened representatives from more than 300 enterprise companies. Their goal was to develop a bill of rights and responsibilities for cloud services that could help “standardize” what customers should and shouldn’t expect from their cloud computing vendors. The expectation is that this list will expand as cloud computing continues to develop and mature.
Microsoft was one of a handful of vendors that provided feedback for the resulting “Vendor Response to Bill of Rights” report, and we applaud efforts like Gartner’s to help customers address their concerns and fully embrace the potential of cloud computing. Finding the right balance between customer and vendor concerns will be critical to the development of cloud computing, and we look forward to continuing this dialogue with our customers. Microsoft has also worked with the Business Software Alliance to develop the BSA Cloud Computing Guiding Principles, which list factors and policies that are key to promoting cloud computing.
More and more government officials are recognizing that, for their country or region to thrive, they need to foster local innovation. And to do so, they are increasingly looking to students – especially those studying science, technology, engineering and math (STEM) – as the key to success.
Over the past four months, Imagine Cup students from across the globe who won their regional competitions have been celebrated by their government leaders for their technological feats. These leaders recognize that it’s not enough to simply hope that students study STEM fields. They acknowledge the importance of prestigious technology competitions such as the Imagine Cup in inspiring students to get excited about, and pursue, an education or career in science and technology.
With more than 325,000 students registering worldwide last year, the Imagine Cup is now the world’s premier student technology competition – challenging students to use technology to solve the world’s toughest problems. As you’ll see in the photos below, the finalists of last year’s competition have been hailed as national heroes because of their creative thinking and passion for designing solutions to solve real-world problems.
Last week, New York City Mayor Michael Bloomberg brought the point home about the importance of innovation in driving growth when he announced the city’s partnership with Microsoft to host the Imagine Cup 2011 Worldwide Finals next July.
Enjoy this celebration of the confluence of ingenuity and social consciousness!
On October 18th, two Imagine Cup 2010 finalists were invited to participate in the first annual White House Science Fair. Wilson To from the Mobilife team and Christian Hood from BeastWare had the opportunity to meet President Obama, standing among 60 students from across the nation who were recognized for their creative thinking and innovations in science, technology, engineering and math. You can read more about the White House Science Fair on the Imagine Cup blog.
Posted by Anthony Salcito, Vice President for Worldwide Education
This week I’m in Cape Town, South Africa, and lucky enough to be surrounded by some of the most innovative education leaders, teachers and administrators in the world. We’re all gathered here for the sixth annual Worldwide Innovative Education Forum (IEF), the first time the event has ever been held on African soil.
Attendees of this event include more than 500 educators, school leaders and government officials, representing over 60 countries, who continue to creatively and effectively use technology in their curricula to help improve the way students learn. This is the worldwide finale of a year’s worth of country and regional events, during which 200,000 participants were whittled down to the 125 teacher finalists presenting at IEF this week.
Posted by Dan Reed, Corporate Vice President, eXtreme Computing Group and Technology Strategy & Policy

Throughout the history of science, back to the days of the Renaissance, data has been scarce and precious. But today, riding the same technological economics that have given us inexpensive computing and ubiquitous sensors, scientists have the ability to capture data at a previously unimaginable scale. In all domains, scientists and researchers are drowning in data. They've gone from scarcity to an incredible richness, necessitating a sea change in how they extract insight from all this data.

In a parallel shift, our scientific questions and problems increasingly lie at the intersections of traditional disciplines. Consider, for example, the recent U.S. oil spill in the Gulf: understanding how oil disperses in water is a problem of computational fluid dynamics, but understanding the impact of that oil on the marine ecosystem is a biological problem. To fully understand the issue, researchers from multiple disciplines, bringing different cultures and different research tools, have to unite to build models and analyze data from diverse sources.

With this has come an insatiable demand for easy-to-use tools and computing support, which unfortunately forces many researchers to take on the additional role of systems administrator. These researchers often spend inordinate amounts of time maintaining the computing systems they require to do their research rather than devoting their time and talents to the research itself. The cost to maintain and refresh this computing infrastructure is becoming an ever-larger burden, and the economics are unsustainable. As a result, because of the power of computing for scientific discovery, much of our research funding climate has focused on repeated refreshes and deployments of infrastructure on research campuses and in laboratories.
Yet even at the best-funded research organizations, the majority of researchers do not have access to the computing resources they need.
Posted by Peter Cullen, Chief Privacy Strategist
This week, more than 400 policymakers, privacy advocates and industry representatives are converging in Israel for the 32nd International Conference of Data Protection and Privacy Commissioners.
The conference commenced this morning in Jerusalem, a city of both ancient traditions and thoroughly modern influences, and I was reminded of how that same dynamic is true of privacy in the Internet age. Yesterday marked the 30th anniversary of the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. These privacy guidelines have served as the basis for numerous privacy laws in place across the globe. Yet even these privacy principles need to keep pace with the changing information environment.

In my remarks today during a panel discussion titled “Notice and Consent: Illusion or Reality?”, I suggested that individual participation, through mechanisms such as notice and consent, remains important to safeguarding users’ privacy, but by itself does not afford enough protection. This is particularly true given the explosion of information collection and use that fuels today’s Internet economy. The same is true of the various legal frameworks that govern data collection, usage and sharing. Both are important, but neither is sufficient on its own.
Alongside individual participation and regulatory oversight, another vital aspect of privacy protection is often overlooked: the role and responsibility of the organization in maintaining and protecting personal data.
Microsoft’s view, as outlined in a new white paper released today at the conference, is that organizations’ privacy policies and data management practices most directly influence whether users’ personal information is kept safe or exposed to risk. Therefore, we believe that organizations—including Microsoft—must hold themselves accountable for acting to protect users’ interests and taking appropriate measures to safeguard privacy and personal data, even in the absence of specific regulatory mandates.