The content of this post was based on Windows Server 2008 R2. However, the concepts remain applicable and the implementations are much the same in Windows Server 2012.
The ability to deliver a desktop with full fidelity over a network, while deploying applications on demand and with hardware independence, is an IT reality with Windows 7, Windows Server 2008 R2, and Application Virtualization (App-V), which is part of the Microsoft Desktop Optimization Pack (MDOP). This screencast highlights how these three amazing technologies work as a solution platform by demonstrating key user scenarios. Notice that to implement the VDI solution in a domain at the Windows Server 2003 functional level, one must extend the AD schema to the Windows Server 2008 level.
For more information, I have also published a number of blog posts and screencasts on Microsoft virtualization solutions including:
This exciting milestone represents the end of development and testing. Office 2010 embraces trends in computing such as social networking and is ready for the cloud from the ground up. The launch of Office 2010 and SharePoint 2010 is on May 12th. You can pre-order Office 2010 and be among the first to get the product when it ships in June. Meanwhile, register for a Launch 2010 (full-day) event or a Launch 2010 Highlight (half-day) event in a city near you. It’s an exciting time. Don’t miss it.
Office 2010 Launch Events
CT - Farmington
Thursday, May 13, 2010
DC - Washington *
Tuesday, May 18, 2010
FL - Miami
Tuesday, April 20, 2010
FL - Orlando *
Thursday, April 22, 2010
GA - Atlanta *
Thursday, May 20, 2010
MA - Boston *
Thursday, April 29, 2010
MD - Baltimore
Friday, June 04, 2010
ME - Augusta
Tuesday, May 11, 2010
Thursday, June 03, 2010
NC - Raleigh
Wednesday, June 02, 2010
NJ - Parsippany
Tuesday, June 15, 2010
NY - Hempstead
Wednesday, June 16, 2010
NY - New York City *
NY - Rochester
Thursday, May 06, 2010
PA - Philadelphia *
Thursday, May 27, 2010
PA - Pittsburgh
Wednesday, June 02, 2010
RI - Providence **
Wednesday, June 09, 2010
* Full day Launch Events
** Community Event
This is the day and now is the time. Microsoft Office 2010 officially launched today, May 12th, 2010. Looking back, it is amazing to realize how far we have come and how much impact the Office family of products and solutions has been making on our everyday lives, education, and businesses.
So begin your experience with Microsoft Office 2010, Microsoft SharePoint 2010, Microsoft Visio 2010, and Microsoft Project 2010 by test-driving the new wave of business productivity with virtual labs, videos, and free downloads.
Here I’d like to share some of my favorite features in PowerPoint 2010 to highlight what’s new in Office 2010. These are the features I now use on a daily basis to save time, carry on a productive conversation, develop better content, and deliver an effective session to my audiences. Here they are.
PowerPoint Broadcasting has to be the first one to talk about, since I use it so often. It is simple to do, and what a difference it can make. Anytime, whether in a phone conversation, an instant messaging exchange, or a presentation, a few mouse clicks let me broadcast my PowerPoint slides to facilitate a discussion. If needed, a viewer can use a cell phone as a viewing device. It makes it easier for everyone to follow along and be on the same page. This feature is a must-have for those who are mobile but still need to carry out an in-depth or lengthy conversation or presentation on the road. There are a few limitations to be aware of.
Inserting a screenshot right from PowerPoint 2010 is another great feature I can’t stop talking about. Previously I needed to jump back and forth between PowerPoint and a screen capture tool, with many rounds of copy-and-paste, to get a screenshot into a slide the way I wanted it.
Now, just bring up your browser and go to the intended URL, then in PowerPoint simply click Insert Screenshot. It really can’t be any easier. Once the image is inserted, I also hyperlink it to the URL. This way, during a presentation, I can always show the screenshot, and with internet access and time permitting, I can click the inserted screenshot and show the linked URL in real time, since the inserted image may be outdated or I may want to point my audience to a particular part of the page.
For VDI, Microsoft’s licensing model is the VDI Standard and Premium Suites. These are great offerings and simplify the process of acquiring Microsoft VDI. In my view, for many customers this is a cost-effective solution, and I encourage IT decision makers to examine and compare Microsoft’s offerings with others in the market. The VDI Suite includes not only the basic infrastructure needs but also critical management components to ensure a successful VDI deployment. Customers can, for instance, employ System Center Virtual Machine Manager and take advantage of the many integrations among the System Center family, Windows Server 2008 R2, and Windows 7. For a more comprehensive offering, the VDI Premium Suite includes additional desktop and application deployment options.
Still, many may have a fundamental question: why is a new licensing model necessary for VDI? To better understand this, the following chart details a number of Virtual Desktop Access (VDA) scenarios, and we should also look further into how software was deployed yesterday and how software can be deployed today and beyond.
Traditional software deployment is per device, which assumes that an OS, once installed, will be associated with a particular hardware device like a PC or laptop, and likewise that an application, once installed, will be associated with a particular OS instance. With VDI, however, we can now dynamically deploy and roam a personal desktop, applications, or both based on the session of an authenticated user, without needing to install and tie an instance of an OS or an application to a particular physical device or a particular OS instance. The traditional licensing model therefore does not correctly reflect the usage of licensed software in a VDI deployment. Further, a key factor for the success of any virtualization initiative is management, as explained. There are also licensing implications in including a VDI deployment in a software deployment and management infrastructure, which can and probably will further complicate the overall licensing solution. A licensing model specifically addressing VDI deployment scenarios is essential. Both the VDI Standard Suite and the VDI Premium Suite are licensed per client device that accesses the VDI environment, thereby allowing flexibility in server infrastructure design and growth.
Just got back from TechEd 2010 with a great experience. I met some old friends, made some new ones, attended some great sessions, and loosened up a little… I mean a lot, during the nights. Working hard from 8 AM to 6 PM every day attending sessions and taking notes, and taking care of business and working overtime during the nights on Bourbon Street, I was… with my fellow Evangelists, John, Bob, Blain, and Kevin, and other folks, of course. Many thanks to Kevin, who also managed to keep a video diary and share it with us in addition to everything else that was going on. Take a look, and I hope you will join us next year in Atlanta, GA.
TechEd 2010 Video Diary by Kevin Remde.
The Microsoft Assessment and Planning (MAP) Toolkit 5.0 is an agentless tool designed to simplify and streamline the IT infrastructure planning process across multiple scenarios through network-wide automated discovery and assessments. This Solution Accelerator performs an inventory of heterogeneous server environments and provides you with usage information for servers in the Core CAL Suite and SQL Server, SQL Server 2008 discovery and assessment for consolidation, Windows 2000 Server migration assessment, and a readiness assessment for the most widely used Microsoft technologies—now including Office 2010.
What's new with MAP Toolkit 5.0?
Download: Microsoft Assessment and Planning (MAP) Toolkit 5.0
View full catalog of Solution Accelerators
Like the Windows Server 2008 R2 component poster, the Hyper-V poster is a great visual reference to better understand the key features and components of Hyper-V, and Microsoft virtualization solutions in general including:
I use it a lot myself and highly recommend it.
What Are Office Web Apps
The concept of Office Web Apps is essentially your Microsoft Office in the cloud. Enterprise customers can deploy Office Web Apps in a private cloud, while for Windows Live users Microsoft makes Office Web Apps available free on the Internet. The following is a screen capture of editing a presentation with PowerPoint Web App. A quick review is also available in Office Web Apps Overview.
Office Web Apps are online companions to Word, Excel, PowerPoint, and OneNote, giving you the freedom to work on Microsoft Office documents with browsers including Internet Explorer 7 or later for Windows, Safari 4 or later for Mac, and Firefox 3.5 or later for Windows, Mac, or Linux. Office Web Apps are entirely Web-based, and there's no additional software to download or install. Office documents can be created and stored on a server supporting Office Web Apps right from the browser session, without the need for a locally installed Microsoft Office client.
Using Office Web Apps, a user can view Office documents seamlessly in the browser with great fidelity, create new Office documents, and do basic editing using the Ribbon. There are, however, some differences between the features of Office Web Apps and the Office 2010 programs. When making changes that require functions beyond what is available in an Office Web App, or simply as preferred, one can easily open and edit the document in Office installed locally on a computer, and later save it back to the server. The ability to open Office documents directly from Office Web Apps into the desktop application is available on computers running a supported browser with Microsoft Office 2003 or a later version of Office (for Windows PCs). This functionality will also be available on computers running a supported browser along with the forthcoming Office for Mac 2011.
What Is SkyDrive
SkyDrive is free, password-protected online storage from Microsoft, available with a Windows Live ID. As of July 2010, a user can store up to 25 gigabytes (GB) of files, and the upload operation accepts files up to 50 megabytes (MB) in size. A user can arrange files with folders and subfolders, and keep private files in the personal folder while placing those to be public in a shared folder. To share a folder or an individual file, a user can set permissions accordingly and then invite others via email. Shown below is one way to create Office documents in SkyDrive.
Although SkyDrive provides a location for storing files online, it is not an FTP site, nor does it work with an FTP client. Further, Microsoft may limit the number of files each user can upload to SkyDrive each month. Individuals seeking support for SkyDrive can join the conversations and look for answers in the SkyDrive Forum.
Office Web Apps, SharePoint, and SkyDrive
For enterprise customers with on-premises SharePoint installations, Office Web Apps require SharePoint Foundation 2010, which is free from Microsoft. Office Web Apps do, however, require volume licensing. Office Web Apps can deliver Word, Excel, and PowerPoint files on many devices. Supported mobile viewers for Office Web Apps on SharePoint include Internet Explorer on Windows Mobile 5/6/6.1/6.5; Safari 4 on iPhone 3G and 3GS; BlackBerry 4.x and later; Nokia S60; NetFront 3.4, 3.5, and later; Opera Mobile 8.65 and later; and Openwave 6.2, 7.0 and later. To roll out the services in an enterprise environment, TechNet has documented the specifics, including planning and deploying Office Web Apps.
For consumers, Office Web Apps are part of the Windows Live offerings. A user with a Windows Live ID can use Office Web Apps to create and upload Office documents, which are stored in SkyDrive. Supported mobile viewers for Office Web Apps on Windows Live include Safari 4 on iPhone 3G and 3GS, and Internet Explorer 7 on the upcoming Windows Phone 7. Viewing Excel files via a mobile browser is currently only available with Office Web Apps on SharePoint 2010.
Start Using Office Web Apps with SkyDrive Today
A supported browser and a Windows Live ID are all you need to create, view, edit, and share your Office documents in the cloud. Your teammates can now work with you on projects regardless of whether they have a locally installed copy of Microsoft Office.
<Next: Office Web Apps Overview>
Office Web Apps are online companions to Word, Excel, PowerPoint, and OneNote, giving you the freedom to work on Microsoft Office documents with browsers. This screencast gives a high-level overview of the requirements to deploy Office Web Apps in a SharePoint environment, what SkyDrive is, how you can experience Office Web Apps today, and more. Additional information is also available from the post, Office Web Apps with SharePoint 2010 or SkyDrive Explained.
SharePoint Foundation (SPF) 2010 is a free download from Microsoft. It is a low-cost, entry-level, Web-based collaboration solution for small organizations or departments, the underlying infrastructure for SharePoint Server 2010, and the new version of Microsoft Windows SharePoint Services (WSS) 3.0. Frequently SPF is also used as a pilot or proof of concept before an enterprise roll-out of a SharePoint solution.
Almost everything an IT pro needs to know about SPF, including requirements, what’s new, getting started, planning, deployment, and operations, is in a technical library on the web and in Help (.chm) format for download, as shown below. Similarly, technical libraries for SharePoint Server 2010 are also available on the web and in Help format. These technical libraries are must-have references and my recommended bedtime reading for IT pros serious about SharePoint.
Notice that with the downloaded Help-format version, if the text in the Help file does not appear as expected and instead "Navigation canceled," "Action canceled," or "The page cannot be displayed" is shown, please proceed with the following steps to unblock the file:
The following table highlights the minimum requirements of Windows SharePoint Services (WSS) 3.0 and SharePoint Foundation (SPF) 2010 for preliminary upgrade planning for SPF 2010. Since SPF 2010 is the underlying infrastructure for SharePoint Server (SP) 2010, the information presented here applies to upgrade planning for SP 2010 as well.
Notice that the following scenarios are not supported in upgrading to SPF2010:
Upgrade from a farm running WSS 3.0 earlier than SP2
Direct upgrade from a farm running WSS 2.0/SharePoint Server (SPS) 2003. In this case, one must go from WSS 2.0/SPS2003 to WSS 3.0/MOSS2007 before going to SPF2010/SP2010.
For production deployment, please reference the links in the Official Requirements row of the table to get the latest information.
Windows SharePoint Services 3.0

Operating system: Windows Server 2003 (Standard, Enterprise, Datacenter, and Web editions); Windows Server 2008 (as of WSS 3.0 SP1)

Database: SQL Server 2000 SP4; SQL Server 2005 SP2; SQL Server 2008 (as of WSS 3.0 SP1)

SharePoint Foundation 2010

Processor: 64-bit, four cores

RAM: 4 GB for developer or evaluation use; 8 GB for single server and multiple server farm installations for production use. For large deployments, see the "Estimate memory requirements" section in Storage and SQL Server capacity planning and configuration (SharePoint Server 2010).

Hard disk: 3 GB for installation; 80 GB for the system drive. For production use, you need additional free disk space for day-to-day operations; maintain twice as much free space as you have RAM. For more information, see Capacity management and sizing for SharePoint Server 2010.

Operating system: 64-bit edition of Windows Server 2008 Standard with SP2

Database: 64-bit SQL Server 2005 with Service Pack 3 (SP3) and Cumulative Update package 3 for SQL Server 2005 SP3; or 64-bit SQL Server 2008 with Service Pack 1 (SP1) and Cumulative Update package 2 for SQL Server 2008 SP1

Official requirements: http://technet.microsoft.com/en-us/library/cc288751(office.14).aspx (including software prerequisites)

Trial downloads: WS2008 R2 Trial, SQL2008 R2 Trial, SPF2010, SP2010 Trial
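As a quick back-of-the-envelope check of the sizing guidance above, the rule that a production server should keep at least twice as much free disk space as installed RAM can be sketched in a few lines. This is a hypothetical helper of my own for illustration, not part of any Microsoft tooling, and the figures come from the table above:

```python
# Rough sizing check for a production SPF 2010 server, based on the
# table above: 8 GB RAM minimum, 3 GB for installation, and free disk
# space of at least twice the installed RAM (all figures in GB).

def meets_production_minimums(ram_gb, free_disk_gb,
                              min_ram_gb=8, install_gb=3):
    """Return (ok, reasons) for a proposed single-server installation."""
    reasons = []
    if ram_gb < min_ram_gb:
        reasons.append(f"RAM {ram_gb} GB is below the {min_ram_gb} GB minimum")
    required_free = max(2 * ram_gb, install_gb)
    if free_disk_gb < required_free:
        reasons.append(f"free disk {free_disk_gb} GB is below {required_free} GB "
                       "(twice the installed RAM)")
    return (not reasons, reasons)

ok, reasons = meets_production_minimums(ram_gb=8, free_disk_gb=12)
print(ok, reasons)  # falls short: 12 GB free is less than 2 x 8 GB
```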
(A cross-posting from SharePoint Experts Blog)
I am very excited to announce our upcoming TechNet events for the remainder of 2010. Based on the feedback from those who attended our events in the past few quarters, we have made some changes to our deliveries. We will have two tracks delivered simultaneously on the US east coast: the ten Windows 7 Deployment events are delivered by John Baker and Blain Barton, while the eight SharePoint 2010 events are delivered by Yung Chou and Bob Hunt.
These are so-called Firestarter events, with the first session starting at 9 AM and the fourth and last finishing by 5 PM. Throughout the day we will have sessions relevant to a specific technical focus. We hope that with this format more technical depth can be delivered within a short period of time. The following is the schedule. The registration links have the location information and abstracts. Most events are delivered in Microsoft offices and seats are very limited, so register early. You have been hereby advised. :)
MVPs, user group leaders, and IT influencers relevant to each track, please let us know you are coming. We would like to know about the activities in your areas and how we may better assist you in growing the communities.
See you all at an event near your city.
BI is a concept encompassing many areas of IT, and like many other IT terms it means different things to different people. One simple definition of BI is “using analytic and visualization tools to better understand and interpret data.” Recently there have been active discussions on BI as a priority on CIOs’ lists. Interestingly, the more we talk about BI, the bigger its scope seems to become. Indeed, tools and features such as PowerPivot, Excel Services, and PerformancePoint can sound confusing and overwhelming. To better understand BI, I find a great review discussing How SharePoint 2010 brings BI to the next level, and a nicely done poster, Getting started with business intelligence in SharePoint Server 2010, both very interesting and informative.
Notice that there are three areas of BI: the individual, community, and organizational levels. SharePoint 2010 addresses these areas as a whole with various vehicles, including Excel and the PowerPivot add-in, Excel Services, PerformancePoint Services, Visio Services, and Reporting Services with Report Builder, as depicted below. It is important to keep in mind the context of the BI scenario being assessed, so that the best vehicle, namely the right tools and best-fit features, becomes evident.
(Source: Getting started with business intelligence in SharePoint Server 2010)
This is an overview of a series of articles to review the following five BI vehicles in SharePoint 2010:
I also highly recommend reviewing a great series of Office and SharePoint content published by Dan Stolts, one of my fellow Evangelists, based in the Boston area.
(A cross-posting from Microsoft SharePoint Experts Blog)
This is the second article of a series to review the following five BI vehicles in SharePoint 2010:
Excel 2010 and PowerPivot introduce a fascinating integration. PowerPivot, a data analysis tool and free add-in to Excel 2010, gives users the power to create compelling BI solutions right from an individual's desktop. With this add-in, a user can transform massive quantities of data, with significant speed, into meaningful information to facilitate decision making. Excel with PowerPivot is, in essence, a user-driven, self-service solution model with minimal infrastructure dependency, which makes it a great candidate for those who need a quick, user-driven BI solution.
PowerPivot natively supports various data stores, SQL Server Analysis Services, and data feeds, as shown above. Once in place, PowerPivot's in-memory engine is capable of processing millions of rows of data with impressive performance. At minimum, 1 GB of RAM is expected to run PowerPivot; the actual RAM needed will depend on the amount of data and the business logic that PowerPivot carries. The computing power is delivered directly within Excel with a consistent user experience. At an operational level, an Excel user can employ PowerPivot very easily with just routine Excel end-user operations, i.e. mouse clicks, cut-and-paste, etc. It is a very cost-effective way to analyze large amounts of data, achieving business insight and shortening decision cycles. There is also an integration of PowerPivot with SharePoint 2010 (see below) that scales this self-service BI model to the enterprise level.
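PowerPivot's column-store engine is proprietary, but the basic idea of one-pass, in-memory aggregation over millions of rows, rather than round-tripping to a database, can be sketched in plain Python. This is a toy illustration with made-up sales data, not PowerPivot's actual implementation:

```python
import random
from collections import defaultdict

# Toy illustration of in-memory aggregation over a million rows:
# total sales per region, computed in a single pass without a database.
random.seed(0)
regions = ["East", "West", "North", "South"]
rows = ((random.choice(regions), random.random() * 100)
        for _ in range(1_000_000))  # rows streamed through memory

totals = defaultdict(float)
count = 0
for region, amount in rows:
    totals[region] += amount
    count += 1

print(count)           # 1000000
print(sorted(totals))  # ['East', 'North', 'South', 'West']
```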
PowerPivot for SharePoint adds services and infrastructure for loading and unloading PowerPivot data. After creating a PowerPivot workbook, one can save or publish it to a SharePoint server or farm that has PowerPivot for SharePoint and Excel Services installed. This adds collaboration and document management support for a published PowerPivot workbook. In SharePoint, the PowerPivot services process the data while Excel Services renders it in a browser session. The SharePoint integration enables users to share data models and analyses, and by configuring refresh cycles, the data stays current automatically. Further, a published workbook may become the basis for Reporting Services reports created by other authorized SharePoint users, be repurposed in other PowerPivot workbooks, or be linked to from different sites, possibly in different farms. There are many interesting business scenarios and possibilities with PowerPivot. To learn more, http://powerpivot.com/ is a great resource.
Is There a Need for Processing Millions of Rows of Data?
A key delivery of PowerPivot is the ability to process millions of rows of data. The amazing capacity to sort more than 100 million rows on a desktop, delivered by Excel 2010 and PowerPivot, is shown below.
In some of my TechNet events, a few IT professionals nevertheless told me they would never need to process millions of rows of data in Excel. I was not surprised by this response, and in a way it was very true... until the introduction of PowerPivot. In my view, the capacity and performance limitations that existed in hardware and software made it impractical to process extremely massive amounts of data in a desktop environment; it was not because there was no need for processing very large amounts of data. Companies would spend a lot of time and money contracting the work out or having a developer team develop, produce, and maintain reports for making business decisions. I can vividly remember that in the mid-90s, while working as a consultant, many of my engagements were to fix the business logic and improve the performance of COBOL reports based on large amounts of data. When it comes to statistical models, demographics, trend analysis, optimization, information portals, etc., there is never too much data, and the demands have always been there. The difference is that reports that used to take a team of specialists and operators hours to implement and much CPU time to generate are now available in seconds at the fingertips of an information worker running a Windows 7 desktop and Excel 2010 with PowerPivot.
And just as many of us once argued that 1024x768 resolution would be more than enough for word processing and email routines, today few work with a low-resolution screen anymore. Not too long ago, I thought instant messaging was counterproductive, while today it is a necessity for me to be productive and take care of business every day. So, does everyone need the ability to process millions of rows of data? Maybe not, not yet. I do, however, believe that as PowerPivot becomes a standard add-in for Excel 2010, businesses will soon expect the ability to process extremely large amounts of data from multiple data stores to be readily available on a PC desktop. The question is not, and will not be, whether data can be analyzed, but what to analyze and how good the analytical model is. Above all, PowerPivot is for:
The empowerment of a self-service model to derive business intelligence right on the desktop, together with the immense capacity offered by PowerPivot as a free add-in, makes this solution a must-have for conducting data analysis.
This is the third article of a series to review the following five BI vehicles in SharePoint 2010:
First introduced in Microsoft Office SharePoint Server 2007, Excel Services provides server-side calculations and browser-based rendering of Excel workbooks. On the right is the architectural concept of Excel Services. At the core is Excel Calculation Services (ECS), the calculation engine. Excel Web Access (EWA) is a web part which displays and interacts with a workbook. Access to methods and objects is through APIs provided by Excel Web Services (EWS), hosted in SharePoint Services.
Excel Services allows a user to publish a workbook or selected spreadsheet cells as a webpage. Because the content is published without exposing the underlying business logic, intellectual property is protected and applied in a standardized, consistent fashion. The motivation is to publish "one version of the truth," such that users always view a consistent set of values if published as read-only, and results derived from business logic that is consistently defined. In a large organization, consistency and synchronicity are key productivity enablers, which is what many SharePoint features are about. Both Excel 2010 and Excel 2007 can publish an Excel workbook to a SharePoint site.
By naming selected cells in an Excel workbook, an author can indirectly let a user change the cell values and apply them as parameters of an analytical model. For example, as shown on the left, a user provides the interest rate, loan period, and loan value in the Parameters Task Pane to calculate a monthly mortgage payment. As any of the three parameters varies, the derived monthly mortgage payment changes accordingly. Since the business logic, i.e. the formulas embedded in these cells, is not exposed, Excel Services can display the results with the business logic implemented in a consistent and protected way. In this example, the mortgage calculation happens to be a well-known formula and the protection may appear trivial. In a production application, however, this may be a work order estimate or a marketing program discount rate calculator. In such cases, Excel Services can not only protect the underlying business logic, perhaps based on proprietary knowledge, but also ensure the logic is applied in a consistent and predictable fashion.
In other words, in addition to publishing one version of the truth as read-only data like KPIs, charts, and tables, Excel Services can also allow a user to enter values as parameters to a protected analytical model and carry out what-if analysis.
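For reference, the monthly payment in the mortgage example above follows the standard fixed-rate amortization formula, essentially what Excel exposes as its PMT function (apart from sign conventions). Here is a sketch in Python with illustrative figures of my own:

```python
def monthly_payment(principal, annual_rate, years):
    """Fixed-rate mortgage payment from principal, annual rate, and term."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # number of monthly payments
    if r == 0:
        return principal / n
    return principal * r / (1 - (1 + r) ** -n)

# What-if analysis: vary the interest rate, keep principal and term fixed.
for rate in (0.04, 0.05, 0.06):
    print(f"{rate:.0%}: {monthly_payment(200_000, rate, 30):.2f}")
```

In an Excel Services deployment, the user changes only the parameter cells; the formula itself stays on the server, unexposed.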
This is the fourth article of a series to review the following five BI vehicles in SharePoint 2010:
A picture is worth a thousand words, and this could not be more applicable to what Visio Services can deliver. A feature of SharePoint 2010, Visio Services enables data-bound Visio drawings to be viewed in a web browser. This feature is for sharing Visio drawings and letting authorized users view Visio diagrams in a SharePoint library without having Visio or the Visio Viewer installed on their local computers. Visio Services can also refresh data and recalculate the visuals of a data-connected Visio drawing hosted on a SharePoint 2010 site, so a user will always see the latest information in visual form. For instance, a complex manufacturing supply chain can be presented with clarity, simplicity, and up-to-date status with Visio Services, as shown below. A Visio Services overview is a good starting point for better understanding this feature, and the installation and administration of Visio Services are very easy to follow.
Visio Services can display Visio drawings using a Web Part without Microsoft Visio 2010 installed locally on the client computer. However, Visio Services is not for creating or editing Visio diagrams. To create, edit, and publish diagrams to Visio Services, an author must have Microsoft Visio Professional 2010 or Microsoft Visio Premium 2010 installed locally.
Available only with SharePoint Server 2010 Enterprise Client Access License (ECAL), Visio Services must be deployed, provisioned, and enabled before first use. In addition, one must have Microsoft Visio Professional 2010 or Microsoft Visio Premium 2010 in order to save diagrams to SharePoint as Web drawings.
To view a Visio drawing based on a SharePoint list or an Excel workbook connected through Excel Services, a user must be authenticated and authorized by the SharePoint 2010 site hosting the content. Three authentication methods are supported:
While developing an enterprise service architecture, planning for services that access external data sources is not something to overlook. For a service application such as one of the following that uses a delegated Windows identity to access an external source, the external data source must reside within the same domain as the SharePoint 2010 farm where the service application is located, or the service application must be configured to use the Secure Store Service.
Namely, delegation of a Windows identity, the Windows domain, and the Secure Store Service are a few things to keep in mind if a service application accesses a data store beyond the SharePoint farm where it is running. In other words, do the right thing and plan your Visio Services deployment.
This is the fifth article of a series to review the following five BI vehicles in SharePoint 2010:
PerformancePoint Server was a separate product. Now included in SharePoint 2010, PerformancePoint becomes a set of services configured as a service application, and surfaces in a web part page with Key Performance Indicators (KPIs), scorecards, analytic charts and grids, reports, filters, dashboards, etc. Each of these components interacts with a server component handling data connectivity and security. This integration with SharePoint 2010 brings opportunities to better analyze data at various levels, while the SharePoint security and repository framework provides consistency, scalability, collaboration, backup and recovery, and disaster recovery capabilities. One very interesting analytics tool in PerformancePoint is the Decomposition Tree, which enables a user to navigate through massive amounts of data in a visual and intuitive way to decompose, surface, and rank data based on selected criteria. The user experience is shown below.
PerformancePoint is installed by default in SharePoint 2010. It can be easily configured as a service application in Central Administration and deployed in a SharePoint farm, as shown below. Overall, this integration makes Business Intelligence much more approachable in terms of system integration and administration. The PerformancePoint planning and administration documentation, the developer and IT pro centers, and the MSDN blog are good resources for more information.
This is the sixth and last article of a series to review the following five BI vehicles in SharePoint 2010:
Business reports back in the mainframe and early PC days used to be tedious to generate, hard to read, and painful to share. The administration and skills needed to organize, develop, and distribute data and reports were not trivial. I can still remember my consultant days working on JCL and COBOL to customize business reports in various mainframe shops. Today, with some key integrations and tools, it is much easier to generate reports using web services and a report builder.
In SharePoint 2010, a report server can be configured as part of a SharePoint deployment. The integration is provided through SQL Server and the Reporting Services Add-in for SharePoint Products. This integration provides benefits in storage, security, and document access. Once configured, opening a report in SharePoint establishes, behind the scenes, a session with the associated Report Server, which retrieves and processes the data and then displays the results in the Report Viewer Web Part in SharePoint. Essentially, reporting services can now be consumed directly from SharePoint document libraries with SharePoint content management and security models. The following depicts the architecture and the steps to enable this integration:
In addition, SQL Server Reporting Services is also integrated with Report Builder 3.0, a feature-rich report authoring tool for end users. Sparklines, data bars, maps, and indicators are some of the new features that enhance data visualization of KPIs in a report. For those who would like to learn more, there is much information readily available for mastering Report Builder 3.0.
One way to describe cloud computing is based on its service delivery models. There are three, namely Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and Infrastructure-as-a-Service (IaaS), and depending on the model, a subscriber and a service provider hold different roles and responsibilities in completing a service delivery. Details of SaaS, PaaS, and IaaS are readily available and are not repeated here. Instead, a schematic is shown below highlighting the various functional components exposed in the three service delivery models compared with those managed in an on-premises deployment.
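As a rough sketch of that comparison, the split of responsibilities can be expressed as data. The layer names and groupings below are illustrative, not an official taxonomy:

```python
# Hypothetical sketch: which stack layers a subscriber manages under each
# service delivery model, versus a traditional on-premises deployment.
LAYERS = ["applications", "data", "runtime", "middleware",
          "os", "virtualization", "servers", "storage", "networking"]

# For each model, the layers the SUBSCRIBER manages; the provider
# manages the rest. (On-premises: the organization manages everything.)
SUBSCRIBER_MANAGED = {
    "on-premises": set(LAYERS),
    "iaas": {"applications", "data", "runtime", "middleware", "os"},
    "paas": {"applications", "data"},
    "saas": set(),  # the entire delivery comes from the provider
}

def provider_managed(model):
    """Layers the service provider is responsible for under a model."""
    return set(LAYERS) - SUBSCRIBER_MANAGED[model]
```

Reading the data this way makes the trade-off explicit: moving from IaaS to PaaS to SaaS shifts more layers to the provider, and with them, control.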
Essentially, cloud computing presents a separation of a subscriber’s roles and responsibilities from those of a service provider. And by subscribing to a particular service delivery model, a subscriber implicitly agrees to relinquish a certain level of access to and control over resources. In SaaS, the entire delivery is provided by a service provider through the cloud. The benefit to a subscriber is that there is essentially no maintenance needed other than the credentials to access the application, i.e. the software. At the same time, SaaS also means a subscriber has little control over how the computing environment is configured and administered outside of the subscribed application. This is the user experience of, for example, some email offerings or weather reports on the Internet.
In PaaS, the offering is basically the middleware, where APIs are exposed, service logic is derived, data is manipulated, and transactions are formed. It is where most of the magic happens. A subscriber in this model can develop and deploy applications with much control over the applied intellectual property.
Of the three models, IaaS provides the most manageability to a subscriber. From the OS and runtime environment to data and applications, everything is managed and configurable. This model presents opportunities for customizing operating procedures, with the ability to provision IT infrastructure on demand, delivered by virtual machines in the cloud.
An important take-away is that we must recognize and stay mindful of the limitations of each service delivery model when assessing cloud computing. When a particular function or capability like security, traceability, or accountability is needed yet not provided with a subscribed service, a subscriber needs to negotiate with the service provider and put specifics in a service level agreement. A lack of understanding of the separation of responsibilities, in my view, frequently results in false expectations of what cloud computing can or cannot deliver.
On December 2, 2010, Microsoft announced that its cloud infrastructure (data centers) had received a Federal Information Security Management Act of 2002 (FISMA) Authorization to Operate (ATO). This ATO was issued to Microsoft’s Global Foundation Services organization, which provides a trustworthy foundation for the company's cloud services, including Exchange Online and SharePoint Online, which are currently in the FISMA certification and accreditation process. This ATO represents the government’s reliance on Microsoft’s security processes in compliance with FISMA requirements.
For all the IT pros out there, this is a great reference to keep. I have downloaded it, installed it on all my machines, and made it readily available whether I am in the office or on the road.
Although mainly developer-focused, this nicely packaged content explains Microsoft’s Platform-as-a-Service (PaaS) solution well, with labs, samples, presentations, videos, and demos. And there are core scenarios that IT pros should be familiar with in developing and deploying cloud applications, in order to successfully assess the pros and cons of running IT on-premises versus in the cloud. Make no mistake about it: cloud is here. In my view, understanding Windows Azure and its services is not just about learning a different technology. It is about staying in the game and taking advantage of the opportunity, or becoming obsolete sooner than expected and worrying about losing your job. The more you ramp up your skill set with cloud, the clearer and bluer the sky you will get. That is what has been happening to me.
YES, the wait is over. SBS 2011 (which is based on Windows Server 2008 R2 technologies) was released to manufacturing today. Around mid-January, a trial version will be available for download at the SBS web site. Here’s some additional information:
The series focusing on cloud essentials for IT professionals includes:
Cloud computing, or simply cloud, is changing how IT delivers services and how a user can access computing resources at work, from home, and on the go. Cloud enables IT to respond to business opportunities with on-demand deliveries that are cost-effective and agile in the long run. Much of what is happening in enterprise IT now is a journey to transform the existing IT establishment into a cloud-friendly, cloud-ready, cloud-enabled environment. To start off, there are key concepts we, as IT pros, must grasp to fully appreciate the transformation going on now and going forward.
What Is a Service
In the context of IT, “service” is a term frequently used to describe a form of delivery or availability. On a Windows machine, for example, core services that authenticate users and process commands automatically start and run behind the scenes to provide essential functions for running a desktop session. In the context of cloud computing, I simply explain a service as something delivered “on demand.” Namely, a computing resource delivered as a “service” is available on demand to an authorized user. In cloud computing specifically, “on demand” also carries additional connotations.
On-demand in the context of cloud computing suggests that how a resource is made available is transparent and not a concern of a subscriber. It implies computing capacities can be adjusted dynamically according to demand. In other words, a subscriber can increase capacities as needed and decrease them when they are no longer required. On-demand also means there is a business model in place to support “pay as you go” and “pay according to how much you have consumed.” In a production environment, there may be administrative as well as operational constraints on how much and how fast a subscriber can change resource allocations. This can and should be negotiated and stated in a service level agreement between a subscriber and a service provider. Conceptually, a service delivered through the cloud is a set of computing resources that are available, scalable, and consumable based upon demand. On-demand essentially conveys the characteristics of cloud.
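As a minimal illustration of “pay as you go,” cost simply follows consumption while capacity scales up and down. The hourly rate and instance counts below are made up for the sketch:

```python
# Illustrative sketch of "pay as you go": a subscriber pays only for
# what was consumed. The rate here is hypothetical, not a real price.
HOURLY_RATE = 0.12          # assumed price per instance-hour

def monthly_cost(instance_hours):
    """Cost is proportional to actual consumption."""
    return instance_hours * HOURLY_RATE

# A subscriber runs 2 instances for 27 quiet days, scales to 10
# instances for a 3-day peak, then scales back down. The bill
# reflects only the hours actually used.
baseline = 2 * 24 * 27      # instance-hours during normal load
peak     = 10 * 24 * 3      # instance-hours during the spike
total    = monthly_cost(baseline + peak)
```

Contrast this with an on-premises deployment, which would have to be provisioned for the 10-instance peak all month long.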
Characteristics of Cloud Computing
Cloud, like many IT terms such as database, networking, security, collaboration, portal, and workspace, too often means different things to different people. Is accessing your company’s application via the Internet cloud computing? Is employing a VPN to authenticate into your private network a private cloud? Is remote access considered some form of cloud computing? These questions may seem trivial, yet they are fundamental to precluding ambiguity, uncertainty, and uneasiness as we face changes and transition from infrastructure-focused deployments to service-centric, i.e. cloud, deliveries. For technical professionals, cloud may mean utility computing, high-speed grids, virtualization, automatic configuration and deployment, on-demand and remote processing, or combinations of them. For non-technical users, cloud is simply the Internet, a cable from a service provider, or just something out there networked with their computers. Whether public, private, or in between, the conventional wisdom, as published in The NIST Definition of Cloud Computing, assumes noticeable characteristics regarding how computing resources are made available in the cloud, including:
And realize that, based upon the delivery model, these characteristics translate into different user experiences. For instance, on-demand self-service may imply the ability to acquire an account and create a user profile, as in SaaS; to code and publish an application, in PaaS; or to configure and deploy a virtual machine, in IaaS. This may not be as apparent without a clear understanding of how services are deployed and delivered in the cloud.
[To Part 1, 2, 3, 4, 5, 6]
In Part 1, I talked about what “service” means in the context of cloud computing. Cloud is all about delivering services, i.e. making resources available on demand based on needs, paid by use, and with the characteristics of ubiquitous network access, resource pooling, and so on. Still, we need to clearly define what cloud is. Without a common definition for a subject as broad as cloud computing, it is hard to navigate through the overwhelming business and technical complexities. So here’s the million-dollar question.
What Is Cloud
It is important to understand that there are service delivery models and deployment models, and both are needed to fully describe what cloud is. There are three ways to deliver services via the cloud.
Software-as-a-Service, or SaaS, is a model where an application is available on demand. It is the most common form of cloud computing delivered today. Microsoft Office 365, including Exchange Online, SharePoint Online, Lync Online, and the latest version of the Microsoft Office Professional Plus suite, is a SaaS offering for businesses.
Platform-as-a-Service, or PaaS, is a platform available on demand for the development, testing, deployment, and ongoing maintenance of applications, without the cost of buying the underlying infrastructure and software environments. Windows Azure Platform is a cloud computing platform on which Microsoft’s internal IT (MSIT) organization quickly built and deployed the Social eXperience Platform (SXP) to enable social media capabilities across Microsoft.com, as documented.
On deployment, there are two base models. A public cloud is cloud computing made available through the Internet to the general public or targeted users, and is owned by an organization offering cloud services. Examples are Microsoft Windows Live, a free public cloud offering for consumers, and Microsoft Office 365 for businesses. A private cloud, on the other hand, is a cloud available solely to an organization, regardless of whether the cloud capabilities are managed by the organization or a third party, and whether it exists on premises or off premises. Based on the two models, some derive additional models, like hybrid cloud and community cloud, to highlight the implementation or intended audiences. For a private cloud, two service delivery models, PaaS and IaaS, are applicable, since in a private setting one cannot deliver SaaS without having PaaS in place. Notice that Hyper-V Cloud is a solution for building a private cloud: a set of initiatives, guidelines, and offerings to help enterprises deliver IaaS in a managed environment. The above-mentioned delivery models are also significant because, once a model is selected to fulfill business objectives, responsibilities are implicitly agreed upon and accepted by the party hosting the cloud facility and the party subscribing to the services.
Separation of Responsibilities
An important attribute of cloud computing is the separation of a subscriber’s responsibilities from those of a service provider. And by subscribing to a particular service delivery model, a subscriber in essence agrees to relinquish a certain level of access to and control over resources managed by the service provider. As I have discussed in Cloud Computing Primer for IT Pros, we must recognize and stay mindful of the limitations of each service delivery model when assessing cloud. When a particular function or capability like security, traceability, or accountability is needed yet not provided with an intended delivery model, a subscriber needs to either negotiate with the service provider and put specifics in a service level agreement, or employ a different delivery model such that the desired function becomes available. A lack of understanding of the separation of responsibilities, in my view, frequently results in false expectations of what cloud computing can or cannot deliver.
In Part 2, I basically said cloud is meant to provide “Business as a Service,” i.e. making a targeted business available on demand. In digital commerce, much of a business is enabled by IT. Therefore, cloud in essence delivers “IT as a Service,” or IT available on demand, i.e. anytime, anywhere, on any device. This is what we want IT to become via cloud. Realize that “on-demand” in the context of cloud computing also implies a set of attributes, as described in Part 1, including ubiquitous network access, resource pooling, pay per use, and so on.
Nonetheless, IT is not about implementing technologies, which are a means and not the end. All the infrastructure, servers, desktops, SaaS/PaaS/IaaS, public cloud, private cloud, and so on are about one thing and one thing only: providing authorized users with “applications,” with which transactions are made and business is carried out. Whether in the cloud or on-premises, it is about applications. So, how is a cloud application different from a traditional one? And if it is different, in what ways, as far as IT pros are concerned?
Traditional Computing Model
A typical 3-tier application includes a front-end, a middle tier, and a back-end. For a web application, the front-end is a web site which presents the application. The middle tier holds the business logic while connecting to a back-end where the data are stored. And along the data path, load balancers (LB) are put in place to optimize performance, and clusters are constructed for high availability. This analytical model is well understood and well modeled, and the 3-tier architecture represents a mainstream design pattern for applications developed prior to the emerging cloud era. The concept is illustrated below, and some may find similarities to the ideas applicable to architecting a cloud application.
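The 3-tier request path can be sketched in a few lines. Everything here is illustrative: a dictionary stands in for the back-end data store, and the function and item names are made up:

```python
# Minimal sketch of the classic 3-tier request path: the front-end
# presents, the middle tier applies business logic, the back-end holds data.

back_end = {"sku-42": {"name": "widget", "price": 9.99}}   # data store

def middle_tier(sku):
    """Business logic: look up an item and apply a 10% promotion."""
    item = back_end[sku]
    return {"name": item["name"], "price": round(item["price"] * 0.9, 2)}

def front_end(sku):
    """Presentation: render the middle-tier result for the user."""
    item = middle_tier(sku)
    return f"{item['name']}: ${item['price']}"
```

In a production deployment, load balancers would sit in front of the front-end and middle tier, and the back-end would be a clustered database rather than an in-process dictionary.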
Cloud Computing Model
Microsoft Windows Azure abstracts hardware through virtualization and provides on-demand, cloud-based computing, where the cloud is a set of interconnected computing resources located in one or more data centers. Generally speaking, like a 3-tier design, there are three key architectural components of a cloud application based on Windows Azure: Compute, Storage, and the Fabric Controller, as shown below. In this model, Compute is the ability to execute code, i.e. run applications. Storage is where the data reside. In Windows Azure, Compute and Storage are defined with Roles and offered as system services. A Role has configuration files that specify how a component runs in the execution environment. The Fabric Controller is a subsystem which monitors and makes decisions on what, when, and where to run so as to optimize a cloud application. I will talk more about the Fabric Controller in Part 4 of this series; meanwhile, let’s examine the Compute and Storage components.
Specifically, in the Compute service there are Web Role, Worker Role, and VM Role. Web Role, implemented with IIS running in a virtual machine, accepts HTTP and HTTPS requests from public endpoints. And in Windows Azure, all public endpoints are automatically load balanced. Worker Role, on the other hand, does not employ IIS; it is an executable for computation and data management and functions like a background job that accepts requests and performs tasks. For example, a Worker Role can be used to install a user-specified web server or to host a database as needed.
Roles communicate by passing messages through queues or sockets. The number of instances of an employed Role is determined by an application's configuration, and each Role instance is assigned by Windows Azure to a unique Windows Server virtual machine. An application of the Windows Azure computing model for a real-life shopping list application is shown below. The actual development process and considerations certainly involve much more, as discussed elsewhere.
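The Web Role / Worker Role pattern of passing messages through a queue can be sketched as follows. Python's in-process `queue.Queue` stands in for Windows Azure queue storage, and the function names are illustrative:

```python
# Sketch of the Web Role / Worker Role pattern: the front-end accepts a
# request and drops a message on a queue; a background worker drains the
# queue and performs each task. queue.Queue stands in for Azure queues.
import queue

work_queue = queue.Queue()
results = {}

def web_role(request_id, payload):
    """Front-end: accept a request and enqueue it for processing."""
    work_queue.put((request_id, payload))
    return "accepted"

def worker_role():
    """Background worker: drain the queue and perform each task."""
    while not work_queue.empty():
        request_id, payload = work_queue.get()
        results[request_id] = payload.upper()   # stand-in for real work
```

Because the roles only share the queue, each side can be scaled out independently, which is exactly why Windows Azure ties instance counts to configuration rather than to code.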
On the other hand, VM Role is a virtual machine. A developer can employ a VM Role (namely, upload an OS image in VHD) to run Windows services, schedule tasks, and customize the runtime environment of a Windows Azure application. This VHD is created using an on-premises Windows Server machine, then uploaded to Windows Azure. Once it’s stored in the cloud, the VHD can be loaded on demand into a VM Role and executed. Customers can, and need to, configure and maintain the OS in the VM Role. The following outlines the methodology.
Do keep in mind that VM Role is, however, stateless. Specifically, VM Role is designed to facilitate deploying a Windows Azure application which may require a long, fragile, or non-scriptable (i.e. cannot-be-automated) installation. This role is especially suited for migrating existing on-premises applications to run as hosted services in Windows Azure. There are an overview and step-by-step instructions readily available detailing how to successfully deploy a Windows Azure VM Role.
The other component in a cloud application is Windows Azure Storage services with five types of storage including:
And within a Compute node, there are two types:
There are tools to facilitate managing Storage instances. A graphical UI like Azure Storage Explorer can make managing and viewing stored data a productive experience. Notice that the above-mentioned storage types are, however, not relational databases, upon which many applications are nowadays built. SQL Azure, part of the Windows Azure platform, is SQL Server in the cloud. And for DBAs, whether it is Microsoft SQL Server on the ground or SQL Azure in the cloud, you manage it very much the same way.
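One way to see how similar the two feel in practice: pointing an application at SQL Server on-premises versus SQL Azure is, to a first approximation, a connection-string change. The server names and credentials below are placeholders, not real endpoints:

```python
# Sketch: the same connection-string builder serves both on-premises
# SQL Server and SQL Azure; only the server and login values change.
def build_connection_string(server, database, user, password):
    """Assemble an ODBC-style connection string from its parts."""
    return (f"Server=tcp:{server};Database={database};"
            f"User ID={user};Password={password};Encrypt=yes;")

# Hypothetical on-premises server vs. hypothetical SQL Azure server.
on_prem  = build_connection_string("sqlbox01", "Sales", "dba", "secret")
in_cloud = build_connection_string("myserver.database.windows.net",
                                   "Sales", "dba@myserver", "secret")
```

The day-to-day tooling story is similar: the same management skills largely carry over, with the cloud side adding firewall rules and a few platform-specific limits.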
An example of using Windows Azure storage is presented in the following schematic. This is a hosted digital asset management web application. It uses a Worker Role as the background processor to generate and place images into a store implemented with Windows Azure BLOB services, from which the Web Role front-end later retrieves them.
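The worker-generates, web-serves flow of that digital asset application can be sketched like this. A dictionary stands in for Windows Azure BLOB storage, and the names and fake image bytes are illustrative:

```python
# Sketch of the digital-asset pattern: a Worker Role generates images
# into a blob store, and the Web Role front-end retrieves them.
blob_store = {}   # stand-in for a Windows Azure BLOB container

def worker_generate_image(name):
    """Worker Role: produce an asset and place it in BLOB storage."""
    blob_store[name] = b"\x89PNG..." + name.encode()   # fake image bytes

def web_serve_image(name):
    """Web Role: fetch the stored asset for an HTTP response."""
    return blob_store.get(name)
```

The point of the pattern is the decoupling: the front-end never waits on image generation, and either role can scale independently against the shared store.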
In summary, much of the architectural thinking behind a traditional on-premises 3-tier application is applicable to designing cloud applications using Windows Azure’s computing model. Namely, employ Web Role as the front-end to accept HTTP/HTTPS requests, and Worker Role to perform specific tasks, much like traditional ASP.NET services. There are various types of storage Windows Azure provides, as well as SQL Azure, Microsoft SQL Server in the cloud, making it convenient to migrate existing data or integrate on-premises databases with those in the cloud.