What’s hot in SQL Server 2008 R2?

As my colleague Bernd wrote in his last article (SQL Server 2008 R2: Master Data Services / Master Data Management announced), SQL Server 2008 R2 is not a rewrite or rework of SQL Server 2008 but rather adds new features and capabilities. In this article, we will take a look at some of these new features and how they translate into value. In general, SQL Server 2008 R2 delivers new capabilities to enable self-service business intelligence, multi-server management, and master data services:

  • Master Data Services, which will ship with SQL Server 2008 R2, provides customers with a single authoritative data source to ensure the integrity of the data they use to make critical business decisions.
  • Microsoft’s innovative new low latency complex event processing platform allows customers to gain insights in near real time from event streams.

Other features included in SQL Server 2008 R2:

  • Self-Service analysis (Project ‘Gemini’) empowers users to intuitively create their own BI solutions, enabling them to easily share and collaborate on personal BI results. 
  • Self-service reporting tools enable end users to access, aggregate, slice, dice, and report on data with minimal dependence on IT, giving them information and insights that help them make faster, smarter decisions without having to learn new, specialized BI skills.

About MDM…

The typical definition of Master Data Management comprises a set of processes and tools that consistently defines and manages the non-transactional data entities of an organization. Those data entities are often called reference data as well. MDM has the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting, and distributing such data throughout an organization to ensure consistency and control in the ongoing maintenance and application use of this information.
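
As a minimal sketch of the matching and consolidating steps, the following T-SQL merges customer records from one source system into a consolidated master table. The tables and the matching rule are hypothetical illustrations, not part of any actual MDM product schema:

    -- Hypothetical tables: dbo.CustomerMaster holds the consolidated records,
    -- dbo.CrmCustomer is one of several source systems feeding it.
    MERGE dbo.CustomerMaster AS master
    USING dbo.CrmCustomer AS src
        ON master.TaxId = src.TaxId        -- matching rule: same tax ID, same customer
    WHEN MATCHED AND (master.Name <> src.Name OR master.City <> src.City) THEN
        UPDATE SET master.Name = src.Name, -- consolidation rule: source wins on conflict
                   master.City = src.City,
                   master.LastUpdated = SYSDATETIME()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (TaxId, Name, City, LastUpdated)
        VALUES (src.TaxId, src.Name, src.City, SYSDATETIME());

A real consolidation process would of course layer fuzzy matching, survivorship rules, and an audit trail on top of such a statement.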

The main purpose of using MDM in an organization is to avoid having multiple versions of the same master data across multiple systems and different parts of the operation. Other problem statements include the consistency of the data used in the systems and in the organization, as well as data classification.

A good article for gaining an understanding of MDM has been published on MSDN by Roger Wolter and Kirk Haselden and can be found here: https://msdn.microsoft.com/en-us/library/bb190163.aspx

About MDS in SQL Server 2008 R2…

Master Data Services (MDS) is intended to help organizations standardize and streamline the business data customers use across their organization to make critical business decisions. SQL Server MDS is a Master Data Management (MDM) application built from platform components; it may be deployed as an application or extended through those platform components to consistently define and manage the critical data entities of an organization. MDS is an any-domain hub that supports, but is not limited to, domains such as product, customer, location, cost center, equipment, employee, and vendor. Using MDS, you can manage critical data assets by enabling proactive stewardship, enforcing data quality rules, defining workflows around data changes, notifying impacted parties, managing hierarchies, and sharing the authoritative source with all impacted systems.
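
To make "enforcing data quality rules" a bit more tangible, here is a hedged sketch of such a rule expressed directly in T-SQL; MDS defines business rules through its own model, so the table and the rule below are purely illustrative assumptions:

    -- Hypothetical rule: a product master record is valid only if it has a
    -- positive list price and a category assigned.
    UPDATE dbo.ProductMaster
    SET    IsValid = CASE WHEN ListPrice > 0 AND Category IS NOT NULL
                          THEN 1 ELSE 0 END;

    -- Surface the violations for a data steward to review.
    SELECT ProductId, Name, ListPrice, Category
    FROM   dbo.ProductMaster
    WHERE  IsValid = 0;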

Master Data Services provides MDM capabilities that help organizations manage the data they rely on to create a ‘single version of the truth.’ In large organizations, data is generated by multiple systems and parties across organizational and geographic boundaries. If that data isn’t reconciled in a central location, decisions may be made based on data that is inaccurate, incomplete, or stale.

There are a couple of scenarios in which customers can leverage MDS:

  • Financial Application – Providing a unified and consistent view of cost center information, chart of accounts, and product and customer hierarchies (see the hierarchy sketch after this list).
  • ERP Systems – Managing products with hundreds of attributes, classifications, and hierarchies, and keeping them consistent across multiple systems.
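
The cost center and product hierarchies in these scenarios are essentially parent-child structures. As a sketch (against a hypothetical dbo.CostCenter table, not the MDS schema), a recursive common table expression can walk such a hierarchy in T-SQL:

    -- Hypothetical parent-child table: each cost center references its parent.
    WITH CostCenterTree AS
    (
        SELECT CostCenterId, Name, ParentId, 0 AS Depth
        FROM   dbo.CostCenter
        WHERE  ParentId IS NULL            -- anchor: the root of the hierarchy

        UNION ALL

        SELECT c.CostCenterId, c.Name, c.ParentId, t.Depth + 1
        FROM   dbo.CostCenter AS c
        JOIN   CostCenterTree AS t ON c.ParentId = t.CostCenterId
    )
    -- List each cost center indented according to its depth in the hierarchy.
    SELECT CostCenterId, REPLICATE('  ', Depth) + Name AS IndentedName, Depth
    FROM   CostCenterTree;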

Master Data Services includes a Master Data Hub, a set of services, and an interface that enables organizations to manage important data assets for both line-of-business and analytic applications. More specifically, MDS consists of a SQL Server database, Windows Communication Foundation (WCF) services, and an ASPX application, and includes:

  • Master Data Hub – Central Storage, Authoritative Source, Versioning, Rules, Transactions (a storage sketch follows this list)
  • Stewardship Portal – Model Management, Documentation, Workflow, Integration
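
As a rough sketch of what central storage with versioning in such a hub can look like, the table below keeps every change as a new version row; this design is an assumption for illustration, not the actual MDS schema:

    -- Hypothetical hub table: every change to a member adds a new version row,
    -- so earlier states remain queryable and changes stay auditable.
    CREATE TABLE dbo.CustomerHub
    (
        CustomerId  int           NOT NULL,
        VersionNo   int           NOT NULL,
        Name        nvarchar(200) NOT NULL,
        City        nvarchar(100) NULL,
        ValidFrom   datetime2     NOT NULL DEFAULT SYSDATETIME(),
        IsCurrent   bit           NOT NULL DEFAULT 1,
        CONSTRAINT PK_CustomerHub PRIMARY KEY (CustomerId, VersionNo)
    );

The authoritative source that impacted systems consume is then simply the set of rows where IsCurrent = 1.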

At this point in time, it seems that MDS is going to be purely a capability built into SQL Server 2008 R2, while SharePoint will provide a foundation for metadata management. Master data describes core business entities such as customers, locations, products, and so on. Metadata is "data about other data": structured data that describes the characteristics of a resource and answers the who, what, when, where, why, and how about every facet of the data being documented. Therefore, MDS shipping as part of SQL Server 2008 R2 will not include a dependency on Microsoft Office SharePoint Server.
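
The distinction is easy to demonstrate in SQL Server itself. Reusing the hypothetical dbo.CustomerHub table sketched above, the rows are master data, while the catalog views hold the metadata describing them:

    -- Master data: the business entities themselves.
    SELECT CustomerId, Name, City
    FROM   dbo.CustomerHub
    WHERE  IsCurrent = 1;

    -- Metadata: data about that data, e.g. column names, types, nullability.
    SELECT c.name AS ColumnName, t.name AS DataType, c.is_nullable
    FROM   sys.columns AS c
    JOIN   sys.types   AS t ON c.user_type_id = t.user_type_id
    WHERE  c.object_id = OBJECT_ID(N'dbo.CustomerHub');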

Kirk Haselden, the product unit manager for MDS, described the big deal about MDS in his blog post: Master Data Services – What’s the big deal

The existing external roadmap for Master Data Services is available at https://www.microsoft.com/sqlserver/2008/en/us/mds.aspx

About complex event processing…

An event processing platform captures data from system-level, application-level, and external events and correlates them into patterns. CEP automates capturing, analyzing, and responding to activity patterns in the data. The patterns detected by a CEP system can be analyzed and used to make business decisions. The benefits vary by industry and scenario, but all result from the ability to process large volumes of time-varying events with very low latency and to take action on the results. For instance, a financial institution – which streams millions of events per second – can use low-latency complex event processing to support algorithmic trading, monitor for data compliance, detect fraud in trading, and manage risk.
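
The R2 CEP platform will ship with its own programming model, which is not shown here. As a purely conceptual sketch of the kind of pattern such an engine evaluates continuously over a stream, the following T-SQL groups hypothetical trade events into five-second tumbling windows and flags unusually heavy ones:

    -- Hypothetical event table; a CEP engine would evaluate this logic
    -- incrementally in memory as events arrive, not as a stored-data query.
    SELECT  Symbol,
            DATEDIFF(second, '20090101', EventTime) / 5 AS WindowNo, -- 5-second tumbling window
            COUNT(*)              AS TradeCount,
            SUM(Quantity * Price) AS Turnover
    FROM    dbo.TradeEvent
    GROUP BY Symbol, DATEDIFF(second, '20090101', EventTime) / 5
    HAVING  COUNT(*) > 1000;      -- flag windows with unusually heavy trading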

While OLTP databases have matured and are capable of processing thousands of transactions per second, they are not optimized for processing continuous, high-volume, low-latency streams of data. A low-latency CEP platform allows customers to use their own proprietary algorithms to build custom applications, ISVs to build industry-specific solutions, and embedded-system developers to offer low-latency processing.

Roland Lenz