As my colleague Bernd wrote in his last article (SQL Server 2008 R2 Master Data Services Master Data Management announced), SQL Server 2008 R2 is not a rewrite or rework of SQL Server 2008 but rather adds new features and capabilities. In this article, we will take a look at some of these new features and how they translate into value. In general, SQL Server 2008 R2 delivers new capabilities to enable self-service business intelligence, multi-server management, and master data services:
Other features included in SQL Server 2008 R2:
The typical definition of Master Data Management (MDM) comprises a set of processes and tools that consistently define and manage the non-transactional data entities of an organization. These data entities are often also called reference data. MDM has the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting, and distributing such data throughout an organization to ensure consistency and control in the ongoing maintenance and use of this information.
The main purpose of using MDM in an organization is to avoid maintaining multiple versions of the same master data across multiple systems and different parts of the operation. Other problem statements include the consistency of the data used throughout the systems and the organization, as well as data classification.
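To make the matching and consolidation steps mentioned above concrete, here is a minimal sketch in Python. Everything in it (the record shapes, the email-based match key, and the "most complete record wins" survivorship rule) is an illustrative assumption, not part of MDS or any real MDM product:

```python
# Illustrative sketch of MDM-style matching and consolidation.
# Record shapes, match key, and survivorship rule are all hypothetical.

def normalize(record):
    """Normalize fields so records from different systems can be matched."""
    return {
        "name": record["name"].strip().lower(),
        "email": record["email"].strip().lower(),
    }

def consolidate(records):
    """Group records by a match key (normalized email) and keep,
    per key, the record with the most populated fields (the 'golden record')."""
    golden = {}
    for rec in records:
        key = normalize(rec)["email"]
        completeness = sum(bool(v) for v in rec.values())
        if key not in golden or completeness > sum(bool(v) for v in golden[key].values()):
            golden[key] = rec
    return list(golden.values())

# Two hypothetical source systems holding overlapping customer data.
crm = [{"name": "Ada Lovelace", "email": "ada@example.com", "phone": ""}]
erp = [{"name": "A. Lovelace ", "email": "ADA@example.com", "phone": "555-0100"}]

print(consolidate(crm + erp))  # one merged record, the more complete ERP copy
```

A real MDM hub would of course use fuzzier matching (names, addresses, identifiers) and configurable survivorship rules, but the collect-match-consolidate flow is the same.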
A good article for gaining an understanding of MDM was published on MSDN by Roger Wolter and Kirk Haselden and can be found here: http://msdn.microsoft.com/en-us/library/bb190163.aspx
Master Data Services (MDS) is intended to help organizations standardize and streamline the business data customers use across their organization to make critical business decisions. SQL Server MDS is a Master Data Management (MDM) application built from platform components; it may be deployed as an application or extended through those platform components to consistently define and manage the critical data entities of an organization. MDS is an any-domain hub that supports, but is not limited to, domains such as product, customer, location, cost center, equipment, employee, and vendor. Using MDS, you are able to manage critical data assets by enabling proactive stewardship, enforcing data quality rules, defining workflows around data changes, notifying impacted parties, managing hierarchies, and sharing the authoritative source with all impacted systems.
Master Data Services provides MDM capabilities that help manage the data organizations rely on to create a ‘single version of the truth.’ In large organizations, data is generated by multiple systems and parties across organizational and geographic boundaries. If that data isn’t reconciled in a central location, decisions may be made based on data that is inaccurate, incomplete, or stale.
There are a couple of scenarios for customers to leverage MDS:
Master Data Services includes a Master Data Hub, a set of services, and an interface that enable organizations to manage important data assets for both line-of-business and analytic applications. More specifically, MDS consists of a SQL Server database, Windows Communication Foundation (WCF) services, and an ASPX application that includes:
At this point in time, it seems that MDS is going to be purely a capability built into SQL Server 2008 R2, while SharePoint will provide a foundation for metadata management. Master data describes core business entities such as customers, locations, products, and so on. Metadata is "data about other data": structured data that describes the characteristics of a resource. Metadata answers the who, what, when, where, why, and how about every facet of the data that is documented. Therefore, MDS shipping as part of SQL Server 2008 R2 will not include a dependency on Microsoft Office SharePoint Server.
Kirk Haselden, the product unit manager for MDS, described the big deal about MDS in his blog post: Master Data Services – What’s the big deal
The existing external roadmap for Master Data Services is available at http://www.microsoft.com/sqlserver/2008/en/us/mds.aspx
A complex event processing (CEP) platform captures data from system-level, application-level, and external events and correlates them into patterns. CEP automates the capture and analysis of, and the response to, activity patterns in the data. The patterns detected by a CEP system can be analyzed and used to make business decisions. The benefits vary by industry and scenario, but all result from the ability to process large volumes of time-varying events and data with very low latency and to take action as a result. For instance, a financial institution – which may stream millions of events per second – can use low-latency complex event processing to support algorithmic trading, monitor for data compliance, detect fraud in trading, and manage risk.
While OLTP databases have matured and are capable of processing thousands of transactions per second, they are not optimized for processing continuous, high-volume, low-latency streams of data. A low-latency CEP platform allows customers to build custom applications using their own proprietary algorithms, ISVs to build industry-specific solutions, and embedded-system developers to offer low-latency processing.
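As a toy illustration of the pattern-detection idea (not the API of any actual CEP platform), the following Python sketch flags accounts that generate a burst of events inside a short sliding time window, a simplified stand-in for the fraud-monitoring scenario above. The event format, window size, and threshold are all assumptions:

```python
from collections import deque

# Illustrative sliding-window pattern detection over an event stream.
# Event shape, window size, and threshold are hypothetical.

def detect_bursts(events, window_ms=100, threshold=3):
    """Flag any account that produces more than `threshold` events
    within a `window_ms`-millisecond sliding window."""
    recent = {}   # account -> deque of timestamps still inside the window
    alerts = []
    for ts, account in events:  # events arrive ordered by timestamp (ms)
        q = recent.setdefault(account, deque())
        q.append(ts)
        while q and ts - q[0] > window_ms:
            q.popleft()  # evict events that have aged out of the window
        if len(q) > threshold:
            alerts.append((ts, account))
    return alerts

stream = [(0, "A"), (10, "A"), (20, "B"), (30, "A"), (40, "A"), (500, "A")]
print(detect_bursts(stream))  # → [(40, 'A')]
```

A production CEP engine expresses such patterns declaratively (e.g., as windowed queries) and executes them continuously with low latency, rather than in a batch loop like this sketch, but the core idea of correlating time-stamped events within windows is the same.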