Angela Cataldo works for Firebrand Training as a subject matter expert and instructor for SQL Server and System Center. For over 10 years Angela has specialised in SQL Server, delivering training and consultancy services to companies throughout the UK and Europe, guiding and mentoring customers to follow Microsoft best practice and assisting in their understanding and adoption of innovative SQL Server features.
Before SQL Server 2012, databases always had a degree of portability. But with SQL Server 2012 we can now embrace the powerful new manageability and security features of Contained Databases, which make a database much more portable.
Why do we need Contained Databases?
SQL Server security has always been managed at two levels:
Logins are managed at Server level, and users are managed at Database level. This means permissions for SQL Server have to be defined in two or more locations, and this can cause confusion.
Also, having to manage logins and users separately can cause problems when maintaining high availability and disaster recovery solutions: logins need to be regularly synchronised to failover and secondary servers to avoid problems such as orphaned users.
So with the introduction of containment and the concept of boundaries in SQL Server 2012, a database can become free of external dependencies: server-level metadata, settings and security logins.
For a Database Administrator this can also help with a problem we have all faced after recovering a database: repairing a large number of logins using the sp_change_users_login stored procedure.
What is a Contained Database?
In simple terms it is a database that is isolated from other databases, and isolated from the instance of SQL Server that is hosting the database.
There are four ways that SQL Server 2012 helps to isolate databases from the instance:
How to create a Contained Database
In this example I am going to demonstrate - in four steps - how to create and authenticate against a Contained Database:
First I need to enable contained database authentication, by executing the following code as a New Query in SQL Server Management Studio (SSMS) against the master database:
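The original code listing has not been preserved in this copy; a minimal sketch of the two statements described below, assuming the standard option name, would be:

```sql
-- Report the current value of the server-level setting
EXEC sp_configure 'contained database authentication';

-- Enable contained database authentication, then apply the change
EXEC sp_configure 'contained database authentication', 1;
RECONFIGURE;
```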
The first sp_configure line reports on the current setting and the second sp_configure line enables the server-level setting. This allows SQL Server to defer authentication to the database, provided that we have configured the users correctly with the right authentication.
Now I can create a contained database, executing the following code as a New Query in SSMS:
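The listing itself is missing from this copy. Since the article later connects to AdventureWorks2012 as the contained database, a plausible sketch is to set partial containment on that existing database (or create a new database with CONTAINMENT = PARTIAL):

```sql
USE [master];
GO
-- Convert the existing sample database to partial containment
ALTER DATABASE AdventureWorks2012 SET CONTAINMENT = PARTIAL;

-- Alternatively, create a new partially contained database:
-- CREATE DATABASE MyContainedDB CONTAINMENT = PARTIAL;
```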
If we take a look at the properties of this database, we can see on the Options page (under Select a page) the Containment type option:
You can also use SSMS to configure containment for databases.
Now we have a contained database we next need to create a user by executing the following code as a New Query in SSMS:
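The code listing is missing here; a minimal sketch of creating a contained user with password (the user name and password below are illustrative) would be:

```sql
USE AdventureWorks2012;
GO
-- A contained user authenticated by the database, not by a server login
CREATE USER ContainedUser WITH PASSWORD = N'Str0ng!Passw0rd1';
```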
For the full syntax of CREATE USER, see SQL Server Books Online: CREATE USER; its examples cover:
You can also use SSMS to create a contained user, for User Type selecting SQL User with password:
We can also take an existing user and convert it to a contained user by executing the stored procedure sp_migrate_user_to_contained.
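As a sketch (the user name is illustrative), migrating a login-mapped database user while keeping its name and disabling the original server login looks like this:

```sql
USE AdventureWorks2012;
GO
EXEC sp_migrate_user_to_contained
    @username = N'ExistingUser',       -- illustrative database user mapped to a login
    @rename = N'keep_name',            -- keep the user name rather than copying the login name
    @disablelogin = N'disable_login';  -- disable the now-redundant server login
```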
For an explanation of syntax see SQL Server Books Online: sp_migrate_user_to_contained
Now we can take the final step and log in as a contained database user, ensuring that in the Connection Properties, under Connect to database, we specify our contained database, AdventureWorks2012.
When connecting to a contained database, if the user does not have a login in the master database, the connection string must include the contained database name as the initial catalog. The initial catalog parameter is always required for a contained database user with password.
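For example, a client connection string for a contained user (the server name and credentials here are illustrative) must name the contained database directly:

```
Server=MyServer;Initial Catalog=AdventureWorks2012;User ID=ContainedUser;Password=Str0ng!Passw0rd1;
```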
In four simple steps I have enabled database-level authentication, created a contained database and a contained user, then logged into SQL Server Management Studio as the new user.
What else do I need to know?
As a Database Administrator, security is a major concern and there are unique threats when using Contained Databases that must be considered. Thankfully SQL Server Books Online has a dedicated page on these implications: Security Best Practices with Contained Databases.
For example, passwords in a contained database must be strong, complex passwords, and cannot be protected by domain password policies. Therefore, wherever possible, create contained users for domain logins and take advantage of Windows Authentication.
Contained databases are set to be one of the top new features for DBAs. AlwaysOn Availability Groups, also new to SQL Server 2012, help to simplify environmental and failover concerns and ensure a highly available disaster recovery solution. SQL Server 2012 database containment is simply one of the best things to have happened to SQL Server.
By Paul Gregory
System Center 2012 Configuration Manager introduced many new features. One feature built around the new user-centric element of the product is the Application Catalogue, which allows users to select software they would like to install and, if required, have it approved by an administrator.
One question I get asked a lot is whether this functionality is supported in untrusted forests, and it is. To enable this support, a few things need to be considered:
· The Application Catalogue server has to be able to authenticate the users that connect to it
· Configuration Manager needs to know about the users that will request applications
To enable this cross-forest support, the following steps need to be performed:
1) Install the Application Catalogue Web Service in the same forest as the SCCM database
2) Install the Application Catalogue Website in the untrusted forest giving SCCM credentials to deploy the role to a member server in the remote forest
3) The Application Catalogue Web Service and Website will communicate using self-signed certificates; these can be replaced with certificates from a PKI if needed
4) Enable User Discovery or User Group Discovery for the remote forest in SCCM. This is needed because applications displayed in the catalogue are based on the collection targeting so the applications will need to be targeted within SCCM to the users in the remote forest.
Paul Gregory is one of QA’s principal technologists, specialising in delivering training around Microsoft Server operating systems, virtualisation and systems management. During a 29-year career within IT, Paul has helped many international organisations develop infrastructure solutions based on Microsoft technologies, as well as supplying training services over the last 14 years. Paul has helped QA deliver numerous Microsoft partner skilling programmes, particularly around Microsoft Server operating systems, virtualisation and System Center. Paul was also heavily involved in the recent Microsoft Windows 8 / Server 2012 TAP programme, where he played a key role in testing core Windows Server 2012 technologies and feeding this information back to product specialists in Redmond. With the advent of the Microsoft Private Cloud solutions based on System Center 2010 & 2012, Paul has been responsible for helping Microsoft prepare the partner channel in both the US and Europe for these technologies.
If you would like a detailed version of these tips please mail: firstname.lastname@example.org. For further information on 1E’s integration capabilities with System Center 2012 Configuration Manager, please visit: http://www.1e.com/it-efficiency/solutions/system-management-services/
Often, customers I come across install SCOM and panic. The main reasons for this are:
1) Trying to do too much too soon
2) Not fully understanding their environment
3) Not understanding that SCOM tries to predict issues
There are a few other reasons, but we do not need to worry about them here. This brings me to where I want to be: the noise. Starting with item (3), it is important to understand that SCOM tries to predict events, so there is always a balance between being noisy and missing the events that need to be reported to predict a future issue; where that line is drawn will vary from one organisation to another.
One area where I see people struggle with this is managing basic hardware capacity issues, for example monitoring free disk space. The main problem is that most systems today have fairly small OS drives and much larger data volumes, so different thresholds need to be set. However, the default rules for managing free disk space apply to all drives in a computer. To manage this correctly, a number of things need to be put in place as best practice:
1) Standardize Server Builds – I often hear that server builds are a bit random; it is never too late to standardize the build.
2) Create SCOM Groups for each drive (steps below)
3) Set Overrides for each drive group and for each OS type.
This model will then allow different disk space thresholds to be set for each group of hard drives.
1) From within the SCOM administration console select the Authoring panel
2) Select Groups and choose Create Group on the right
3) Give the group a name and description and create a new management pack for storing Windows Server Hardware Monitoring Overrides in if one does not exist
4) Press Next until on the Dynamic Members page
5) Press the Create/Edit button
6) In the drop down box choose either “Windows Logical Hardware Component” or “Logical Drive (Server)”. These allow you to select drives based on name or other properties. Press Add
7) In the table, change the first drop-down box to “Display Name”; in the third box enter C:. Press OK
8) Complete the wizard
9) Repeat to create any other groups for other Drive letters you wish to set separate rules for.
Several key goals of the WS2012 rewrite were to accommodate the requirements of the cloud business. Datacenters which provide cloud capability are generally massive, and have unique requirements which are normally a superset of those found in a smaller environment. These requirements include uptime, resiliency, flexibility, cost savings, manageability, speed and security.
In this article, I would like to explain some of the new features that provide dramatic improvements in the speed of WS2012 servers sold by our OEMs, in environments that are dependent upon throughput, such as the storage, financial, industrial control, gaming, printing, communications and broadcast verticals.
In order to expand performance on a server, many things must be in place.
Multiple processors, and the more the better.
More memory with dynamic allocation.
Higher bandwidth through the networks.
An unbounded connection to disk farms.
Self-healing capability.
Microsoft, in order to increase the performance of datacenters servicing the cloud, implemented support for up to 640 physical processors (not just cores).
An expansion of memory was needed to service all these processors and the resulting virtualized instances that would be running on them. Ergo, not only were the upper limits for memory increased (to 4TB), but the way in which memory is allocated and used was also improved. Support for SSDs is incorporated as well.
Storage Spaces allow the combination of multiple storage technology types into a single storage pool. And this is no ordinary storage pool: WS2012 can run ReFS (Resilient File System), which performs file checks on the fly (imagine doing a CHKDSK of 300 million files in seconds, a direct result of the files being checked in advance), in essence providing self-healing data that does not have to go through cleansing and correction at the storage level.
Also implemented was support for off-loading storage transfers from the server to storage arrays from vendors like EMC, Hitachi and Network Appliance. (This is not only for SAN environments; support for NFS was also added.) This optimizes usage of the pipe, and expands our ability to work in heterogeneous environments and service existing disk farms.
In order to maximize the speed of the disks, the use of 4K blocks has become the standard. This is what disk drives want to see.
SMB 3.0, with multichannel support and a refactoring of SMB 2.0, allows network transfers to reach 97% of DAS (direct-attached storage) speed. Additionally, files are always opened in “write-through” mode.
(As an aside, de-duplication at the file level was incorporated. This not only saves space but also helps reduce the time spent searching for files, and support for 256 iSCSI targets was added.)
And finally, NIC teaming, with the capability of creating a failover cluster of 16 channels or aggregating 32 ports (with load balancing in both scenarios) to maximize throughput was incorporated. NIC products from multiple vendors could even be blended together. With this kind of throughput, and support for networked storage, you can see how an IPSAN can be built with throughputs of up to 320Gb/sec (using 10Gb/E network cards). What used to be hard is now easy.
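As an illustrative sketch (the team and adapter names below are hypothetical), such a load-balanced team can be created with the built-in NIC teaming cmdlets in WS2012:

```powershell
# Create a switch-independent team from two adapters,
# load-balancing on TCP/UDP port hashes
New-NetLbfoTeam -Name "StorageTeam" -TeamMembers "NIC1","NIC2" `
    -TeamingMode SwitchIndependent -LoadBalancingAlgorithm TransportPorts
```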
We measured 1 million IOPS in some of our tests, and we did not even reach the upper bound of WS2012. (I spelled out “million” because I was certain some would think that number was a typographical error.) This is due to the previously mentioned features, as well as SMB Multichannel with SMB 3.0; it means that very powerful storage can be built for a tenth of the price companies pay today to EMC/NTAP.
If these performance enhancements are of interest to you, go to http://www.microsoft.com/windowsembedded/en-us/evaluate/windows-embedded-server.aspx download and begin testing a demo copy of WS2012.
Mark ‘Fitz’ Fitzgerald is the principal technologist for business intelligence, covering SQL, PerformancePoint and ProClarity at QA. He is a twenty-year veteran of the IT industry, with experience ranging across mainframes, help desks and MIS systems. Mark has experience developing business applications in a range of products which enhance and distribute accurate, timely information within organisations. Mark has been with QA since 2000, and in 2003 and 2006 he won QA’s Trainer of the Year Award. Mark’s enthusiasm knows no bounds and training sessions often spill into breaks, lunchtimes and early evenings if not interrupted!
Business Diagrams using SSRS Map Control
Many businesses need to be able to produce business-oriented diagrams using SQL Server data. This can pose a problem for the SQL Server user, many of whom rely on Visio services or third party tools to be able to produce the reports which the business demands.
However, it is possible to use spatial maps within the reporting services element of Microsoft SQL Server 2008 R2, to create diagrams for use within business reports – whether this is from data stored in SQL Server as geometry/geography data types, embedded within the control itself (US only) or by using an ESRI shapefile. The diagrams below illustrate the level of reporting capability possible using this technique. All of the diagrams below are calculated from a standard parent and child relational source.
Below is a list of types of diagrams typically requested by the business:
Making these available using SSRS will allow clients to visualise the data better and give the developers additional options for display. It is not likely to replace the common chart types available within the product, but with a little thought and effort most diagrams are possible.
All of the diagrams below are possible using standard TSQL objects (user defined table data types, user defined functions and stored procedures). No CLRs are used in creating the diagrams and each performs adequately.
Chart Type and description
Hierarchy : hierarchical view of items dependent upon parent and child arrangement - organisation chart, hierarchical KPI, viewing a decision tree
Multiple proportional pies : growth of sales over time with the proportion of each sector
Nightingale Rose : changing sizes and proportions over time
Geometric map with Sparkline pies included : proportion of sales by category split regionally
Gantt Chart : tasks to be performed, with dates
Network : tasks and dependencies between them
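The hierarchy diagram above, for example, can be driven from a plain parent/child source. The code below is not the author’s original; it is a minimal sketch (the table and column names are illustrative) of deriving each node’s depth with a recursive CTE, which user-defined functions could then translate into geometry for the map control:

```sql
-- Illustrative parent/child source for an organisation chart
DECLARE @Org TABLE (NodeId INT, ParentId INT NULL, Name NVARCHAR(50));
INSERT INTO @Org VALUES
    (1, NULL, N'CEO'), (2, 1, N'Sales'), (3, 1, N'IT'), (4, 2, N'EMEA');

WITH Tree AS
(
    -- Anchor: the root of the hierarchy
    SELECT NodeId, ParentId, Name, 0 AS Depth
    FROM @Org
    WHERE ParentId IS NULL
    UNION ALL
    -- Recurse: attach each child one level below its parent
    SELECT o.NodeId, o.ParentId, o.Name, t.Depth + 1
    FROM @Org AS o
    INNER JOIN Tree AS t ON o.ParentId = t.NodeId
)
SELECT NodeId, ParentId, Name, Depth
FROM Tree;
```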
IPD Guide for System Center 2012 - Operations Manager now available!
The Infrastructure Planning and Design (IPD) Guide for System Center 2012 - Operations Manager outlines the infrastructure design elements that are crucial to a successful implementation of Operations Manager. It guides you through the process of designing components, layout, and connectivity in a logical, sequential order. You’ll find easy-to-follow steps on identification and design of the required management groups, helping you to optimize the management infrastructure.
· Download the IPD Guide for System Center 2012 - Operations Manager.
· Learn more about the IPD Guide Series.
Determine Windows Server 2012 Readiness with MAP 8.0 Beta
Accelerate your Windows Server 2012 migration with Microsoft Assessment and Planning (MAP) Toolkit 8.0 Beta. This latest version of MAP adds new scenarios to help plan your environment with agility and focus while lowering the cost of delivering IT. Included in MAP 8.0 Beta are hardware and infrastructure readiness assessments to assist you in planning the deployment of Windows 8 and Windows Server 2012, preparing your migration to Windows Azure Virtual Machines, readying your environment for Office 2013 and Office 365, and tracking your usage of Lync.
· Download the MAP Toolkit.
· Learn more
· Join the beta.
Secure your environment with new SCM 3.0 Beta!
Secure your environment with new product baselines for Windows Server 2012, Windows 8, and Internet Explorer 10. The latest version of SCM offers all the same great features as before, plus an enhanced setting library for Windows 7 SP1 and Windows 2008 R2 and bug fixes. The updated setting library gives you the ability to further customize baselines, and also improves GPO Import feature affinity. SCM 3.0 provides a single location for creating, managing, analyzing, and customizing baselines to secure your environment quicker and more efficiently.
· Download SCM.
· Learn more about Security Compliance Manager.
· Join the SCM 3.0 Beta.
Six Steps to Windows Azure launched last week, with over 160 attendees across the two days at our London kick-off events, which were run in partnership with the UK Windows Azure User Group. Our first event, Azure in the Real World, showcased some fantastic real-life solutions. Our second day focused on Advanced Topics in Windows Azure, including Windows Azure Media Services and Web Services. Overall the feedback has been fantastic (#sixstepsazure). The audience was a real mix, from those who have just started with Windows Azure to those considering it in the coming year.
Here is the content that was delivered by a great line up of speakers on the 8th and 9th November.
How SaaS Changes an ISV’s Business Model presented by David Chappell
Hitting the Limits of Azure Storage. Richard Wadsworth, Amido
Drillboard: A Sports App Case Study. Matt Quinn, IQCloud
Sheffield University: Environmental Projects on Azure. Carlos Oliveira, Shaping Cloud
Bootstrapping a Start-Up with Windows Azure. Damian Otway, Labtrac Solutions
Exploring the Micro: .NET MicroFramework and Windows Azure. Andy Cross, Elastacloud.
How the Cloud Changes Financial Services. George Kaye, Derivitec and Andy Cross, Elastacloud.
David Gristwood talks to David Chappell.
Windows Azure Media Services presented by Nuno Filipe Godinho
Windows Azure and Active Directory presented by Steve Plank, Microsoft.
What’s next in Six steps to Azure?
Step 2: Architecture and Design for Windows Azure - Join us online on 26th November:
· Windows Azure Architecture for Developers - 10:00am
· Windows Azure Architecture for IT Professionals – 12:00pm
Step 3: Integration with Mobile and the New World of Apps – Join us online on 4th December
· Integration with Mobile and the New World of Apps Part 1 – 10:00am
· Integration with Mobile and the New World of Apps Part 2 – 12.15pm
We also have a number of Windows Azure Developer camps, which will take attendees from knowing nothing about the cloud to actually having deployed a simple application, and made it available on the public internet.
Other steps will follow, but you can find out about the entire programme here.
Microsoft recently introduced System Center 2012, a tightly integrated management solution built from the ground up for automated private cloud application and infrastructure management. IDC interviewed a range of System Center 2012 early-adopter customers about their private cloud strategies and the role that System Center 2012 is playing in support of those programs. This white paper discusses IDC's industry-wide views on private cloud management trends and priorities, describes how System Center 2012 is addressing these needs, and highlights System Center 2012 customer experiences and lessons learned. The goal of this paper is to equip IT decision makers with a context for designing their own private cloud management evaluations and pilot projects.
Download the white paper