Yesterday I announced the line-up for Day 1 of Tech Days Online. Now it's time for Day 2, which will be the first day for IT Pros.
You will notice from below that we have some guest speakers for Day 2. It's too early in the week to let you know who they are, but what I will say is that we will be doing a live interview with a senior technical individual from Microsoft in Seattle.
Here is the agenda for:
Day 2 - 25th April
If you want to view the full agenda then you can do so here:
If you've not yet registered, then you can do so here: <Register>
Keep tuned to the blog for further announcements.
By Vicky Lea
In a previous blog I talked about what you could do with free software, and referred to TechNet subscriptions and MSDN subscriptions. In this blog I am going to look in more detail at what your TechNet or MSDN subscription allows you to do, starting with TechNet.
There are a number of ways to gain access to a TechNet subscription. One is as a Microsoft partner with a silver or gold competency, where you are eligible for the TechNet for Microsoft Competency Partners subscription. Another is as an Action Pack Solution Provider subscriber, where you receive TechNet for Microsoft Action Pack Solution Provider. Alternatively, if you are not a Microsoft partner, you can purchase a Microsoft TechNet subscription.
But what does your TechNet subscription give you access to?
Well, TechNet subscriptions allow the licensed user to download and evaluate the latest full-version software and beta releases, as well as giving access to extensive technical information about Microsoft technologies, meaning that you can confidently evaluate Microsoft software and plan deployments. There are two main subscriptions you can purchase: TechNet Subscription Standard and TechNet Subscription Professional. The benefits you receive vary between the two subscriptions but include:
· Access to full-version software for 12 months with no feature limits – for evaluation purposes only
· Microsoft E-Learning
· 24/7 online chat for site assistance
· Priority Support in TechNet forums
· Access to Microsoft infrastructure products
· Two complimentary Professional Support Calls – TechNet Professional Subscription only
For more detail on these benefits, I recommend you visit this site: http://technet.microsoft.com/en-gb/subscriptions/bb892759.aspx
It is also worth being aware that customers with Software Assurance on qualifying products have access to certain TechNet benefits, such as TechNet Subscription SA Services, which provides IT Professionals with answers to technical questions from industry colleagues, and TechNet Plus Direct, which consists of the same benefits as TechNet Subscription Professional. Check out the Product List document for more information about the qualifying products for these SA benefits.
The other area to discuss is what you can do with your MSDN subscription.
MSDN subscriptions can be purchased for individual users either alongside Visual Studio or as a standalone subscription in the form of MSDN Operating Systems. Each MSDN subscription has its own set of benefits including:
· Software and Services for Production Use – only with Visual Studio with MSDN
· Software for Development and Testing
· Technical support incidents
· Priority support in MSDN forums
· MSDN Magazine
· MSDN Flash Newsletter
· MSDN Online Chat
The key element that varies with each MSDN subscription is the list of software that is available for production use and the software available for development and testing. To see a complete list of the benefits per subscription check out this site: http://msdn.microsoft.com/en-us/subscriptions/buy.aspx
And if you wish to see more detail on the individual benefits, I recommend you look here: http://msdn.microsoft.com/en-us/subscriptions/aa718661.aspx
As with TechNet, MSDN subscriptions are licensed per named user, allowing that user to access the software, services and support associated with the particular subscription. But whereas with a TechNet subscription access to Microsoft software is for evaluation purposes only, with an MSDN subscription the licensed user has access to Microsoft software for design, development, testing and demonstration purposes as well as being able to evaluate software and simulate customer environments in order to diagnose issues related to their programs.
With just over a week until Tech Days Online, we thought it was about time we started to let you know who was going to be presenting.
When the team sat down to pull the agenda together, we wanted an agenda that included Microsoft technical experts as well as technical experts with more of a "real-world" perspective on the technology. We have therefore pulled together a line-up that contains presentations from our partner community and MVPs.
I thought I would write this blog in a similar way to how festival line-ups are announced. With that in mind, here are the headliners for:
Day 1 - 24th April
If you want to view the full agenda, you can see it here.
We will be making further announcements over the next few days as we lead up to Tech Days Online. What I will say at this stage is that we have some surprises lined up :)
In this deployment sessions video I take a look at the deeplinking process, using Windows Intune to install an app on Windows 8. Deeplinking essentially lets you place apps from the Windows Store in your own company store, or Company Portal, letting you curate the best Windows Store apps for your users. You don’t need the code, just a link to the app in the Windows Store, so there is nothing for you to store on your servers or in your cloud, and the app publisher keeps the app up to date through the Windows Store.
The video is, as always, split into sections:
If you want to see more videos like these then check out The Deployment Sessions and please Like the YouTube videos.
By SQL Server Team UK
In February 2013 Microsoft received the Common Criteria (CC) certificate for SQL Server 2012 SP1 Enterprise Edition (English) x64 (Version 11.0.3000.0) at EAL4+, compliant with the U.S. Government Protection Profile for Database Management Systems, Version 1.3, 24 December 2010. See the actual CC certification screenshot below.
This is the 7th CC certification for SQL Server, starting with SQL Server 2005 SP1 in 2007. These certifications help governments and major enterprise customers understand the security functionalities of SQL Server 2012 and the quality of those functionalities.
This certification is formally recognized by the governments of 26 countries that have signed the Common Criteria Mutual Recognition Arrangement (CCRA), and by as many as 40 more governments on a product-by-product basis. The Common Criteria is more than just a concise definition of security functionalities and assurance requirements. It is also a precise evaluation process defined in the Common Evaluation Methodology document, a formal and approved evaluation scheme for each nation performing CC evaluations, and a government certification based on a government working with a private evaluation lab certified in that country.
For more detailed information about SQL Server certifications, please go to the Security & Compliance site and the Common Criteria site.
We can’t wait to see you at the Best of MMS UK 2013.
We recently announced the availability of new solutions to help enterprise and SMB customers manage hybrid cloud services and connected devices with greater agility and cost-efficiency.
System Center 2012 SP1, Windows Server 2012, Windows Azure and Windows Intune are key solutions that deliver against our Cloud OS vision to provide our customers & partners with a modern platform to address their top IT challenges.
We will be hosting the Best of Microsoft Management Summit (MMS) UK 2013 on Tuesday 21st May 2013 at Microsoft London, Cardinal Place, 80-100 Victoria Street, London, SW1E 5JL.
This 1-day event will provide you with the opportunity to attend the “best” sessions from MMS 2013, interact with our sponsoring & exhibiting partners, and understand the next step forward in our Cloud OS vision, strategy and roadmap – with two tracks for you to choose from:
· Infrastructure, application & cloud management for transforming the datacentre
· Desktop & device management for empowering people-centric IT
This is an opportunity you won’t want to miss!
Register now to secure your place – only 200 places available.
Infrastructure, Application & Cloud Management TRACK:
Invite Code: 03C987
Desktop & Device Management TRACK:
Invite Code: 602315
Domino’s Pizza makes and delivers more than 1 million pizzas a day worldwide. Its store servers are critical to receiving orders, taking payments, scheduling staff, and every other aspect of store operation. The pizza giant plans to switch its 10,000 US store servers to the Hyper-V virtualization technology in Windows Server 2008 R2 to eliminate reliability and performance problems it experienced with its previous virtualization solution. Since moving its first 1,500 servers to Hyper-V, Domino’s has seen virtualization-related help-desk calls and performance glitches practically disappear, which removes barriers to taking orders and serving customers. By taking advantage of its Microsoft license, Domino’s has a cost-effective virtualization solution. Domino’s uses Microsoft System Center data center solutions to manage 15,000 servers with only two people, a huge efficiency achievement.
Situation
Domino’s Pizza is a recognized world leader in pizza delivery. It ranks among the world’s top public restaurant brands, with a global network of more than 10,200 stores in more than 70 international markets. In terms your stomach can understand, Domino’s delivers more than 1 million cheesy, delicious pizzas a day to hungry customers all over the world, generating sales of more than US$7.4 billion in 2012. It’s estimated that the Domino’s system employs more than 205,000 franchise and corporate employees across the United States.
Domino’s has long used technology to fuel business growth and customer convenience. It is consistently among the top five companies in online transactions and today receives one-third of all US orders over the Internet. Domino’s diverse mobile ordering apps let customers order from 80 percent of the world’s smartphones.
As online orders poured into the company’s store computers in rising volumes, it became increasingly critical that those computers be up and running to receive them. “In the early days, computers were secondary to our business,” says Lance Shinabarger, Vice President of Global Infrastructure for Domino’s Pizza. “Every store had a small server—a PC, really—that ran our green-screen point-of-sale system. If a store system went down, it wasn’t a big deal; the staff could continue to take orders on paper. Sometimes a store server was down for a week, but the business rolled on without it.”
However, in 2000, Domino’s developed a proprietary point-of-sale (POS) system called Domino’s Pulse that replaced the green-screen system with a powerful, graphical interface-based program that computerized nearly all aspects of store operation, from inventory to order taking to staff scheduling. Employees grew so dependent on Domino’s Pulse that many could not function without it. If the computer went down, some stores actually closed their doors.
Between the rising dependence on Domino’s Pulse and the increase in online orders, Domino’s store server uptime became absolutely critical to the business. “We lose money and potentially customers when a store computer is down,” Shinabarger says. “We had to figure out how to improve the stability of our store environment.”
The IT staff decided to put a second server in each store for backup. However, store employees were pizza makers, not IT pros, and it was painful and time-consuming for them to grapple with switching Domino’s Pulse to the backup computer, especially in the middle of a rush of orders. Domino’s needed a solution that its Ann Arbor, Michigan–based IT staff could manage without troubling store employees.
The IT staff decided to use virtualization. Domino’s had already proven the success of server virtualization in its data center and achieved phenomenal results in server consolidation, increased server utilization, and faster server deployment. The IT staff realized that virtualization might also hold the answer for store-computer reliability.
In 2007, Domino’s extended its data center virtualization solution to its nearly 5,000 US stores. It still maintained two physical servers in each store but ran its Domino’s Pulse POS on a virtual machine inside the primary server. If that server failed, the Ann Arbor staff could easily and remotely “float,” or transfer, the virtual machine to the backup server.
While this solution worked, the Domino’s Pulse virtual machine would not start in the morning after the physical computer restarted, which slowed store startup and generated scores of help-desk calls. The virtualization software also consumed lots of compute power, which caused performance and stability to suffer during peak periods, such as Super Bowl Sunday, when pizza orders came in fast and furious.
Domino’s uses System Center 2012 Configuration Manager to deploy applications and software updates to all 15,000 servers and to 2,000 client devices in the company. Domino’s also plans to upgrade store servers to the Windows Server 2012 operating system to take advantage of increased disk I/O, easier virtual machine live migration, and faster disaster recovery.
To learn more about these products, try a trial:
· System Center 2012
· Windows Server 2012
Find out how Domino’s met its requirements by viewing the Case Study here.
By passing an MCP exam during the months of March, April or May 2013 you will be able to enter a prize draw to win one of the fantastic prizes below. What's more, if you pass another exam in a later qualifying month you can enter again (see terms and conditions for full details).
With the advent of the public cloud, cheap compute resources used on a pay-as-you-go and only-pay-for-what-you-use basis, a new phenomenon is emerging. The pop-up lab.
Pop-up labs are used by Devs and IT Pros to create multi-server lab environments that are popped up for the few hours they are needed and then popped back down again, ready to be re-ignited on another occasion.
At the end of the day it’s just the start-up/shutdown of collections of virtual machines, but what makes this different is that the machines live in a public cloud operator’s data centre, like Windows Azure. If you think about it, when you need to, say, replicate a problem, you usually only need the lab for a few hours, probably less than a day. Great if you’ve got a large data centre to build the virtual environment in. But what if you don’t have the luxury/money to have such a resource at your disposal?
That’s where pop-up labs come in. They are used for dev and test purposes, for self-education/career advancement, training, problem replication and simply getting experience with a new technology. The free Windows Azure trial subscription means you can often run pop-up labs for free. But say you want to fire up a 2-server SharePoint farm with SQL Server and AD to replicate one of your customers’ problems: you could do that for an afternoon’s work for less than £2.
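As a back-of-the-envelope sketch, the arithmetic behind that claim looks something like this. The 6p-per-VM-hour rate is a made-up illustrative figure, not actual Windows Azure pricing; check the Azure pricing page for real numbers.

```shell
# Rough pop-up lab cost: VMs x hours x hourly rate (in pence).
VMS=3           # e.g. 2 SharePoint servers plus a SQL/AD box
HOURS=4         # one afternoon
RATE_PENCE=6    # assumed rate per VM-hour - NOT real Azure pricing
COST_PENCE=$((VMS * HOURS * RATE_PENCE))
echo "Estimated lab cost: ${COST_PENCE}p"   # 72p, comfortably under GBP 2
```

Even doubling the assumed rate keeps an afternoon's lab well inside the £2 figure above.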
If you want to know more about this emerging phenomenon – go to this URL: http://aka.ms/Popuplabs
Neil Hodgkinson has provided a step by step guide to getting started with System Center 2012 Configuration Manager. This is part of a 15 part series which will cover the installation, setup, configuration and usage of Microsoft System Center 2012 Configuration Manager. To find the additional articles in the series please take a look at Neil’s site.
Note that SCCM 2012 IIS defaults via group policy are not needed if you are using SCCM push; read more about it here: http://technet.microsoft.com/en-us/library/bb632380.aspx
Remote Differential Compression for site server and branch distribution point computers
Site servers and branch distribution points require Remote Differential Compression (RDC) to generate package signatures and perform signature comparison. By default, RDC is not installed on Windows Server 2008 or Windows Server 2008 R2 and must be enabled manually.
Use the following procedure to enable Remote Differential Compression on Windows Server 2008, Windows Server 2008 R2 and now Windows Server 2012.
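As a sketch, RDC can also be enabled from an elevated PowerShell prompt rather than clicking through Server Manager. The feature name RDC is as reported by Get-WindowsFeature; verify it on your build before relying on it.

```shell
# Windows Server 2008 / 2008 R2 - import the ServerManager module first
Import-Module ServerManager
Add-WindowsFeature RDC

# Windows Server 2012 - modules auto-load and Install-WindowsFeature is preferred
Install-WindowsFeature RDC
```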
Delegate Permission to the System Management Container
Open Active Directory Users and Computers. Click View and select Advanced Features. Right-click the System Management container, then choose All Tasks and Delegate Control.
When the Welcome to the Delegation of Control Wizard page appears, click Next, then click Add. Click Object Types and select Computers. Type in your SCCM server name and click Check Names; it should resolve.
Click OK, then Next. Choose Create a Custom Task to Delegate and click Next, making sure "This folder, existing objects in this folder, and creation of new objects in this folder" is selected.
Click Next, make sure all three permission types (General, Property-specific and Creation/deletion of specific child objects) are selected, place a check mark in Full Control, then click Next and Finish.
Extend the AD schema for SCCM
Perform the below on your Active Directory server. Simply browse the network to your AD server (\\adminserver\c$), copy the contents of SC2012_SP1_RTM_SCCM_SCEP, find \SMSSetup\Bin\x64\Extadsch.exe, then right-click it and choose Run As Administrator.
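If you prefer the command line, a sketch of the same step from an elevated prompt on the AD server follows. The path assumes you copied the media to C:\SC2012_SP1_RTM_SCCM_SCEP; the tool writes its log to the root of the system drive.

```shell
:: Run the schema extension tool from an elevated command prompt
cd /d C:\SC2012_SP1_RTM_SCCM_SCEP\SMSSETUP\BIN\X64
extadsch.exe

:: Check the result - look for "Successfully extended the Active Directory schema"
type C:\ExtADSch.log
```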
Open SQL ports
Create an OU for your SCCM server and allow ports 1433 and 4022 for SQL replication with group policy: select Computer Configuration, Policies, Windows Settings, Windows Firewall with Advanced Security, then select Inbound Rules, choose New and follow the wizard to open TCP port 1433; repeat for port 4022.
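If you would rather open the ports directly on the server than via group policy, the equivalent local rules can be added with netsh from an elevated prompt. The rule names here are arbitrary labels.

```shell
:: Allow inbound SQL Server (1433) and Service Broker (4022) traffic locally
netsh advfirewall firewall add rule name="SQL over TCP 1433" dir=in action=allow protocol=TCP localport=1433
netsh advfirewall firewall add rule name="SQL Service Broker 4022" dir=in action=allow protocol=TCP localport=4022
```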
If using group policy refer to step 2 below Image
To open a port in the Windows firewall for TCP access
To open access to SQL Server when using dynamic ports
Install .NET Framework, IIS, WCF Activation and BITS
In Server Manager select Features, then Add Features. Select .NET Framework 3.5 and WCF Activation, and when prompted answer Add Required Role Services; click Next and Next again. (Make sure the BITS and IIS services are running; restart after install.)
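The same features can be added from an elevated PowerShell prompt; a sketch for Windows Server 2008 R2. Feature names vary by OS version, so confirm them with Get-WindowsFeature first.

```shell
# Add .NET Framework 3.5.1 core, HTTP (WCF) activation and BITS
# Feature names as listed on Windows Server 2008 R2 - verify on your build
Import-Module ServerManager
Add-WindowsFeature NET-Framework-Core, NET-HTTP-Activation, BITS
```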
SQL Server 2012
Install SQL on D:\Program Files... When running setup.exe, right-click and choose Run as Administrator. Select all options on install, click on the account name and enter the admin username and password.
Click Next and finish the install (it takes a long time). Make sure the SCCM computer is a member of the built-in Administrators group.
Check TCP/IP properties for the listening IP address in SQL Server Configuration Manager
Start SQL Server Configuration Manager, expand SQL Server Network Configuration in the left pane, highlight Protocols for <Instancename> and double-click TCP/IP in the right pane.
Click on IP Addresses.
Set Enabled to Yes for IP2.
Leave the default IP.
Set Enabled to Yes for IP4.
SQL Memory Configuration
The logon account for the SQL Server service cannot be a local user account, NT SERVICE\<sql service name> or LOCAL SERVICE. You must configure the SQL Server service to use a valid domain account, NETWORK SERVICE, or LOCAL SYSTEM (see the picture below).
Installation of System Center 2012 Configuration Manager with SP1
Here is the download link for the Assessment and Deployment Kit, which is one of the prerequisites: http://www.microsoft.com/en-us/download/details.aspx?id=30652
Also restart your server
When the wizard appears, click Install, click Next and then select Install a Configuration Manager Primary Site.
Click Next, and then create a folder on your D: or E: drive called rc_updates.
Click Next on your language of choice and enter your site installation settings; install on the D: or E: drive, not C:.
Install as the first site in a new hierarchy
Click next, leave the FQDN as default
Client Computer Communication Settings (HTTP or HTTPS): select Configure the communication method on each site system role, and review all settings.
Any warnings can be fixed after the install
Make a brew; this part can take a while!
After the install has finished restart the server.
In the next step of the guide we will go through the different discovery methods and create boundary groups.
Head on over to http://www.technodge.co.uk for more Deployment guides.
Neil Hodgkinson has been working in the IT industry for 14 years, with 9 of those in the education sector. I have worked with many versions of Windows Server, Exchange and Group Policy. Over the last few years I have been specialising in deployment methods, starting with Microsoft's Deployment Toolkit and then migrating over to Microsoft System Center, the holy grail of servers, for Endpoint Protection, deployment, app control for Windows 8 and the ability to manage smartphones.
I also do a lot of free consultancy for all the local primary schools on the best way to deploy and control their Windows environment via System Center and Group Policy.
IT is a passion, and I feel you have to be passionate about the IT industry to keep things moving forward.