Rachel has asked us (Simon & Andrew) to identify the key skills that will help your career survive and thrive as businesses start to transition some or all of their services to the cloud. We aren’t suggesting you learn all of these, but if you focus on at least one technology plus the ‘soft’ skills below your future will be more secure as a result. We’ve also pulled in some resources to get you started.
Authentication and Identity
No matter where your data and services are, your users are going to need to get at them, and to do that they need to prove who they are and be granted the appropriate privileges. Traditionally Active Directory has provided this service in the local data centre, and over the years it has been extended so that users and services on other operating systems (Linux and Apple Macs, and more recently iPads) can work seamlessly across them to get mail, docs and services. Active Directory Federation Services (ADFS) extends this interoperability to the cloud - not just Microsoft’s, but third parties like Salesforce - and supports OpenID, used by Yahoo, Google and others.
To find out more:
I am an ex-DBA, so my top recommendation is to get to know SQL Server. SQL Server has provided me with a great career and is another area of technology that won’t be much changed by the cloud. First of all, no matter where the data is, DBAs will be needed in some capacity to manage it. Not only that, but databases only keep growing as storage becomes cheaper and cheaper.
Not every database will move to the cloud, and alongside the CloudPower messaging from Microsoft you’ll also see that, in partnership with the server manufacturers, there's a raft of new appliances specifically designed for different SQL Server workloads, e.g. business intelligence, data warehousing and OLTP (On-Line Transaction Processing), so there will still be some high-end databases that remain un-virtualised. Many organisations will be moving to hybrid clouds – I can see many web applications having a SQL Azure back-end database, and SQL Azure does actually remove some of the work of the DBA, e.g. high availability and configuration. However, it also presents new challenges, such as synchronising data between the cloud and on-premises, auditing, and query optimisation.
A final thought: SQL Server typically appears as a required skill in around 12% of listings on the top job sites.
The best resources for learning what SQL Server can do are
Process and automation
Given that cloud is supposed to be about agility, you might think that standardising business processes and adopting ITIL standards for process management would be at odds with each other. The answer is simply process automation – taking those standard procedures and making them happen across the disparate systems in your infrastructure. So as well as looking at getting ITIL accredited, you might want to think about putting your processes into code. You could become a PowerShell/PowerCLI guru and write miles of code to do this, but there is a better way: System Center Orchestrator (SCOrch), the glue and gaffer tape in the System Center armoury. This allows you to hook your help desk up to the hardware, the hypervisor (theirs or ours), the virtual machine, the operating system and the application itself, and map out your process so the business can understand it and sign it off. This then gets rid of the drudgery, leaving you to get on with the new project work that's been backing up on your desk over the last year.
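To see what "putting your processes into code" looks like in its simplest form, here's a minimal sketch of a runbook runner. The step names (raise_ticket, provision_vm, install_app) are invented for illustration and are not real Orchestrator activities; the point is that each manual step becomes a function and the runbook chains them, keeping an audit trail as it goes.

```python
# Each step in the runbook is a function that receives and returns a
# shared context dictionary. The bodies below are stand-ins for real
# help-desk, hypervisor and deployment calls.

def raise_ticket(ctx):
    ctx["ticket"] = "HD-1001"                 # pretend help-desk call
    return ctx

def provision_vm(ctx):
    ctx["vm"] = "vm-" + ctx["ticket"].lower() # pretend hypervisor call
    return ctx

def install_app(ctx):
    ctx["app_installed"] = True               # pretend deployment call
    return ctx

def run_runbook(steps, ctx=None):
    """Run each step in order, recording an audit trail."""
    ctx = ctx or {}
    ctx["audit"] = []
    for step in steps:
        ctx = step(ctx)
        ctx["audit"].append(step.__name__)
    return ctx

result = run_runbook([raise_ticket, provision_vm, install_app])
```

The audit trail is what lets the business "understand it and sign it off": the process is now a visible, ordered list of steps rather than tribal knowledge.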
It's an interesting time to get into this space, as the System Center lineup is in the process of being completely overhauled. There are beta releases of some of the new versions available now, including SCOrch, Virtual Machine Manager, Operations Manager and Configuration Manager, with more due over the next couple of months. This will mean more detailed training on the Microsoft Virtual Academy, as well as various events we will be running in the UK this autumn.
Not really my world but the network engineers are all doing very well out of the cloud, the storage explosion, and the challenges of running lots of virtual machines on consolidated physical servers.
I put this in partly to be controversial but mostly because it’s too important to be left to the marketing department. The IT department and its members in particular need to be much better at promoting their work to the business and setting out the services on offer. On an individual level you need to build your internal brand, hopefully as a can-do trusted advisor who understands the business. Some of this is part of the day job - the way you respond to requests and follow through to see that your users are happy - some of it should be proactive, like going to business meetings to brief them on your latest projects and ideas. Building your brand outside the office can also be useful, so joining special interest groups on LinkedIn and Facebook, and helping and asking for advice on Twitter and forums can establish your credibility for your next role.
Not so much a networking role as skills around bringing different parts of cloud architectures together as one holistic unit. I think it's difficult for businesses to identify solutions that can be "pure cloud" beyond those Internet-based services that they moved long ago and new solutions they create from scratch - a far more realistic approach is hybrid cloud in which public and private clouds are stitched together. In addition to the networking and identity/authentication skills above, there's a strong need for people who understand the plumbing. That is, how to connect a cloud service to data stored in a private data centre or how to connect two public clouds together. Parts of this role depend on having some developer skills and some IT pro skills.
You'll see increasingly from Microsoft that management of the public cloud is a BIG deal. It's not just about a place to put something and have it run cheaply (although it partly is), it's also about being able to make sure that what happens there is what you expect is happening. To that end you'll see more and more that Windows Azure and System Center work well together. Right now the best thing to do is to try out monitoring a Windows Azure application using this evaluation for Windows Azure monitoring with SCOM. You don't even need an application to monitor as there's one included in the eval. As we move forward to System Center 2012 you'll see even more deep integration and you can try the betas for System Center here: System Center Configuration Manager 2012 Beta 2 and System Center Virtual Machine Manager 2012 Beta
Data Security Architecture
Deep understanding of data and what it contains has always been a pivotal part of security, and this is going to become even more important in the future. We currently live in a world without harmonised data laws and that means it's hard to place all your data into the public cloud immediately. What you can do is place safe portions of it there. By understanding the data you know what's safe to place in the cloud and what could create too much risk for your business to accept. This role is much more about providing insight back to the business to enable it to make better decisions about what to do, rather than preventing it from doing anything. It's important not to paralyse your business by over analysing risk without understanding of reward.
The list above doesn't really look all that different from one we'd have written a couple of years ago; the roles and the content of those roles have changed somewhat, but guess what - that's the nature of our industry. Technology changes. The really big shift here is one that's been happening for years: the integration of business skills, and understanding them as a core part of the technology skill set.

Are we seeing this spread happen across all segments? To answer that, let's look at things simplistically - big businesses and small, big having large dedicated IT departments and small having as many as four IT guys. IT guys in small businesses have always had to have many strings to their bow, often requiring them to have more general skill sets and giving them less time. These folks have always needed broad business knowledge as part of that skill set. For them, cloud technologies, especially public cloud, are likely to free up more time to do more interesting things. In large IT teams, business skills were often forsaken for deep, deep technical skill sets that the business didn't understand, so more people were needed to translate those deep skills back to the business. Today we see a shift towards end users becoming more tech savvy, so the need for translators is lessened: there's a smaller gap for the deep, deep techies to fill when translating things back to the business. It looks as if that gap is shrinking further as some of the clunky work done at that deep level is automated with the cloud, again giving the deepest of techies more time to do what the business actually values.
Microsoft Virtual Academy
The brand new Building Windows 8 blog, launched yesterday, is the place to keep up with the latest news about Windows 8 engineering and, importantly, to give your feedback to the Windows team. Suggestions for blog posts are welcome, as are your thoughts and comments as specifics and features are discussed - you can mail Steven Sinofsky directly via the blog.
Visit the Building Windows 8 blog.
Oh, how welcome this is. Microsoft Research have a free NoReplyAll Outlook add-in download available which, as you might have worked out from the name, stops people replying to all the recipients of a message or forwarding it on.
Get the add-in and some more info here from the Microsoft Research website.
Step inside the global datacentres that power hundreds of Microsoft's services for over a billion customers worldwide.
The consumerisation of IT trend is hurtling towards most IT shops and it’s clear from those I talk to that they’re just trying to deal with things as they happen. The MD wants to attach his cool new device but what does that mean for IT – are they expected to support it? What’s the cost of doing that? What else needs to change? It’s clear to me we need a more strategic approach to consumerisation that allows for flexibility and helps reduce costs whilst still permitting the choice that end users now demand.
It’s something Microsoft has been thinking about and you’ll start to see us talking about consumerisation in terms of devices, security and management, productivity and application development. A clear understanding of an evolving trend is always going to be difficult to build, but it’s good to see that we’ve thought about a way to frame our thinking. Whilst it clearly needs deep thought, it’s a good place to start from.
At the forefront of the trend is probably the fact that new devices are coming into organisations at an uncontrollable rate. More tech-savvy consumers are bringing their kit into the office excited by the potential that those devices hold. People expect access to their email at any time and many even expect to converse with their friends or organise their social lives when they’re in the office. They will find whatever way they can to use their devices, and sometimes that will comply with IT policy, but often it won’t.
We’ll tackle the management of those devices in a few paragraphs. First, let’s just have a little look at the potential advantages that using those devices will bring to your business. Strategically, you should consider allowing a couple of options over device choice for your users. The first is to allow them to Bring Your Own Computer, or BYOC. BYOC has a number of advantages for you as an organisation, not least of which is that you don’t need to own the asset or have it on your books and depreciate it over time. You could consider a couple of ways of doing this; one might be to give your employees a “technology allowance” that works in a similar way to a car allowance. Obviously there are tax implications in doing this for your employees, but it would move the cost from capex to opex.
That wider choice will also help your employees feel more valued and more trusted, because you’ll be giving them the chance to make their own decisions. You can still centralise purchasing control and exercise some guidance around devices by bringing in a computer leasing company, just like with a car scheme. Just be aware that, unlike cars, computers are actually quite cheap, and this could backfire on you if people choose not to lease from your list. It may just be better to allow your employees to buy whatever they like off the shelf.
The other option for device acquisition is to spruce up your list of approved devices. Select kit that appeals to your user base but that is still worthy of your support and the time required for your IT team to support it.
You should also think about the types of devices you’ll support. You also need to be really crisp and clear about what “support” means to your end users. This is where clear communication comes in and it leads to the idea of having a communications team or (better still) a marketer whose job it is to communicate IT services updates to your organisation. If you’re wondering why I’ve suggested a marketer it’s because marketers understand the environment into which they are selling (and your IT department is now selling itself). You may find that you need to redefine the term “support” within your organisation, changing user expectations dramatically.
Not sure what I mean by redefining support? Well, with consumerisation you need to focus on providing flexibility, and that will probably mean evolving your support functions into connection functions, ensuring that any device can be connected in a safe and secure way that meets business requirements. Realistically, you want to be looking at a way to support the people for whom you need to be most flexible (you know the ones - usually they have a C at the start of their job title!) in a way that seems similar to everyone else – it’s far easier to play to the highest common denominator in this case.
We’re starting to get into some familiar ground here around security and management, but before we do and whilst we’re on support, it’s important to note that you probably need to do some heavy lifting using self-service to reduce the load for simple fixes. General things like “how do I do this formula in Excel” are best handled by a Bing search or something similar internally. You can find out more in this post about why self service is so important to consumerisation and cloud.
Device selection is an obvious area for concern. It would be helpful if you could guide your employees to use the right kind of kit, because if they’re buying their own devices you need to make sure they will still be securable and manageable. Think, for example, about how you remote wipe a device. It’s really easy when you have a device with a 3G connection, but how do you remote wipe a device that only has WiFi if it gets stolen? Food for thought.
Security and management
When you think about management and security you probably first think of managing and securing Windows PCs. Given that you’re reading this on a Microsoft blog you might be thinking I’d be extolling the virtues of that. I am, but it’s about far more than that. Your management software and security strategy needs to be able to manage your users’ Windows devices, but it must also be able to manage and secure other devices. If your CEO wants to use his iPad, you need to be able to secure it, and critically you need to be able to remote wipe it if it goes missing while he’s syncing his corporate email. Tricky if it only does WiFi. So what do you do in that case? Well, firstly, you only allow the devices that you trust to access some parts of your IT. For example, it’s fine to trust people to access their email on a mobile device, but to ensure security and reduce operational risk you probably want to ensure your users have access to (and know how to use) rights-managed email. With that technology you can ensure that sensitive emails are only accessible to the intended recipient, and also that they can only access that specific, sensitive email on a secure device, or possibly just through an HTTPS-secured web page.
You can probably see now that security and management in a consumerised IT shop needs to take a data-security-led approach, but one that differs from most you might have come across before. Traditional data security has a (user-perceived) focus on preventing access, working against a lowest-common-denominator model of ‘block access to people who shouldn’t have access’. It’s been a good approach for the most part, but has led to disenfranchisement of the user base in many organisations. Far better to promote a security model based on circumstance.
The HR Director has access to all personnel records, for example, except if she’s accessing the HR system from a PC that’s facing the window. If you don’t think this is possible then you should have a look at some of the solutions for Remote Desktop from Quest. Perhaps the HR Director also shouldn’t be able to have access to the HR system, which is web based, from a slate device or even from a PC that doesn’t have up-to-date anti-malware. Again, perfectly possible scenarios using solutions like the Forefront family. The big thing to do then is understand the data in your organisation and grant access based on circumstance and identity. Deep understanding of data is something you need for the cloud, too, so it’s a good project to kick off.
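The shape of a circumstance-based decision is worth making concrete. Below is a toy sketch (the field names and rules are invented for the example, not taken from any real product): the access decision depends on identity AND the state of the device making the request, exactly as in the HR Director scenarios above.

```python
# Access is granted only when both the identity check and the
# device-circumstance checks pass. These rules mirror the examples in
# the text: no HR data on slates, no access from stale devices.

def access_allowed(user, device):
    if user["role"] != "hr_director":
        return False                          # identity check first
    if not device["antimalware_up_to_date"]:
        return False                          # block out-of-date devices
    if device["type"] == "slate":
        return False                          # no HR data on slate devices
    return True

director = {"role": "hr_director"}
laptop = {"type": "laptop", "antimalware_up_to_date": True}
slate = {"type": "slate", "antimalware_up_to_date": True}
```

The same identity gets different answers depending on circumstance: `access_allowed(director, laptop)` passes, while `access_allowed(director, slate)` does not.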
Flexibility in security and management solutions is also required, because in order to deal with security based on circumstance you need to be thinking about a device's lifecycle. When you think about lifecycle you start to realise that a device tends to go through stages - things like power on, load OS, pre-logon, sleep, hibernate, wake from sleep, power down, internet connected, no internet connection, LAN connected, WAN connected, VPN connected… the list goes on. Here you soon notice that you need security solutions that start as early as possible in that lifecycle and remain pervasive throughout.
This is where solutions such as DirectAccess (a remote network solution enabled by Windows 7 and Windows Server 2008 R2 and enhanced by Forefront) come in. DirectAccess starts early in the lifecycle of a Windows 7 device and creates a tunnel back into your corporate network that effectively brings the device onto your LAN and into your management sphere. This means it’s possible to quickly deliver patches, do remote control and manage every aspect of the device. Windows Intune provides a similar solution in a different way: rather than forming a tunnel into the corporate network, the management agent simply talks to the cloud. That immediately means that patches, antivirus and policies can be deployed, and soon you’ll be able to deploy your own software over the internet, too – a feature already in the beta.
Questioning the idea of “secure” also needs to be a prime concern when dealing with consumerisation. Do you trust your LAN? Unfortunately the answer should probably be no. You’ve probably had to deal with a virus outbreak already in your life, possibly more than one, and they typically happen because a device on your network doesn’t have enough security to prevent infection. That infection will spread and eventually take hold, leading to lost weekends and overtime. Technology like Network Access Protection (NAP) allows you to examine devices connecting to your network; if they don’t match your standards they don’t receive an address, or are placed into a “remediation” network. A remediation network can provide access to services like Windows Update for patching but perhaps doesn’t allow access to your internal HR or email systems. In a consumerised IT shop, though, it could be a good idea to treat your remediation network as the Internet – give people access to everything if at all possible.
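The NAP-style health check described above reduces to a simple policy gate. Here's a minimal sketch (the policy fields are invented for illustration): devices meeting policy get the corporate network, everything else lands in remediation.

```python
# Health policy the connecting device must satisfy. In a real NAP
# deployment this would come from system health validators; here it
# is just a dictionary of required states.
POLICY = {"firewall_on": True, "patched": True}

def assign_network(device_health):
    """Return which network a connecting device should be placed on."""
    meets_policy = all(device_health.get(k) == v for k, v in POLICY.items())
    return "corporate" if meets_policy else "remediation"
```

A patched device with its firewall on gets `"corporate"`; a device missing either requirement is quarantined to `"remediation"`, where it can still reach patching services (or, as suggested above, the wider internet).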
In part two we’ll take a look at some of the thoughts you need to keep in mind around productivity and application development. For now, though, knowing that a modern desktop and management are key parts of the puzzle, I’d suggest deepening your thoughts about getting off XP and onto Windows 7, and implementing management with System Center. The Springboard resources that we have available are a good place to start investigating Windows 7 deployment.
Welcome to part two of this post; the first part of Building your consumerisation of IT strategy can be found here.
Consumerisation and productivity go hand in hand, but it might not be top of mind when you first think about consumerisation. Being productive in some way, however, is the main driver for consumerisation in the mind of the user. Sure, they might also be thinking about using a device that makes them look good, but most people are really motivated by doing something well (I’d like to think), and if having a shiny device in front of a customer when making a sale helps the sale, well…
When we think about productivity, though, we almost always think about software, and having good software that people know how to use and want to use is essential. People need to have software that lets them achieve the same results at work as at home, so the idea of productivity changes a little. First and foremost productivity software does need to do the things that it’s always done: write documents, understand data, create presentations, organise information, collaborate. Modern productivity software needs to go far beyond that remit, though, because people simply expect to be able to do more with what they have. They might not use everything all the time but they do expect to be able to do it when they need to.
Take the example of creating a presentation. When you’re in that particular flow of organising information into a digestible form, people find it useful to be able to embed and edit video, to crop, alter and colour pictures and to be able to insert eye-popping charts. It would be a far more interesting world if everyone used all the tools open to them but it’s better that they’re there than for users to have to go grab £60 of video editing software or £200 of photo editing software. Having really strong basic productivity tools that do more than just the very basics makes good business sense, so it should still be a cornerstone of your strategy.
Seamless collaboration is something that’s become essential, yet few people even realise that it’s part of consumerisation. People collaborate around slightly different things in a consumer setting; not so much around documents but more around pictures and video. Being able to quickly and simply share a photo on Facebook is so ubiquitous that people do it now without even thinking about it, and they also expect people to be able to comment on those items immediately. This simple act of sharing is the first step to collaboration. How much more useful is this in a business setting when what’s shared is a tender document and colleagues working on it together can simply comment or make updates?
Collaboration as a part of consumerisation makes the tools for collaborating more natural for the end user. That could be a SharePoint site where people can upload, comment on or edit the document online, or it could be using Lync to have a real-time conversation about a document. Better yet, it could be having an online meeting using Lync, where one colleague presents the slides and can go back a slide or two when someone gets lost. Your strategy for productivity within a consumerisation strategy should be to naturalise workflows.
In my mind, Office 2010 is the starting point. Again, most people have Office at home, many people use some kind of sharing site and many use video conferencing or internet voice calling, like Skype, on a regular basis. In a business environment you need similar functionality but with some central control.
This area should be a no-brainer for everyone. Does it cost more to produce one version of an application or two? Wherever you can it’s wise to build your strategy around minimising the number of bespoke applications you need, but you should also try to minimise the need to build applications for different devices. Sure, it might be popular to have an iPad version of your internal sales tool, but at what cost? The reality is that with the proliferation of all these different types of devices it might seem like you need to invest money in having a custom application developed. There are, however, lots of other options.
One such option is to develop a web app. Most devices are capable of viewing web pages, so it’s perfectly possible to do, even if it’s not all that popular. There are times, though, when a web app becomes the right move. I suspect we’ll see much more being made of the “touch-friendly web”, or some such naming, over the coming years as we see all devices, including PCs, becoming better at touch. A touch-friendly website is a good move in many ways, not least because touch-friendly tends to equate to user-friendly. The National Rail website is a great example.
Websites should work the same on a touch device as with a keyboard and mouse, and so should applications. If you’re currently running a project to develop any application, ask yourself and your development/UX (User Experience) team whether it works well with touch. If it’s an internal-facing application, ask the question until you get the right answer - in my experience internal applications tend to hang around for about five to seven years between major iterations.
This is more than a question about UX; it’s a question about the OS you are targeting. If you build an application for Windows, you know it will work on Windows regardless of whether it's on a touch-friendly device like the current crop of great devices from Asus, Acer, Fujitsu and Novatech (to name just four). Someone will probably comment to say that’s not the case with Windows Phone, but I’ll pre-empt that by saying again that there’s more to device targeting, and it goes deeper into the realm of code reuse. Someone else will probably comment to say that there have been countless application compatibility issues with Windows over the years - and there have - but there have always been numerous ways to mitigate the problems. The myriad ways to mitigate IE6 compatibility issues are a testament to that.
So how should one think about application development in a consumerisation sense? For me it’s simple: do as little development as you need to help your users. By that I mean it would seem wasteful to build different applications for each target device. Always look for at least one point of commonality: same language, same framework, same delivery vehicle.
Your delivery vehicle is another interesting consideration. It’s worth remembering that self-service application deployment (aka an application market place) is a normal thing now - everyone is used to using them on mobile devices. I've already written about the importance of self service in consumerisation, though.
So when thinking about building a consumerisation strategy you’ll want to consider:
One of the best things about cloud services is that every few months a new version comes out and users can instantly use all its new features.
One of the worst things about cloud services is that every few months a new version comes out and users are able to instantly use it and all its new features.
I repeat it because with each release of a cloud service the constraints and even the pricing change, so while an offering might not be attractive, or even possible, initially, a year later it might be not only viable but attractive, too. A case in point is SQL Azure: when it was initially launched it could only handle 10GB databases; now that's 50GB.
I thought some notes and queries around the state of SQL Azure might be in order, and I’ll add update posts to it as the Azure landscape (cloudscape?) changes.
One common thing across any type of product is that there are betas and Community Technology Previews (CTPs), though in the cloud these are released more frequently. In SQL Azure there are a couple of CTPs to note:
Reporting Services. This is like Reporting Services in on-premises SQL Server, except that it’s limited to reporting against SQL Azure, i.e. you can’t change the data source. Just as with a local copy of Reporting Services, you can use either BI Development Studio or Report Builder 3 to design reports; you just deploy them to Azure instead of to your local copy of Reporting Services.
SQL Azure Data Sync Service. This allows you to replicate data between different SQL Azure databases, or between SQL Azure and a local copy of SQL Server. A typical use case might be the synching of local reference data (a list of products, for example) up to SQL Azure as part of an e-commerce application.
NoSQL and Azure
Just because SQL Azure is part of Azure does not mean you have to use it to store your data. There’s also Windows Azure Storage, where you can store blobs and tables and use them in an Azure application in much the same way as you would a NoSQL store. At the moment Azure Storage is also much cheaper than SQL Azure per MB, so my advice would be to develop a hybrid application where any data you don’t need particularly detailed queries over (blobs, XML, etc.) goes in Azure Storage and is linked to a corresponding record in SQL Azure.
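The hybrid pattern is easier to see in code than in prose. Below is a self-contained sketch of the idea using sqlite3 and a plain dictionary as local stand-ins for SQL Azure and Azure Storage (the table, keys and data are invented for the example): bulky payloads go into cheap blob storage, and the relational database keeps only a queryable record holding a pointer to the blob.

```python
import sqlite3

blob_store = {}                         # stand-in for Azure Storage

def put_blob(key, data):
    blob_store[key] = data

db = sqlite3.connect(":memory:")        # stand-in for SQL Azure
db.execute(
    "CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, blob_key TEXT)"
)

# The big XML description goes to blob storage; only the pointer and
# the queryable fields go in the relational row.
put_blob("product/1/details", "<product><desc>very long xml...</desc></product>")
db.execute("INSERT INTO products VALUES (1, 'Widget', 'product/1/details')")

# Relational queries stay small and cheap; the blob is fetched on demand.
name, key = db.execute(
    "SELECT name, blob_key FROM products WHERE id = 1"
).fetchone()
details = blob_store[key]
```

The relational side answers "which products?" quickly; the blob side holds the data you'd never filter or join on anyway.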
Databases bigger than 50Gb
To break the 50GB limit you’ll need to shard the database, and even then you cannot write a cross-database query in Azure T-SQL, as it doesn’t understand three-part names (myserver.mydatabase.mytable). The only way to union results is via a local copy of SQL Server, which could return a large number of rows and incur significant cost on your Azure account for the data transfers involved. This also means that each of the databases has to have its own lookup data within it. All this might sound very negative, but there are large organisations for whom this is still an attractive proposition, e.g. retailers with a database per store, or an e-commerce application split across regions.
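The client-side union described above looks roughly like this. It's a sketch using in-memory sqlite3 databases as stand-ins for the shards (the table and data are invented): since the server can't run a cross-database query, the client fans the same query out to every shard and combines the results locally.

```python
import sqlite3

def make_shard(rows):
    """Create one shard - e.g. one database per store."""
    shard = sqlite3.connect(":memory:")
    shard.execute("CREATE TABLE sales (store TEXT, amount INTEGER)")
    shard.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    return shard

shards = [
    make_shard([("London", 100), ("London", 50)]),
    make_shard([("Leeds", 75)]),
]

# Fan the identical query out to every shard, then union client-side.
results = []
for shard in shards:
    results.extend(
        shard.execute(
            "SELECT store, SUM(amount) FROM sales GROUP BY store"
        ).fetchall()
    )

total = sum(amount for _, amount in results)
```

Notice the cost implication the text warns about: every shard's rows travel back to the client before the final aggregation, which is what drives up data-transfer charges.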
Design to fail
With an armour-plated, super-fault-tolerant datacentre reliably connected to client applications, developers can afford to be lazy. With any cloud service, however, not just SQL Azure, you no longer have this luxury. The connection could easily time out or take longer to establish over the internet, so this needs to be factored into the design, even if it’s an Azure-hosted application you’re writing.
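In practice, "design to fail" usually means wrapping every call that crosses the network in retry logic. Here's a minimal retry-with-backoff sketch (the function names are illustrative, not part of any SQL Azure client library): assume any remote call can fail transiently, and retry a bounded number of times before giving up.

```python
import time

def with_retries(operation, attempts=3, base_delay=0.01):
    """Run operation, retrying transient failures with exponential backoff."""
    last_error = None
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError as err:            # treated as transient
            last_error = err
            time.sleep(base_delay * (2 ** attempt))  # back off: 1x, 2x, 4x...
    raise last_error                               # out of attempts: surface it

calls = {"n": 0}

def flaky_query():
    """Stand-in for a query over the internet: fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("timeout")
    return "42 rows"

result = with_retries(flaky_query)
```

The key design point is bounding the retries and backing off between them, so a genuinely broken connection fails fast-ish rather than hammering the service.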
To help get you started there are a number of useful tools for SQL Azure on Codeplex. This is another part of SQL Azure that is constantly evolving as the capabilities of SQL Azure change. You’ll currently find:
An interesting use case for these tools is MVP Andrew Couch migrating his customers’ Access applications to SQL Azure. SQL Azure is a lot more powerful than Access, though, most significantly around scalability. You can find some good examples of what’s possible in these case studies from start-up JustProud and my old company IMGroup, who use it to develop solutions for their customers.
If this sounds interesting, you can sign up for a free trial with your Live ID and a credit card (which is required even if you don’t incur any charges).
Find more SQL Azure info over on TechNet.
No more rummaging around for a pen and paper when you need to scribble something down; no more loose Post-its sticking to everything in your bag (or is that just me?). Grab the OneNote Mobile iPhone app now – it’s free for a limited time.
Get the OneNote iPhone app
Grab OneNote for your Windows Phone 7
August’s TechNet Magazine is fresh off the press with the usual great line-up of articles from expert authors. View the full magazine over on TechNet, or click through to this month’s top stories below.
IT Career Development: Bulletproof Your IT Career
IT Career Development: Develop Your Brand
Internet Explorer 9: Accelerate Enterprise Application Compatibility
Windows Phone 7: Get Your Windows Phone 7 in Sync
Microsoft Exchange Server 2010: Practical Exchange Server Management
Business Insights Webcast: The Total Economic Impact of Internet Explorer 9 (Level 100)
In this webcast on 16 August at 5pm, Bob Cormier of Forrester Consulting presents findings from the commissioned total economic impact (TEI) study of Microsoft Internet Explorer 9.
Forrester interviewed six Microsoft Technology Adoption Program (TAP) customers to discuss each organisation's experience of upgrading from Internet Explorer 8 to Internet Explorer 9. Forrester then created a TEI case study, in which it describes the costs and benefits for a composite organisation. Forrester concluded that the composite organisation would achieve a risk-adjusted net saving of $3.3 million over three years as a result of upgrading to Internet Explorer 9.
Bob Cormier, Vice President, Principal Consultant, Forrester Consulting
Bob Cormier is a vice president and principal consultant in Forrester's Total Economic Impact (TEI) service and has been with Forrester for nine years. He is a leading expert on deriving business value from technology investments and specialises in advising clients on the TEI framework: services that help organisations understand the overall financial value of IT strategies and investments.
Hye Jun, Senior Marketing Manager, Microsoft
Anurag Pandit, Product Manager, Internet Explorer 9, Microsoft
Tiffany Ashton, Digital Marketing Manager, Bridge Partners, LLC