We are offering three bonus months with every new TechNet Professional subscription, so you can have 15 months for the price of 12 to explore the best that Microsoft has to offer and build something you can be proud of. All you need is the promo code TN3M11 – so get TechNet today! Terms and conditions apply – see below for details.
The TechNet Professional subscription benefits include:
· Full-version software of 70+ Microsoft products
· Priority support on TechNet managed forums with technical answers within 2 business days
· 2 Professional Support Calls to help resolve critical technical issues
· Microsoft E-Learning to keep your skills up-to-date and prepare for certifications
· As well as additional support and productivity resources!
You can find out more about what the Professional subscription includes here.
Terms and Conditions
For an additional three (3) months of service on a new twelve (12) month subscription to Microsoft TechNet Subscription Professional, for a total subscription term of fifteen (15) months, from now until June 30, 2011 (the “Offer”):
· The Offer is for an additional three (3) months of service on a new twelve (12) month subscription to Microsoft TechNet Professional (the “Subscription”) purchased prior to June 30, 2011 for £214; however, actual Subscription price may vary by country, exchange rate and local taxes.
· Each individual acquiring a Subscription in connection with the Offer (the “Subscriber”) is responsible for all taxes related to the Subscription.
· Due to government gift and ethics laws, government employees (including military and employees of public education institutions) are not eligible to participate in the Offer.
· The Offer is valid only for new TechNet Professional subscriptions and may not be combined with any other promotions, discounts or offers.
· The Offer does not apply to TechNet Professional w/media, TechNet Standard or renewals.
· The Offer is limited to one per individual Subscriber and may not be used on volume license orders.
· Each Subscription and related benefits may only be used by one individual Subscriber.
· All software delivered in connection with the Subscription and Offer is for evaluation purposes and may not be transferred away from the original Subscriber in any way.
· Subscriptions and/or the software under the Offer that have been transferred, shared, or sold may be subject to immediate cancellation.
· To receive the Offer, the Subscriber must place the Subscription order on or before June 30, 2011, via the Microsoft web site above or via phone through the customer service center in his/her region, and provide the promo code TN3M11.
For details on your use rights for evaluation software, evaluation and testing usage scenarios and a list of product titles included in each subscription visit: http://www.microsoft.com/technetsubscriptions.
To find out more about TechNet Subscriptions, visit www.microsoft.com/technetsubscriptions or contact a preferred Microsoft reseller.
Simon’s TechNet blog
A few days ago the public beta of Office 365 was announced – one of the most important software releases from Microsoft in decades. For those unfamiliar with Office 365, let’s take a moment to understand the offering before we dive into what’s available in there for IT professionals. If you know what it is, skip the next paragraph.
Office 365 is summed up by the folks in marketing with the strapline, “Everything Microsoft knows about productivity”, which I think is very accurate. It is NOT a new version of Microsoft Office. It is Microsoft Exchange, Microsoft SharePoint and Microsoft Lync running in the public cloud, meaning that we at Microsoft manage the servers to a 99.9% uptime service level from our own datacentres around the globe. There’s more though: it’s also Microsoft Office running in the cloud and, in some editions, Microsoft Office 2010 Professional Plus to install on your computers. The service is provided on a monthly subscription basis. For more information about the pricing and plans take a look at the Office 365 site. The pricing is why this is an uber-important release for Microsoft. It moves Microsoft Office to a pay-monthly option which, frankly, is huge…the whole capex to opex thing.
What’s in it for the IT pro though, what makes it tick and click for you? Well, there’s a hefty helping of PowerShell to keep you happy, there’s new technology to get to grips with in the form of directory synchronisation, and the excitement of federating your Active Directory safely over the internet. The icing on the cake comes in the form of SharePoint goodness which, although it’s mainly for developers, is probably a place all you SharePoint heads will love digging about in. So let’s take a look.
The first thing you’ll want to do is grab hold of some exciting Word docs on the services within Office 365, so head to the service descriptions download (as it happens, I’ll be taking these with me on holiday to read).
Next you’ll want to break open a PowerShell window and try some commands. There are lots (about 260 or so), but some of the goodness hidden within includes configuring Exchange Online mailbox sizes and limits, specifying the email message format used for external recipients, and requesting a Directory Service quota increase. There are tons more besides, but here’s a handy search of the KB to help you in your discovery.
Next we need to get you thinking about some of the key infrastructure stuff you’ll need to do, so how about learning all about ADFS as a primer? In fact you might prefer to learn it from Planky, our friendly UK Evangelist who not only understands ADFS but can also explain PKI to anyone in a way you’ll understand. To be honest you’ll probably benefit from a total understanding of how Exchange Online, part of Office 365, can co-exist with Exchange 2003. The 2007 and 2010 docs are coming but we are still in beta!
If you’re a SharePoint head you’re probably going to want to wrap your mind around a training kit and luckily MSDN already has materials materialising. There’s a training kit available that’s a thoroughly good read for anyone planning on doing anything deep with SharePoint.
Finally (and good for us all) there are the skills you’ll need to roll out the full version of Office 365, which come courtesy of a new TechCenter on TechNet. The skills will be very familiar to anyone who’s rolled out Office 2010 – it is the same product you know.
So there you have it, a whole bunch of handy resources to get you started on a journey with the beta. All you need to do now is visit the website and sign up for the Office 365 Beta to start playing! If you want to know more, though, we are doing some in-depth stuff at Tech.Days 2011, so register to attend the Public Cloud for IT Professionals event @ TechDays, 25th May, London. Also check out the Office 365 Community, the Office 365 Blog and the Office 365 Wikis.
Exciting news from the Tech.Ed team!
We can now announce that Tech·Ed Europe is shifting back to the summer timeframe, resuming in 2012. Mark your calendars: the next Tech·Ed Europe will be held the week of June 25, 2012 at the Amsterdam RAI Exhibition and Convention Centre in the historical and picturesque city of Amsterdam, the Netherlands.
To ensure you have the opportunity to experience Tech·Ed in 2011, you are invited to attend Tech·Ed North America in Atlanta, Georgia from May 16-19, 2011. For the European community, a discount of $200 USD off the conference price of $2,195 USD is available until Friday 15th April 2011. Please register on the Tech·Ed North America site using the promotion code ATGNEURO to receive your savings. Anyone with a European residence who has already registered at the standard rate of $2,195 USD is eligible to receive a refund of $200 USD by emailing the registration team by April 15th and including the special discount code ATGNEURO.
The Tech·Ed Europe 2010 keynote and session recordings are available for online viewing – no registration is required and all content is free to watch, download and share.
Be on the lookout for the Tech·Ed Europe newsletter to get all of the latest and greatest announcements on the conference.
It’s too good to miss – here’s a reminder:
This edition we’re covering migrating from Windows XP to Windows 7. The majority of companies not yet on Windows 7 are migrating from Windows XP, and the deployment process and support available to do this is a hot topic. Support for Windows XP SP2 has now ended (and extended support for Windows XP SP3 ends in 2014), so keeping ahead with your IT infrastructure will mean considering an update of your operating system.
NetBenefit had all its desktops and PCs running Windows, and due to a significant amount of hardware acquisition over the past five years, the computers were running different images of Windows XP. That fragmentation became increasingly difficult for the IT department to manage, and the department wanted to upgrade and standardise its operating system without investing in new client hardware. The company decided to simultaneously upgrade all 300 existing workstations to the Windows 7 Enterprise operating system. Read the full case study here to see all the benefits this gave them.
Atkins - As Windows XP began to age, the company recognised that it could provide a higher level of information services for its IT, engineering, and business users by updating its operating system. “We anticipate benefits over the lifespan of the operating system. Windows 7 is going to enable us to do so much more for our business in the future.” Tom Basham, Manager, IT Architecture and Planning Team. Read full case study here.
The Windows XP to Windows 7 Migration Guide takes you through all the migrating decisions and the tools and resources available from Microsoft to help you each step along the way.
If you would rather watch than read, these Windows 7 walkthrough videos will help. They cover topics including how to use the User State Migration Tool (USMT) to migrate user files and settings from Windows XP to Windows 7 using a default installation.
With all the migration tools, guides, and project plans available, are you wondering how to start your Windows XP migration? The Microsoft Deployment Toolkit 2010 provides a single, comprehensive guide to efficiently managing the Windows 7 deployment process. Learn more in this video.
You can view all the other topics we have covered with Windows 7 Business Insights here, including IT value, deployment, security and application compatibility.
If you fancy a trip to Utrecht, this SharePoint 2010 Governance and Information Architecture Master Class, presented by Antony Clay, Chief Strategy Officer at 21apps, offers two full days of real world examples, knowledge and techniques.
· Purchase tickets for the SharePoint 2010 Governance and Information Architecture Master Class (Utrecht, Netherlands). Enter “Microsoft” for a 15% discount.
· Register your interest for the 3rd SharePoint 2010 Governance and Information Architecture Master Class (UK)
Is this course for me?
Are you a Business Analyst or SharePoint Architect who needs to deal with the hard question of “Will users use the system”?
Have you tried SharePoint and failed, and therefore experienced the pain and need it named?
Are you a strategic management consultant, or anyone looking for something where the published material so far hasn’t quite done it for you?
Are you an IT Manager who genuinely wants to hear real techniques behind abstract concepts like “user engagement” and “buy-in”?
Most people understand that deploying SharePoint involves much more than getting it installed. Despite this, current SharePoint governance documentation focuses heavily on service delivery aspects. Yet even if your system is rock solid, stable, well documented and governed through good process, there is absolutely no guarantee of success. Similarly, if Information Architecture for SharePoint were as easy as putting together lists, libraries and metadata the right way, then why doesn’t Microsoft publish the obvious best practices?
In fact, the secret to a successful SharePoint project is an area that the governance documentation barely touches.
This master class pinpoints the critical success factors for SharePoint governance and Information Architecture and rectifies this blind spot. It is based upon content provided by Paul Culmsee (Seven Sigma), which takes an ironic and subversive look at how SharePoint governance really works within organisations, while presenting a model and the tools necessary to get it right.
Drawing on inspiration from many diverse sources, disciplines and case studies, Paul Culmsee has distilled the “what” and “how” of governance down to a simple and accessible, yet rigorous and comprehensive, set of tools and methods that organisations large and small can utilise to achieve the level of commitment required to see SharePoint succeed.
What can I expect?
Master Class outcomes:
Course Structure: The course is split into 7 modules, run across the two days.
Module 1: SharePoint Governance f-Laws 1-17:
Module 1 is all about setting context by clearing up some misconceptions about the often muddy topic of SharePoint governance. This module sheds some light on these less visible SharePoint governance factors in the form of Governance f-Laws, which will also help to provide the context for the rest of the course.
Module 2: The Shared Understanding Toolkit – part 1:
Module 2 pinpoints the SharePoint governance blind spot and introduces the Seven Sigma Shared Understanding Toolkit to counter it. The toolkit is a suite of tools, patterns and practices that can be used to improve SharePoint outcomes. This module builds upon the f-laws of module 1 and specifically examines the “what” and “why” questions of SharePoint governance. Areas covered include how to identify particular types of problems, how to align the diverse goals of stakeholders, how to leverage problem-structuring methods, and how to construct a solid business case.
Module 3: The Shared Understanding Toolkit – part 2:
Module 3 continues the Seven Sigma Shared Understanding Toolkit and builds on the foundation of “what” and “why” by examining the “who” and “how”. Areas covered include aligning stakeholder expectations, priorities and focus areas, and building this alignment into a governance structure and written governance plan that actually makes sense and that people will read. We round off by examining user engagement, stakeholder communication and training strategy.
Module 4: Information Architecture trends, lessons learned and key SharePoint challenges
Module 4 examines the hidden costs of poor information management practices, as well as some of the trends that are impacting Information Architecture and the strategic direction of Microsoft as it develops the SharePoint road map. We will also examine the results of what other organisations have attempted and their lessons learned. We then distil those lessons into some of the fundamental tenets of modern information architecture and finish off by examining the key SharePoint challenges from a technical, strategic and organisational viewpoint.
Module 5: Information organisation and facets of collaboration
Module 5 dives deeper into the core Information Architecture topics of information structure and organisation. We explore the various facets of enterprise collaboration and identify common Information Architecture mistakes and the strategies to avoid making them.
Module 6: Information Seeking, Search and metadata.
Module 6 examines the factors that affect how users seek information and how these manifest in patterns of use. Building upon the facets of collaboration from module 5, we examine several strategies for improving SharePoint search and navigation. We then turn our attention to taxonomy and metadata, and what SharePoint 2010 has to offer in terms of managed metadata.
Module 7: Shared understanding and visual representation – documenting your Information Architecture
Module 7 returns to the theme of governance in the sense of communicating your information architecture through visual or written form. To achieve shared understanding among participants, we need to document our designs in various forms for various audiences.
Putting it all together: From vision to execution
As a take home, we will also supply attendees with a USB stick containing a sample performance framework, a governance plan, a SharePoint ROI calculator (spreadsheet) and sample mind maps of Information Architecture. These tools are the result of years of continual development and refinement "out in the field" by Paul Culmsee and until now have never been released to the public.
Previous Master Class Feedback:
· "This course has been the most insightful two days of my SharePoint career"
· "Easily one of the best courses I've been to and has left me wanting more!"
· "Had a great couple of days at #SPIAUK"
"The content covered was about the things technically focussed peeps miss.."
Ok, so not quite here – you need to pop over to The Official SBS Blog. When you get there you’ll find a wealth of invaluable SBS content, from technical training and learning to click-through demos, guides and FAQs.
Part 2 now available here.
Christopher Pond, Microsoft UK
You’ll find some handy resources at the end of the article.
Project 2010 – Is this the release we’ve all been waiting for?
There are numerous differences in functionality between the 2007 versions of the Project family and the 2010 versions, but for simplicity I have focused on a few that I believe are the most significant and will provide users with the most benefit.
Project Professional 2010
Microsoft Project’s biggest competitor has always been Excel, so in order to transform the user experience Microsoft has taken Excel’s ease of use for data entry and manipulation and embedded it in the table views in Project. This includes auto-complete, text wrapping, filtering and sorting, as well as automatic column creation based on the information you enter. Taking the Excel interface further, users can now select a manual scheduling mode that ignores traditional critical path analysis and allows them to input data simply as and when they have it. This manual scheduling mode also supports top-down planning as well as the bottom-up approach that has always been available.
Completing the enhancement to the user interface, Microsoft has introduced the Office ribbon so that this is now uniform across all the Office products, and a wizard-like Task Inspector that makes it easier to look at different sections of projects without being overwhelmed.
On the functionality front, although there are many changes, the two biggest changes are the timeline view and the new team planner.
The timeline view means users can select key elements from their plan, simply right click and add them to the timeline. Once on the timeline they can be formatted and the whole timeline itself copied and pasted into other Office applications such as Word and PowerPoint for reporting; something that for many will prove to be a real timesaver.
Team planner really transforms the whole planning process and is ideally suited to those organisations that are more resource- than task-centric, by providing a graphical view, by resource, of all allocated and unallocated work. If you are working with Project Server, this view spans all projects to which your selected resources are allocated. Once in the team planner, users can immediately identify overloads and simply drag and drop work between resources in order to level availability effectively.
Project Server and Project Web Access 2010
Behind the scenes Microsoft has also taken major steps to improve the administration and access control functionality of Project Server in order to simplify and reduce the existing overhead.
The ribbon has been introduced across the board on the web. One of the most long-awaited changes is the introduction of true web-based planning in Project Server. The new web functionality includes much of Microsoft Project’s Gantt-charting capability and all of the new Excel-like manual scheduling. For the first time users can now plan their projects completely in the web browser. It’s only when more complex requirements arise, such as the ability to schedule multiple resources on a task, that a user will need to work with Project Professional instead.
Microsoft has spent time and effort listening to the user community and has implemented a truly usable and functionally rich progress and timesheet solution capable of enterprise-level time recording and reporting, including audit trails, comprehensive workflow authorisation and a new delegation model. With the ability to easily plan all work, not just project-based work, Project Server is now a complete work and time management solution.
The most important functional change, and indeed licence change, is how Microsoft has combined two tools - Project Server and Portfolio Server - into one: Project Server. Project Server now provides a truly integrated Portfolio, Project and Work management solution.
Project Server allows an organisation to define and prioritise its strategic objectives using a built-in pair-wise comparison process. Once these are agreed, all projects can be mapped against the objectives and prioritised accordingly. With the prioritised list in the system, Project Server then provides an optimisation module that can work out, based on the available budget and the alignment of each project, which projects will deliver the most benefit. The beauty of this solution is that it is easy to use and takes the subjectivity out of project selection. In an ideal world only projects that provide a high level of alignment would proceed, and Project Server provides the objective analysis to support this.
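To make the idea concrete, here is a rough sketch of pair-wise comparison prioritisation followed by budget-constrained selection. The data, names and scoring scheme are all invented for illustration – this is not Project Server's actual algorithm, just a simplified AHP-style approximation of the concept:

```python
from math import prod

# Pairwise comparison matrix for three hypothetical strategic objectives:
# entry [i][j] > 1 means objective i is judged more important than objective j.
comparisons = [
    [1.0, 3.0, 5.0],   # "Grow revenue" vs the others
    [1/3, 1.0, 2.0],   # "Cut costs"
    [1/5, 1/2, 1.0],   # "Improve compliance"
]

def objective_weights(matrix):
    """Derive objective weights from the row geometric means (simplified AHP)."""
    gms = [prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

def select_projects(projects, weights, budget):
    """Greedily pick projects by alignment-per-cost until the budget is spent."""
    def alignment(p):
        return sum(w * s for w, s in zip(weights, p["scores"]))
    ranked = sorted(projects, key=lambda p: alignment(p) / p["cost"], reverse=True)
    chosen, spent = [], 0
    for p in ranked:
        if spent + p["cost"] <= budget:
            chosen.append(p["name"])
            spent += p["cost"]
    return chosen

# Hypothetical project list: cost plus an alignment score per objective.
projects = [
    {"name": "CRM upgrade",      "cost": 400, "scores": [0.9, 0.2, 0.1]},
    {"name": "Data centre move", "cost": 700, "scores": [0.1, 0.8, 0.3]},
    {"name": "Audit tooling",    "cost": 200, "scores": [0.0, 0.1, 0.9]},
]

w = objective_weights(comparisons)
print(select_projects(projects, w, budget=700))
```

The key property, which the real optimisation module shares, is that the subjective judgements live only in the comparison matrix; once that is agreed, project selection follows mechanically.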
Project Server now has a complete capacity management module built in. The capacity planning module allows the selection of any number of projects and then the viewing of resource demand for those selected projects against availability. The capacity planner also allows the modelling of hiring additional resource skills and its impact on the overall portfolio.
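The essence of that capacity view is simply summed demand versus availability per skill. A toy model (invented numbers and field names, not Project Server's data model) might look like:

```python
from collections import Counter

def capacity_gap(projects, availability):
    """Sum resource demand across selected projects and compare with
    availability per skill; a positive result means a shortfall."""
    demand = Counter()
    for p in projects:
        demand.update(p["demand"])           # e.g. {"developer": 3}
    return {skill: demand[skill] - avail for skill, avail in availability.items()}

# Hypothetical selected projects and resource pool.
projects = [
    {"name": "Portal",  "demand": {"developer": 3, "tester": 1}},
    {"name": "Upgrade", "demand": {"developer": 2, "tester": 2}},
]
availability = {"developer": 4, "tester": 4}

print(capacity_gap(projects, availability))
```

Modelling a hire is then just a matter of bumping a number in `availability` and re-running the comparison, which is essentially what the capacity planner's what-if view does interactively.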
Also new for the 2010 release is a completely integrated workflow solution that makes use of SharePoint’s workflow engine meaning there are almost no limits to how and where you can introduce workflow to the project lifecycle.
For reporting, Project Server 2010 has set the benchmark by making full use of the new SharePoint capabilities. Organisations can now easily create reports using Excel and then render them through Excel services. Additionally for those upgrading from a Project Server 2007 installation it’s encouraging to know that you can run Project Server 2010 in a backwards compatibility mode. This allows users to still operate Project Professional 2007 until organisations are able to fully roll out the new 2010 version.
Project 2010 is much more than just Project 2007 with a few extra bells and whistles; it’s been developed into a far more user friendly and easily deployed solution for teams and businesses of all sizes. The improvements in functionality and flexibility mean it doesn’t just apply to one or two specific areas any more, it is a solution for the whole business and helps businesses capture, analyse and report information at the touch of a button.
Download the Microsoft Project Professional 2010 evaluation
Download the Microsoft Project Server 2010 evaluation
Visit the Project team blog
You’ll find a host of other resources in the TechNet library
Huge thanks to Jaap Wesselius for this great Dynamic Memory write up that he kindly allowed me to pinch from its original home on the Simple-Talk online technical journal and community hub. More about Jaap below.
Jaap Wesselius is an independent consultant from The Netherlands focusing on (Microsoft) Business Productivity solutions. Prior to becoming an independent consultant, Jaap worked for 8 years for Microsoft Services in The Netherlands, specialising in Exchange Server. Jaap has a Bsc in Applied Physics & Computer Science, is an MCSE, MCITP and MCT, and has consistently been awarded the Microsoft MVP Award (Exchange Server, fifth year now) for his contributions to the Dutch Exchange community. For his Dutch blog posts, you can visit www.exchangelabs.nl. Besides Exchange Server, Jaap is also very active in virtualisation and is a founder of the Dutch Hyper-V Community. If you'd like to get in touch, you can reach Jaap via email at Simple-Talk@jaapwesselius.nl, or on Twitter as @jaapwess.
Dynamic Memory in Windows 2008 R2 SP1
Windows Server 2008 R2 SP1 has introduced what Microsoft calls ‘Dynamic Memory’. The memory assigned to guest virtual machines can now vary according to demand, based on the workload of the applications running inside them. Fine, but it requires careful setting up for certain applications. Jaap explains.
In Windows Server 2008 and Windows Server 2008 R2, assigning memory to virtual machines is a static process. When you assign 4 GB to a virtual machine it uses those 4 GB, no more, no less. If not all of this memory is used, then it’s bad luck: once assigned, it cannot be used for other purposes. So, when you have a Hyper-V host with 32 GB of memory, you can create 15 virtual machines each configured with 2 GB of memory; the last 2 GB will be used by the parent partition itself.
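The static-allocation arithmetic above can be written out as a quick check (illustrative only – real hosts reserve memory for more than just the parent partition):

```python
# How many statically-sized VMs fit on a host, per the example above.
host_gb = 32            # total physical memory in the Hyper-V host
parent_reserve_gb = 2   # memory kept for the parent partition
vm_gb = 2               # static memory assigned to each virtual machine

max_vms = (host_gb - parent_reserve_gb) // vm_gb
print(max_vms)  # 15 virtual machines of 2 GB each
```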
New in Windows Server 2008 R2 SP1 is a feature called “Dynamic Memory”. This feature can assign more memory to a virtual machine (while running) when the virtual machine needs more memory. It can also remove memory from the virtual machine when this memory can be used for other virtual machines.
Some people think VMware’s and Microsoft’s approaches to dynamic memory are exactly the same. In this article I explain both the similarities and the differences.
In earlier releases of Hyper-V memory assignment is static. Once memory is assigned to a virtual machine it will stay there as long as the virtual machine is running. If you need to reclaim memory the virtual machine has to be shut down and the configuration needs to be changed, either via the Hyper-V manager or via System Center Virtual Machine Manager (VMM) 2008 R2. Alternatively, if a virtual machine needs more memory it has to be shut down before memory can be assigned using the Hyper-V manager or via VMM (when physically available of course).
The challenge here is designing the virtual machines with the proper amount of memory. There are plenty of guidelines, both from third-party software vendors and from independent consultants, but all parties stay on the safe side, recommending too much memory rather than too little, because the latter could cause performance issues.
Windows Server 2008 R2 SP1 solves this issue by introducing dynamic memory. With dynamic memory a minimum amount of memory is assigned to a VM, but when the VM needs more, the parent partition can assign additional memory while the VM is running. Conversely, if the VM no longer needs that amount of memory, the parent partition can reclaim a portion of it and use it for other VMs. So it really is a dynamic solution.
Note: when creating a VM, memory is set to ‘static’ by default, so you have to turn on dynamic memory manually; you cannot do this while the VM is running.
Dynamic memory is available on Hyper-V hosts when the host is (of course) running Windows Server 2008 R2 SP1 or Hyper-V Server 2008 R2 SP1, and on the following operating systems as a guest:
In addition to this, the virtual machines need to be configured with the integration components of the parent partition, i.e. Windows Server 2008 R2 SP1. Unfortunately this is not true for all versions of Windows running in the VM: when running Windows Server 2008 Standard Edition in a VM, the system needs a full installation of Service Pack 1.
Dynamic memory can be configured using the Hyper-V Management MMC snap-in. When the memory option is selected, memory management can be configured. By default memory management is set to static, just like previous versions of Hyper-V. When the virtual machine is not running, the setting can be changed from static to dynamic. The following settings are available:
Figure 1. Setting dynamic memory properties on a Virtual Machine
Memory weight is another setting used by Hyper-V to determine which virtual machines are more important than others when it comes to reassigning memory between virtual machines.
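As a toy model of what a weight setting achieves (this is an assumption-laden sketch of the concept, not Hyper-V's actual balancer logic): when demand exceeds the free memory on the host, the shortfall can be shared out in proportion to each VM's weight.

```python
def balance(vms, free_mb):
    """Grant extra memory to VMs. Each VM dict has a 'name', a 'demand'
    (MB wanted above startup RAM) and a 'weight' (higher = more important).
    If total demand fits in free_mb, everyone gets what they asked for;
    otherwise free memory is shared in proportion to weight."""
    total_demand = sum(v["demand"] for v in vms)
    if total_demand <= free_mb:
        return {v["name"]: v["demand"] for v in vms}
    total_weight = sum(v["weight"] for v in vms)
    grants = {}
    for v in vms:
        share = free_mb * v["weight"] / total_weight
        grants[v["name"]] = min(v["demand"], int(share))
    return grants

# Hypothetical VMs: the heavyweight gets the lion's share under pressure.
vms = [
    {"name": "exchange",   "demand": 2048, "weight": 80},
    {"name": "fileserver", "demand": 1024, "weight": 20},
]
print(balance(vms, free_mb=1000))
```

The point of the sketch is only this: weight does nothing while memory is plentiful, and decides who wins when it is not.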
After the virtual machine has been running for some time, Hyper-V will assign more memory to it when it needs more. This is fully dynamic and no intervention is required. When checking the Hyper-V Management snap-in while the virtual machine is running, the amount of assigned memory can be monitored:
Figure 2. Monitoring dynamic memory using the Hyper-V Management MMC snap-in.
In this example the Hyper-V.nu virtual machine has 2048 MB of memory assigned as “startup RAM” but only 1085 MB is actually used. It should be possible to lower the amount of “startup RAM” on this particular VM.
The amount of memory requested by a virtual machine depends on its usage profile. We all know that Exchange Server is always in urgent need of memory, and so is SQL Server, but running a web server or a file server is a different story. I’ll get back to the (supported) scenarios later in this article.
Dynamic memory is an interaction between the parent partition and the virtual machine. This is achieved through the use of the Integration Components. Integration Components in general consist of a server part, the Virtual Service Provider (VSP) and a client part, the Virtual Service Client (VSC). The latter is the one running in the virtual machine. The VSP and the VSC are connected via the VMBus structure. The VMBus is an in-memory bus structure, allowing the parent partition and virtual machine to communicate at very high speeds.
With SP1, new components are added to the VSP and the VSC to allow the use of dynamic memory. The VSC reports the memory requirements of the virtual machine to the VSP in the parent partition. The Memory Balancer coordinates all requests from all virtual machines and allocates memory to, or reclaims memory from the individual virtual machines.
When memory is added, the VSC reports this to the kernel memory manager running inside the virtual machine, which can then use the added memory. To reclaim memory from the virtual machine, a ballooning mechanism is used inside the virtual machine: the VSC interacts with the kernel memory manager inside the virtual machine, and the virtual machine can return unused memory pages to the VSC. The VSC in turn returns these memory pages to the VSP running inside the parent partition, allowing them to be used by other virtual machines. When these pages are no longer needed by the other virtual machines, they are returned to the original virtual machine.
When the parent partition needs to reclaim memory, the ballooning mechanism starts to reclaim these memory pages. When there are no free pages available inside the virtual machine, it starts paging to disk in order to return memory to the parent partition. Basically this continues until the startup memory limit is reached. Setting this limit too low for virtual machines will result in paging and therefore in a performance decrease.
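The reclaim behaviour described above can be captured in a toy model (assumptions for illustration only, not Hyper-V internals): free pages are surrendered first, any remainder forces the guest to page to disk, and nothing is reclaimed below startup RAM.

```python
def reclaim(vm, request_mb):
    """Reclaim up to request_mb from a VM dict holding 'assigned', 'free'
    and 'startup' MB. Returns (MB granted, MB satisfied by guest paging)."""
    reclaimable = vm["assigned"] - vm["startup"]   # never go below startup RAM
    granted = min(request_mb, reclaimable)
    from_free = min(granted, vm["free"])           # free pages go first
    paged_out = granted - from_free                # the rest is paged to disk
    vm["assigned"] -= granted
    vm["free"] -= from_free
    return granted, paged_out

vm = {"assigned": 4096, "free": 512, "startup": 1024}
print(reclaim(vm, 2048))  # (2048, 1536): only 512 MB was free, the rest paged
```

The `paged_out` figure is exactly the performance cost the article warns about: a too-aggressive reclaim, or a startup limit set too low, turns memory pressure directly into guest disk I/O.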
Please note that it is the virtual machine that starts paging, since the memory manager inside the virtual machine knows which memory pages to flush to disk and which to retain. Second-level paging, i.e. where the parent partition starts to flush virtual machine memory to disk, never occurs with Hyper-V, since the parent partition doesn’t know which parts of the virtual machine to flush to disk. This is a clear differentiator from VMware’s implementation of dynamic memory.
Figure 3. Dynamic memory architecture
Dynamic memory is often referred to as “Microsoft’s answer to VMware’s overcommit”, but by now it should be clear that this is not the case. Memory can only be assigned once to virtual machines, and when the limit is reached, additional virtual machines fail to start. Of course it is still possible to set the startup memory limit far too low, but you don’t have to be a guru to see that this approach doesn’t work.
Microsoft also doesn’t use memory page sharing between virtual machines like VMware does. Microsoft avoids this for security reasons, but the added value of page sharing is also very limited when using large memory pages like Hyper-V does. Page sharing works fine as long as ‘normal’ 4 KB memory pages are used, but with large memory pages the efficiency decreases rapidly, since the parent partition has to find identical 2 MB memory pages and calculate their hash values. More information on this can be found on the VMware site: http://communities.vmware.com/message/1262016#1262016
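A small toy experiment makes the large-page argument tangible. The sketch below hashes fixed-size pages and counts duplicates; the sizes are scaled down from the real 4 KB / 2 MB for speed, and the memory layout (near-identical regions differing by one byte) is an assumption chosen to illustrate the point, not measured guest data.

```python
# Toy illustration of why page sharing loses effectiveness with large
# pages: a single differing byte makes an entire large page unshareable,
# while most of the small pages covering the same memory still match.

import hashlib

def shareable_fraction(memory: bytes, page_size: int) -> float:
    """Fraction of pages that duplicate an earlier page (i.e. could be shared)."""
    seen, shared, total = set(), 0, 0
    for off in range(0, len(memory), page_size):
        digest = hashlib.sha256(memory[off:off + page_size]).digest()
        if digest in seen:
            shared += 1
        else:
            seen.add(digest)
        total += 1
    return shared / total

# Scaled-down sizes (real Hyper-V large pages are 2 MB, small pages 4 KB).
SMALL, LARGE = 64, 1024

def make_memory(regions=8):
    """Mostly zeroed memory with one unique byte per LARGE region,
    mimicking pages that are near-identical but not identical."""
    buf = bytearray(regions * LARGE)
    for i in range(regions):
        buf[i * LARGE] = i + 1   # single differing byte per large region
    return bytes(buf)

mem = make_memory()
small_frac = shareable_fraction(mem, SMALL)   # most small pages are duplicates
large_frac = shareable_fraction(mem, LARGE)   # no large page matches another
```

With small pages, well over 90% of the pages in this layout are shareable; with large pages, not a single one is, because each 1 KB region differs in one byte.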
For more information about large memory pages there’s also an interesting article on the AMD website: http://developer.amd.com/documentation/articles/pages/2142006111.aspx
Using dynamic memory with Windows Server 2008 R2 SP1 Hyper-V to run Windows Server virtual machines is fully supported by Microsoft. The catch is that the workloads of the virtual machines need to be taken into account.
For example, running Exchange Server 2010 in a virtual machine is fully supported. But Exchange Server is a memory hog: it allocates (and uses) as much memory as it needs. When running Exchange Server inside a virtual machine with dynamic memory enabled, you’ll quickly see an increase in memory usage. When the parent partition then needs to reclaim memory from that virtual machine, it will start paging straight away, resulting in a rapid performance decrease. Therefore, using Exchange Server 2010 with dynamic memory is not a supported scenario. For more information, please check the “Hardware Virtualization” section on the Exchange 2010 System Requirements page: http://technet.microsoft.com/en-us/library/aa996719.aspx. In figure 2, for example, you’ll see that the Exchange Server is not configured with the dynamic memory option.
Dynamic memory is a new feature of Windows Server 2008 R2 SP1 that permits a more efficient use of memory when running multiple virtual machines. It removes the need to statically assign memory to virtual machines; since administrators tended towards “better safe than sorry”, too much memory was often assigned.
Using dynamic memory gives you more flexibility, allowing you to run more virtual machines on the same hardware than before. Dynamic memory is enabled through the Integration Components (or by installing SP1 inside the virtual machine), which interact with the kernel memory manager inside the virtual machine. This allows the dynamic memory technology to work very efficiently, without nasty surprises. But you have to be careful with the minimum amount of memory assigned to the virtual machine, and you have to check whether the applications running inside the virtual machine are supported with dynamic memory.
Since we’re giving the usual TechNet newsletter a little rest over Easter, I’m telling you about this fortnight’s great TechNet On feature here on the blog instead. This time we’re talking all things Lync, with a heap of technical info covering what you should know and why, as well as deployment. Find out why Lync is great and get started with planning and deployment below.
The concept of unified communications, delivered in Lync Server 2010, enables workers to use the same communications platform, regardless of device or location, while presence and coauthoring capabilities facilitate collaboration.
Planning for and deploying the unified communications platform is made easier with the wealth of documentation and tools. Learn how to get started with Lync Server 2010.