Come learn the proven best practices for selling Office 365, virtually!
The Virtual Drumbeat Sales Day on April 18th provides partners in sales and pre-sales technical roles with best-practice sales training for Office 365. Selling Office 365 requires a new way of selling; come hear about it. In addition to sharing Microsoft best practices, programs, and selling tools, we will also present insights into the cloud services market and the opportunity for partners who invest in growing an Office 365 practice.
You will also have the opportunity to interact and learn from your industry peers and representatives from Microsoft.
Date: April 18th, 2014
Time: 9:00 am – 5:00 pm PST
WHO SHOULD ATTEND:
Sales Professionals
Pre-Sales Technical
There is no charge for this exclusive training; however, a $39 (USD) no-show fee will apply if you register but do not cancel your registration at least fourteen (14) business days before the start of the event.
Space is limited. RSVP today!
Session 01 The Office 365 Enterprise Partner Opportunity
The new Office represents a once-in-a-generation shift in technology and a new era of partner opportunity. Microsoft is at the forefront of the industry's transformation to the cloud, and Office 365 is leading the charge. Learn more about our investments in the new Office and how we have created new partner opportunities across the customer lifecycle.
Session 02 Office 365: What to Sell
Office 365 is Microsoft’s fastest growing business ever to the tune of $1 billion and counting. And, three out of four enterprise customers work with a partner to deploy their Office 365 service. Are you one of these partners? Learn more about the benefits of becoming a recognized Office 365 Cloud Deployment partner and what it takes to be one.
Session 03 How to Sell Office 365
Microsoft's Office 365 is built on a set of cloud principles that shape how we position Office 365 to customers. Become familiar with these principles and learn how to showcase the value of Office 365 cloud services across a breadth of real customer scenarios.
Session 04 Google Compete
The proliferation of devices, broadening workplace demographics and a transformative shift to the cloud are all trends impacting the way we work. Office 365 clearly addresses all of these trends and is backed by a sales process that has helped grow a $1B business. Learn how to sell to customers using the Customer Decision Framework, a sales process that enables partners to make the shift from traditional software selling to successfully sell Office 365 in the cloud.
Session 05 Selling with the Customer Immersion Experience
The Microsoft Customer Immersion Experience (CIE) is a hands-on introduction to Windows 8 and the new Office. For partners, it is an effective sales tool that provides customers with an opportunity to experience these powerful new productivity solutions for themselves. Learn how the CIE simplifies customer conversations and provides business decision-makers with an opportunity to experience the full Office stack to accelerate sales and close revenue.
Session 06 Pilot and Deploy Customers with Office 365 FastTrack
Office 365 FastTrack is Microsoft's new three-step pilot and deployment process, designed so customers experience service value early in the sales cycle, with a smooth path from pilot to full deployment in hours and no "throw away" effort. Learn how to use the Office 365 FastTrack process to get customers up and running quickly and win against the competition.
Session 07 Office 365 Support and Communications
Microsoft is strengthening its partner support and communications strategy to better enable our partners to sell, service and support customers. Learn about new ways to enhance your service offerings and stay connected with the latest developments on Office 365.
Recently we’ve released several short training videos around building Windows Store apps. As a part of this process we’re now taking some of the most popular and offering them in additional languages. Today we’re excited to bring you the first two in the Italian Series.
Windows App Certification Kit
Build a Live Tile for your Windows Store app in less than 10 minutes (Italian)
In a recent post (Migrate from Gmail to Office 365 in 7 steps), I shared the steps to migrate from Gmail to Office 365.
If you are doing a large migration from Gmail to Office 365, you will generally want to use a third-party tool that automates the process. However, if you are migrating a small customer with a few mailboxes, it is quick, easy, and free to do so manually.
As a picture is worth a thousand words (and a movie shows 30 pictures a second), I thought I would share the process in two short videos.
As always… if you are a Partner and need assistance migrating mailboxes, ask our experts!
Partners, did you know that if you have an MSDN subscription you qualify for free Windows Azure credits each month? MAPS Partners also qualify for $100 in monthly credits. Most Gold and Silver partners qualify as well.
To see if you qualify for Internal Use Rights on Azure follow the steps in this video.
You might say, “but we don’t work on Azure, so those don’t help us” or, “we’d like to learn Azure, but where do we start?” There is a common business need you can start addressing that applies to almost everyone.
Just about every Partner I talk with uses virtual machines in some way. It might be for demos, lab testing, development work, training, customer support, or many other scenarios. Most of the folks I talk with also run into challenges with VMs. Do any of these scenarios apply to you?
Why not put those free Azure credits to work and start running some of those VM scenarios on Azure?
To help you get started, the Partner Services team has put together a new offering called (ready for my burst of creativity?) Labs on Azure. This offering is an opportunity for you or your team to spend some time one-on-one with one of our consultants. You'll learn how to get started in Azure, build a VM in minutes, customize and re-use VMs, upload VMs you might be running on-premises today, and how to start automating VM creation through PowerShell.
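As a small taste of the PowerShell automation covered in the labs, creating a VM with the (classic) Azure service management module of that era could be sketched as below. This is a sketch only: the service name, VM name, image name, and credentials are placeholders, not real values, and it assumes the Azure PowerShell module is installed and your subscription is already connected (for example, via Add-AzureAccount).

```powershell
# Sketch only: all names and credentials below are placeholders.
# Assumes the classic Azure PowerShell module and a connected subscription.

# List the available platform images so you can pick one
Get-AzureVMImage | Select-Object ImageName, Label

# Create a small Windows VM in a single call
New-AzureQuickVM -Windows `
    -ServiceName "contoso-lab-svc" `
    -Name "LabVM01" `
    -ImageName "<an ImageName from the list above>" `
    -AdminUsername "labadmin" `
    -Password "<a strong password>" `
    -Location "West US" `
    -InstanceSize Small
```

Wrap the New-AzureQuickVM call in a loop over a list of names and you have the beginnings of the automated lab provisioning the consultants walk through.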
This is a great way to get started on Azure, solve a common challenge, and use a benefit you already have!
To get started, visit http://aka.ms/mysupport to view your available advisory hours and submit a request.
Also, if you want some broader training on Azure be sure to check out this learning path.
Windows 8.1 offers ISVs an opportunity to develop innovative applications and monetize them through the Windows Store. Store apps make it possible to quickly build mobility solutions that require a touch interface and always-on data connectivity. This webcast series is aimed at C# developers who want to learn about the new functionality available to Store apps in Windows 8.1.
Windows 8.1 for Developers
We introduce the Windows 8.1 operating system functionality of interest to developers. We discuss how Store apps integrate with the operating system, and we introduce the Windows Store, the APIs, and the development tools.
XAML UI Controls
We present an overview of the basic XAML controls for building the user interface: TextBox, RichTextBlock, ProgressRing, RichEditBox, DatePicker and TimePicker, Buttons, Flyouts, Shapes, Paths, and Images. We also explain how to apply styles to controls.
Modern List Controls
We present the modern list controls: FlipView, GridView, ListView, and SemanticZoom. We also discuss list performance improvements that can be achieved with UI virtualization and incremental data loading.
Store App Design Language
We present the design principles behind Store apps. The goal is to help developers apply the design language so that their app has a distinct brand and feels like an integral part of Windows 8.1. We explain the four pillars of design: Principles, Personality, Patterns, and Platform.
Navigation, Commands, Windows, and Layout
We discuss the mechanism for navigating between pages and how to implement commands with the CommandBar. Finally, we explain how to adapt the user interface layout to changes in the size of the window that hosts the app.
WebView and RenderTargetBitmap Controls
We explain how to integrate HTML content into an app using the WebView control. We also cover how to render a branch of the XAML visual tree to an image in a variety of formats and share it with other apps using RenderTargetBitmap.
Integrating with Contacts and Calendar
In this presentation we explain how to integrate with the standard Windows 8.1 Contacts and Calendar apps using the contract APIs. This lets an app that needs to manage data about people or appointments access the information those apps maintain.
Windows Store APIs
This presentation focuses on the Windows Store APIs and on designing an application for monetization. We explain the available models (ad-funded, trial, in-app purchase, consumables), how to handle changes to the app's license, and how to enable the corresponding functionality. We also cover the process of publishing the app to the Windows Store.
I just sat through an “Ask the Experts” session on Exchange Online migrations at the Microsoft Exchange Conference, and there were some great questions asked. It got me to thinking… what do YOU do when you have a question?
Search Bing/Google? Read TechNet or MSDN? Read the Office 365 Service Descriptions? The Office 365 Deployment Guide? Stack Exchange? Save up the question for SharePoint Conference or MEC or TechEd? Those are all excellent resources. Many teams within Microsoft write blogs to share tips/tricks/issues so they can be found later. Definitely use them to find an answer if you can.
For that matter, take advantage of the excellent training on Microsoft Virtual Academy or Ignite or from a Microsoft Learning Partner. We even have some great videos over on the MSPartnerTech YouTube channel. However, training tends to cover our products working as designed, in a vanilla environment. Out in the real world, things are much trickier, which is why we depend on our Partners for technical/industry/integration expertise.
That means our Partners hit some super interesting scenarios… "For a mailbox that was originally created on Exchange 2003, I cannot enable an archive once the user is moved to the cloud… am I doing something wrong or did I hit a bug?" "I am trying to move 30,000 mailboxes to the cloud, and when we did a few test migrations, I was only able to move mailboxes at 0.5 GB/hr. At that rate, it will take a year to move to the cloud. How can I speed up the migration?" "How can I move SharePoint list items from on-premises to SharePoint Online without changing the 'modified' date?"
Wouldn’t it be great if you could get ahold of someone at Microsoft that had seen that scenario before and had an answer or a workaround or a pointer to documentation? Someone that could track down an answer from the thousands of smart people within Microsoft that may have hit that edge case before? Some way to “Ask the Expert” when you have a problem, rather than waiting for a conference that takes place once a year?
Let me point you to resources for our partners that let you “Ask an Expert” when you need it most… when you are planning for or carrying out a project:
Partner Support Communities – Unlimited no-charge support for both technical and program questions (about your membership, benefits, etc.). SLAs vary by your membership level.
Office 365 Partner Yammer Community – This is a Yammer group maintained by the Office 365 Partner team. There are no guaranteed answers or SLA, but it is a great place to collaborate with the Product team and other partners.
Silver/Gold Competency Partners
Partners with a Silver competency have access to 20 advisory hours a year, and Partners with a Gold competency have access to 50 advisory hours a year. This is the Bat Phone to speak directly with a Partner Technical Consultant about your technical issue. There are many things a Partner Technical Consultant can help you out with (deployment planning, design review, and more). Submit your request via http://aka.ms/mysupport and a consultant will call you directly.
Cloud Accelerate/Cloud Deployment Program Partners
The Cloud Partner Support team is available to Cloud Accelerate and Cloud Deployment Program Partners via the Microsoft Online Portal (MOP) and via phone submission for severity A (critical) issues. Note: Cloud Accelerate partners must submit MOP issues on their partner Internal Use Rights (IUR) tenant to ensure routing to the correct Support team. A quick reference guide with SLAs, best practices, escalation resources, etc., is available here: Cloud Partner Support Quick Reference Guide
In the past, MPN Partners with Cloud Essentials or Cloud Accelerate had access to Internal Use Rights for Office 365. Now, all partners with a Microsoft Action Pack Subscription (MAPS), as well as Partners with a Silver or Gold competency, have access to free Internal Use Rights licenses for Office 365. This gives you an opportunity to try out the service, so you can speak from experience when you discuss the benefits with your customers. It also means that someone else takes care of running your servers, so you can spend more time working and less time patching and troubleshooting.
I wanted to share a few resources to help get you started. First, the page with all the information you need on your Internal Use Rights licenses, how to access them, how to earn more licenses, and how to activate your partner features is available at: http://aka.ms/mpniur.
In the following video, York Hutton walks through the Internal Use Rights (IUR) core benefits, discussing how they now give partners the power of choice to mix and match online services and on-premises software licenses. Microsoft partners can choose between work-alike solutions for productivity, demonstration, development, testing, and internal training purposes.
In this video, York walks through the process of activating your IUR benefits, whether you are using them for the first time, or transitioning from a previous license grant:
A few additional resources:
KB2887467: Support Article: What are my internal use rights benefits?
Office 365 partner features how-to guide (Learn about partner features available to help you sell to and manage your customers, including how to offer and use delegated administration, and how to send quotes and trials.)
If you have program questions (how do I get my license, where is my key, how do I sign up for MAPS or renew my membership?) visit the Partner Membership Community
If you have technical questions (why am I getting an error message when migrating my mailboxes? how do I resolve a DirSync error about an invalid attribute?), visit the Partner Online Services community.
If you have a Silver or Gold competency, you have access to 20 and 50 (respectively) hours of advisory services consultation with a Partner Technical Consultant. These consultants are a great resource to help plan for a deployment (even if it is an internal deployment). Submit an advisory request via: http://aka.ms/mysupport
All partners holding current internal-use software licenses obtained through a cloud program must transition to the new internal-use software license process and entitlements, which are available to Action Pack subscribers and competency partners, before June 30, 2014, or those internal-use software licenses will expire.
Download the instructions to transition to the new system
Recently we’ve released several short training videos around building Windows Store apps. As a part of this process we’re now taking some of the most popular and offering them in additional languages. Last week we launched the Spanish series. Today we’re launching the Turkish series.
Create a Live Tile for your Windows Store app in less than 10 minutes (Turkish)
Designing Windows Store apps for different screen sizes (Turkish)
Windows App Certification Kit (Turkish)
Using Visual Studio templates to build Windows Store apps (Turkish)
Tips for using notifications in Windows Store apps (Turkish)
Recently we’ve released several short training videos around building Windows Store apps. As a part of this process we’re now taking some of the most popular and offering them in additional languages. Last week we launched the Spanish series. Today we’re launching the Portuguese versions.
Build a Live Tile for your Windows Store app in less than 10 minutes (Portuguese)
Build an app for different window sizes (Portuguese)
Windows App Certification Kit (Portuguese)
Tips and tricks for using notifications in Windows Store apps (Portuguese)
Recently we’ve released several short training videos around building Windows Store apps. As a part of this process we’re now taking some of the most popular and offering them in additional languages. In the coming weeks we’ll bring you French, German, Portuguese, and Turkish as a start.
Today we’re excited to bring the first five in Spanish.
Developing Windows 8.1 Store apps in C# (Spanish)
Store app certification (Spanish)
Visual Studio project templates for Store apps (Spanish)
Tips and tricks for implementing notifications in Store apps (Spanish)
How to adapt Store apps to window sizes (Spanish)
How to implement a Store app Live Tile in less than 10 minutes (Spanish)
What do @geektrainer and @bitchwhocodes have in common?
Sign up for one, two, or all three sessions, and be sure to bring questions for the Q&A!
Register now! Building Blocks series:
Initialize(), Wednesday, March 26, 9:00 am – 5:00 pm PDT
Construct(), Thursday, March 27, 9:00 am – 5:00 pm PDT
Extend(), Friday, March 28, 9:00 am – 5:00 pm PDT
Where: Live, online virtual classroom
Cost: Free!
This video shows how to fix the Error ACS50008 in the context of Windows Azure Access Control Service.
This error usually is displayed as an Inner Message like this:
An error occurred while processing your request. HTTP Error Code: 401 Message: ACS20001: An error occurred while processing a WS-Federation sign-in response. Inner Message: ACS50008: SAML token is invalid. Trace ID: 903f515f-3196-40c9-a334-71277700aca6 Timestamp: 2014-03-02 10:16:16Z
How to fix Error ACS50008 http://msdn.microsoft.com/en-us/library/windowsazure/jj571618.aspx
ACS Error Codes http://msdn.microsoft.com/en-us/library/windowsazure/gg185949.aspx
ACS Documentation http://msdn.microsoft.com/acs
This video shows you how to automatically collect Windows Azure Storage Analytics logs. Storage Analytics is key to diagnosing issues with blob, table, and queue storage. You can run the Windows Azure Storage Analytics Diagnostics package (.DiagCab) to automatically collect the previously generated logs.
Before using this package, you will need to enable Windows Azure Storage Analytics.
Download .DiagCab http://dsazure.blob.core.windows.net/azuretools/AzureStorageAnalyticsLogs_global.DiagCab
How to use .DiagCab http://blogs.msdn.com/b/kwill/archive/2014/02/06/windows-azure-storage-analytics-sdp-package.aspx
Storage Analytics Video http://channel9.msdn.com/Series/DIY-Windows-Azure-Troubleshooting/Storage-Analytics
Storage Analytics Documentation http://msdn.microsoft.com/en-us/library/windowsazure/hh343270.aspx
Storage Analytics Billing http://msdn.microsoft.com/en-us/library/windowsazure/hh360997.aspx
Storage Analytics Log Format http://msdn.microsoft.com/en-us/library/windowsazure/hh343259.aspx
Storage Analytics Logging - How to Enable and Where to Find the logs. http://blogs.msdn.com/b/cie/archive/2013/10/10/storage-analytics-logging-how-to-enable-and-where-to-find-the-logs.aspx
This package runs only on Windows 7 or later, or on Windows Server 2008 R2 or later. You will also need Microsoft Excel installed on the machine where you run the package in order to see the charts.
The Tech Support team for Windows Azure has put together an excellent series of short videos for those of you working with Windows Azure Web Sites. The links are below.
Diagnostics in Windows Azure Web Sites
Remote Debugging in Windows Azure Web Sites
Failed Request Tracing
Application Logging and Crashes
To be notified of future videos in the Azure Troubleshooting series, subscribe to the Channel 9 series.
Here is how: http://technet.microsoft.com/en-us/library/dn568114.aspx
This guide covers migrating from Gmail to Office 365 and will take about an hour to complete.
For more information on deploying Office 365, see the first article in the series at Office 365 Midsize Business Quick Deployment Guide and also watch the YouTube video at Office 365 Midsize Business Quick Deployment Guide video.
Before you begin the Gmail to Office 365 migration, you need to know or have at hand a few key pieces of information:
If you’re using Office 365 Midsize Business with the Microsoft Open License or the Open Value program, go to the get started with Office 365 page and create an Office 365 account first. After you’ve created the account, return to this document and begin Step 1: Sign in to the Gmail Admin console and Office 365 admin center.
What Gmail information is migrated?
Email is migrated, and this is covered in Step 5: Migrate a Gmail mailbox.
Gmail contacts are migrated and imported by using a CSV file. This topic is covered in Step 6: Migrate Gmail contacts.
Gmail calendar items are imported by exporting Google Calendar to an iCal file. This is covered in Step 7: Migrate Gmail calendar.
Okay, let’s get started.
Sign in to the Google Admin console
By using your Google Apps administrative credentials, sign in to http://admin.google.com.
After you’re signed in, choose Users and verify the list of users you want to migrate to Office 365.
By using your Office 365 administrative credentials, sign in to https://portal.microsoftonline.com.
After you’re signed in, you will be directed to the Office 365 admin center page.
To go to the Exchange admin center, click the drop-down arrow next to the Admin name in the ribbon bar.
From the list, select Exchange.
Select Office 365 to return to the Office 365 admin center page.
One of the most important tasks in preparing to migrate Gmail to Office 365 is first creating an Office 365 mailbox for each Gmail mailbox you want to migrate. Fortunately, creating an Office 365 mailbox is easy. You simply create a new user account and assign the Exchange Online Plan license to the user. Refer to your list of Gmail mailboxes you want to migrate, and complete the following steps to create corresponding Office 365 mailboxes.
To create an Office 365 mailbox for each user you want to migrate from Gmail
From the Office 365 admin center, click users and groups > active users.
Click the plus icon (+) to add a new user account. You can also create multiple user accounts at the same time by clicking the Bulk add icon, as shown in the following figure.
On the Send results in email page, type an email address where you will receive the temporary password for the user.
The newly created user name and password appear on the Results page and are also sent to the administrator via email.
Lastly, send the email message with the user name and temporary password information to each user.
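If you prefer scripting the mailbox-creation step to clicking through the portal, the same result can be sketched with the Azure AD (MSOnline) PowerShell module. This is a sketch under assumptions: the domain, display name, and license SKU below are placeholders, and your tenant's actual SKU names come from Get-MsolAccountSku.

```powershell
# Sketch only: all names below are placeholders.
# Assumes the MSOnline module is installed.
Connect-MsolService   # sign in with your Office 365 admin credentials

# See which license SKUs exist on your tenant
Get-MsolAccountSku

# Create a user and assign a license; the license assignment
# is what provisions the Exchange Online mailbox
New-MsolUser -UserPrincipalName "user1@contoso.onmicrosoft.com" `
    -DisplayName "User One" `
    -UsageLocation "US" `
    -LicenseAssignment "contoso:ENTERPRISEPACK"
```

Loop this over your list of Gmail mailboxes and you have the bulk-add step in script form.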
The migration file, a comma-separated values (CSV) file, contains the list of Gmail accounts that will be migrated to Office 365. Each row of the file contains the email address of an Office 365 mailbox and the corresponding user name and password of the Gmail account that will be migrated.
The CSV file can easily be created by using Microsoft Excel.
Create the Gmail migration file
On your local computer, open Excel 2013 or Excel 2010.
Using the preceding figure as a template, create the migration file.
Column A lists the Office 365 mailbox.
Column B lists the Gmail user name.
Column C lists the password for the Gmail user in Column B.
Save the file as a CSV file type, and then close the program.
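For reference, a migration file for two mailboxes might look like the sketch below. The addresses and passwords are made-up placeholders; the EmailAddress/UserName/Password header row shown is the one Exchange Online's IMAP migration batches expect.

```
EmailAddress,UserName,Password
user1@contoso.onmicrosoft.com,user1@gmail.com,GmailPassword1
user2@contoso.onmicrosoft.com,user2@gmail.com,GmailPassword2
```

Keep this file somewhere safe until the migration batch has synced, then delete it, since it contains passwords in plain text.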
As part of the migration process, Office 365 must verify that it can communicate with Gmail. It’s very important to successfully connect to the Gmail server before continuing. If you do experience any problems performing this step, see Troubleshooting the Gmail connection to resolve the issue.
Test the connection to the Gmail server
Go to the Exchange admin center.
Select migration > More > migration endpoints.
Choose + and then select IMAP.
Set IMAP server to imap.gmail.com, and leave the remaining settings as they are.
Enter a name for the connection and choose new to create the migration endpoint. The preceding figure uses Gmail-migration as the name of the migration endpoint.
The migration endpoints page appears and displays the endpoint you just created.
When you migrate your Gmail mailbox to Office 365, only the items in your inbox or other mail folders are migrated. The steps for migrating your contacts and calendar items are covered in later steps.
Migrate messages from Gmail to Office 365
Navigate to Recipients > Migration.
Click the plus icon (+), and choose Migrate to Exchange Online.
Choose IMAP migration.
Choose Browse, and specify the file created in Step 3: Create a Gmail migration file.
On the Start the batch page, select Automatically start the batch. The status field will initially be set to Created, as shown below.
The status will change to Syncing and then to Synced after the Gmail messages have been synchronized with Office 365.
You migrate your contacts from Gmail to Office 365 by first exporting the list of contacts to a comma-separated values (CSV) file and then importing that file into Office 365.
Export Gmail contacts to a CSV file
Using your Google Apps administrative credentials, sign in to the Google admin console.
Choose Contacts > More > Export.
Choose All contacts > Outlook CSV format > Export.
Select a location to save your file.
Important: When you export Gmail contacts to a CSV file, you must choose the Outlook CSV format to successfully import the Gmail contacts into Office 365.
Import Gmail contacts into Office 365
Using your Office 365 administrative credentials, sign in to the Office 365 admin center.
Choose People > Settings > Import contacts.
Select the Gmail contacts CSV file you exported in the previous step, and choose Next.
After the Gmail contacts have been successfully imported into Office 365, choose finish.
You migrate calendar items from Gmail to Office 365 by using a two-step process. First, you export the Gmail calendar items as an iCal file. Once the iCal file is saved, you use Microsoft Outlook to import the calendar items into the Outlook Calendar. You cannot import the iCal file directly into Outlook Web Access.
Note: There are third-party tools available that simplify the task of moving Gmail calendar items and contacts to Office 365 and Microsoft Outlook. An Internet search for “Gmail to Office 365 migration tools” lists some of these tools.
Export your Gmail calendar to an iCal file
Using your Google Apps administrative credentials, sign in to http://admin.google.com.
Choose Calendar > My calendars > Settings > Export calendars.
Select a location to save your file. Gmail saves the iCal file as a compressed file. Be sure to decompress the file before proceeding to the next step.
Set up Microsoft Outlook to access Office 365. For guidance, see Set up email in Outlook 2010 or Outlook 2013.
In Outlook, open the Import and Export Wizard, choose Import an iCalendar (.ics) or vCalendar file (.vcs), and then choose Next.
Select the iCalendar file you saved in the previous step.
When prompted, choose Import. You should now see the Gmail calendar items within the Outlook calendar.
Now that you have migrated Gmail messages, contacts, and calendar items to Office 365, you can use Outlook Web App, which comes with Office 365, to verify that Gmail migrated successfully.
Verify Gmail migrated successfully using Outlook Web App
Open the email message sent by the Office 365 administrator that includes your temporary password.
Go to the sign-in page https://portal.microsoftonline.com.
Sign in with the user name and temporary password.
Update your password, and set your time zone.
Note: It’s very important that you select the correct time zone to ensure your calendar and email settings are correct.
When Outlook Web App opens, send an email message to the Office 365 administrator to verify that you can send email.
Choose the Outlook icon, and verify that the Gmail messages have been migrated.
Choose the People icon, and verify that the Gmail contacts have been migrated.
Choose the Calendar icon, and verify that the Gmail calendar items have been migrated.
Note: You cannot import Gmail calendar items directly into Outlook Web App. However, you can view the items using Outlook Web App after they have been imported by Microsoft Outlook.
Well, you’ve reached the end of migrating Gmail to Office 365. At this stage, email is flowing to both Gmail and Office 365 mailboxes. Many administrators choose to keep both the Gmail and Office 365 mailboxes running in parallel for a period of time. There’s nothing wrong with this approach. The limitation is that email is updated to Office 365 from Gmail once every 24 hours. To remove this limitation and direct Gmail messages directly to Office 365, follow the procedure below.
Route all future Gmail messages to Office 365
Sign in to your DNS hosting provider’s website.
Select your domain.
Find the page where you can edit DNS records for your domain.
Open a new browser window, and sign in to the Office 365 website using your Office 365 administrative credentials.
Choose domains > your company domain > View DNS Settings > View DNS records.
In the Exchange Online section, in the MX row, copy the Priority, Host Name, and Points to Address.
Return to your DNS hosting provider’s website, and use this information to create a new MX record.
Set the MX record to the highest priority available, which is the lowest number (typically 0), and save the record.
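Put together, the new MX record will look something like the sketch below. The host value here is a placeholder: copy the exact "Points to address" value shown in your own Office 365 DNS settings rather than this example.

```
Record type:        MX
Host:               @  (your domain root)
Points to address:  contoso-com.mail.protection.outlook.com   (placeholder)
Priority:           0
```

Once this record outranks (or replaces) the old Gmail MX records, new mail is delivered straight to Office 365.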
For detailed instructions for creating MX records to point to Office 365, see the article Create DNS records for Office 365 when you manage your DNS records.
For information about creating an MX record, see Find your domain registrar or DNS hosting provider.
Note: Typically, it takes about 15 minutes for DNS changes to take effect. However, it can take up to 72 hours for a changed record to propagate throughout the DNS system.
See the following list of resources to further your exploration of Office 365:
The information in this article covers troubleshooting Step 4: Verify that Office 365 can communicate with Gmail. If you successfully created a connection to Gmail from Office 365, you can skip this topic. However, if you were not successful connecting to Gmail from Office 365, perform the following steps.
Open Windows PowerShell as an administrator on your computer.
From the Windows PowerShell command window, run Get-ExecutionPolicy.
The Get-ExecutionPolicy cmdlet tells you which of the four execution policies (policies that determine which Windows PowerShell scripts, if any, will run on your computer) is set. In the next step, we’ll change this setting to remotesigned.
From the Windows PowerShell command window, run Set-ExecutionPolicy remotesigned.
Next, run $cred = Get-Credential. When prompted for your credentials, enter your Office 365 administrator credentials. Then run the following command:
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell/" -Credential $cred -Authentication Basic -AllowRedirection
Next, run Import-PSSession $session.
This command provides access so you can test the connection between Gmail and Office 365.
To see a list of Office 365 mailboxes configured on Office 365, run Get-Mailbox. This is just a quick test to verify that we are communicating with Office 365.
Finally, to test the connection between Gmail and Office 365, run the following command:
Test-MigrationServerAvailability -IMAP -RemoteServer imap.gmail.com -Port 993 -Security SSL
You should see Success appear in the Result row. If you see any errors, verify you have entered the command correctly.
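If the test fails, it can help to first confirm that Gmail's IMAP endpoint is reachable at all from your network. The following Python sketch is a rough connectivity check only, not a substitute for Test-MigrationServerAvailability; it simply opens a TLS connection on port 993 and looks for an IMAP greeting:

```python
import socket
import ssl

def imap_ssl_reachable(host, port=993, timeout=5):
    """Return True if a TLS connection to host:port succeeds and the
    server answers with an IMAP greeting ("* OK ...")."""
    try:
        context = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                return tls.recv(256).startswith(b"* OK")
    except OSError:  # DNS failure, refused connection, timeout, TLS error
        return False

# imap_ssl_reachable("imap.gmail.com")  # True when outbound port 993 is open
```

If this returns False for imap.gmail.com, the problem is likely a firewall or proxy blocking outbound port 993 rather than your Office 365 configuration.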
Now that you’ve verified that Office 365 can connect to Gmail, it’s important to disconnect from Office 365. To do that, from the Windows PowerShell command window, run Exit.
Troubleshooting is now complete. Return to Step 4: Verify that Office 365 can communicate with Gmail.
In this talk we will take a look at how to use the Background Transfer API in your Windows Store app and how to debug common issues. For a brief preview of the talk, check out this video:
Please join us on Wednesday February 26th, 2014 from 10am-12pm PST. We’ll be hosting a live chalk talk with technical experts from our developer support team.
We’ll look at demos of real-world scenarios, how the technology works, best practices for implementation, and development troubleshooting tips. There will also be plenty of time for you to ask your questions.
If you are interested, sign up today as space is limited.
WHAT: How to use and debug the Background Transfer API
WHEN: February 26th, 2014. 10am-12pm PST
WHERE: Online Meeting
REGISTRATION: email Wsdsctreg@microsoft.com with the subject “Feb 26th Chalk Talk”.
Seems like everywhere you turn these days, Windows Azure is a hot topic. We get questions daily about where to find some great Windows Azure training. Well, we’re excited to let our Partners know that a new set of Windows Azure training is available for you. The training, Partner Practice Enablement: Windows Azure Technical Training, is available in multiple formats (more on that below).
What Kind of Training is it? What will it cover?
The level 200-300 training starts with an introduction to Windows Azure Virtual Machines and Virtual Networks (Infrastructure Services). It delivers the foundational knowledge needed for users intending to run new workloads in Windows Azure or migrate existing workloads from on-premises.
Students will be introduced to the rich features of Windows Azure Active Directory and see how it can be used to achieve single sign-on across cloud applications, protect application access, enforce multi-factor authentication, and integrate with Windows Server Active Directory. The 8 modules in the training are:
Introduction to Windows Azure Infrastructure Services
Windows Azure Infrastructure Services Networking
Windows Azure Active Directory
Windows Azure Active Directory Integration
Cloud Services, Websites and Infrastructure Services
Development and Test
SQL Server and SharePoint Server in Windows Azure
Management and Monitoring of Virtual Machines
Each module includes an instructional session, Q&A, and self-study guides for additional hands-on learning.
Who should attend?
Anyone who is new to Windows Azure, or has not worked with it at all, can benefit from this training. The training will have a technical focus, so technical sellers, implementers, and support experts are encouraged to participate.
When is the training and how do I sign up?
Web-based live training will run weekly from March 4th until April 24th. The webcasts are offered twice a day, at 7 AM and 5 PM Pacific Time.
In addition to the live training, we are making self-study recordings of the training sessions available via an MPN Learning Path.
What does this training cost?
The self-study content of the Learning Path is provided at no cost.
For the live webcasts, two Partner Advisory Hours will be deducted from your organization's balance per session independent of the number of attendees from your organization.
If you have any questions, please send an email to email@example.com
Visual Studio 2013 enables you to modify the collection of diagnostic data after a cloud service is deployed to Windows Azure. This is a useful technique to collect troubleshooting information from a single instance or to temporarily increase the logging level in order to diagnose a problem. Changes that you make via the Visual Studio Server Explorer will persist until you do a full redeployment of your cloud service, because the diagnostic configuration is written to the wad-control-container in your storage account.
These are the links shared at the end of the video:
Configuring Windows Azure Diagnostics
Introduction to Windows Azure Diagnostics
Troubleshooting Best Practices for Developing Windows Azure Applications
In this video blog we talk about two powerful but overlooked features of Visual Studio 2013 that can help to speed up the DirectX development process on both Windows 8 and Windows Phone 8. We will take a look at the Visual Studio 3D asset viewer and modify a 3D model directly in the viewer. We will also look at setting up your project to automatically convert your standard 3D mesh assets to a format that is more DirectX and Direct3D friendly.
If you have worked with Windows 8 for any length of time, chances are your customer has wanted to deploy a custom app. The app isn’t something you want to publish to the Windows Store, so what is the best way to go about deploying it to all the customer’s devices? Consider side loading. This brief video will explain how to get started with side loading in test environments.
After watching the video, if you want to go deeper or have specific questions, please don't hesitate to contact Partner Support. We are here to help.
Join the Microsoft Services team for a look at how to develop your Windows 8.1 App with Search and Sharing in mind. We'll look at some examples and jump into the code behind each one.
Join the Microsoft Services team for a look at how to implement drag and drop in your Windows Store applications. We'll take a brief look at why this is important and then jump right into the code and see how it all works.
In this short video, one of our developer-focused consultants will show you how to build Windows Phone and Windows Store apps that share code in order to make the process more efficient.
(Post courtesy of a Partner Technical Consultant specializing in Data Analytics)
Service Level Agreement Planning and Disaster Recovery Planning for the Microsoft Business Intelligence Stack with Microsoft SQL Server and Microsoft SharePoint Server
For a fully implemented Microsoft Business Intelligence stack, which might be composed of SQL Server, SSAS, SSIS, SSRS, SharePoint Server, MDS, and DQS, the question arises of how to ensure a consistent state across all of the applications in case of a total or partial failure, given that the different components may be subject to varying maintenance schedules.
In the worst case, disaster recovery requires us to recover from a situation where the primary data center, which hosts the servers, is unable to continue operation. Even smaller disruptions like power outages, data corruption, or accidental deletion of data can force us to restore data or configuration from our backups. It is well known that fault tolerance is achieved through redundancy, ideally not only at the data layer but also in each and every component of the network and all services (switches, servers, Active Directory, SMTP, and so on).
In this blog post we would like to focus on the Microsoft Business Intelligence stack, provide an overview of what you need to consider when defining Service Level Agreements, and show how to prepare for a fast resumption of operation after such an unwelcome event has occurred.
Balancing cost and risk of downtime
First, let's consider the two factors that determine the Service Level Agreement corresponding to Availability. The whitepaper on "High Availability with SQL Server 2008 R2" at http://technet.microsoft.com/en-us/library/ff658546.aspx explains it concisely:
"The two main requirements around high-availability are commonly known as RTO and RPO. RTO stands for Recovery Time Objective and is the maximum allowable downtime when a failure occurs. RPO stands for Recovery Point Objective and is the maximum allowable data-loss when a failure occurs. Apart from specifying a number, it is also necessary to contextualize the number. For example, when specifying that a database must be available 99.99% of the time, is that 99.99% of 24x7 or is there an allowable maintenance window?"
This means that both the RPO, i.e. the amount of data you are willing to lose, and the RTO, i.e. the duration of the outage, need to be determined individually, depending on your customer's specific needs. Their calculation follows an actuarial principle in that cost and risk need to be balanced. Keep in mind that the RTO depends not only on how soon your services are back online; in certain circumstances it also encompasses the time needed to restore data from backups up to a certain point in time.
Assuming a 24x7x365 operation, the following calculation applies, taken from "Create a high availability architecture and strategy for SharePoint 2013" at http://technet.microsoft.com/en-us/library/cc748824.aspx:
Availability class | Availability measurement | Annual downtime
Two nines          | 99%                      | 3.7 days
Three nines        | 99.9%                    | 8.8 hours
Four nines         | 99.99%                   | 53 minutes
Five nines         | 99.999%                  | 5.3 minutes
So now we can appreciate what it means that in Windows Azure we receive a general SLA of 99.9% across most services, and 99.95% for cloud services; cf. http://www.windowsazure.com/en-us/support/legal/sla.
And here is one more argument in favor of using Windows Azure as your secondary site and standby data center: if you back up your databases and transaction logs to Azure blob storage, and transfer Hyper-V-based snapshots of your virtual machines there as well, you incur only the inexpensive storage cost. You can still bring the VMs online at any time you choose, and you pay for them only while they are running. Windows Server 2012 Backup and Windows Azure Backup also allow you to back up system state and files/folders to Windows Azure storage.
Alternatively, availability can be calculated as the expected time between two consecutive failures for a repairable system as per the following formula:
Availability = MTTF / (MTTF + MTTR)
where MTTF = Mean Time To Failure and MTTR = Mean Time To Recover.
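The formula and the "nines" table above can be checked with a few lines of Python (the MTTF/MTTR figures below are illustrative numbers, not from the source):

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def availability(mttf_hours, mttr_hours):
    """Availability = MTTF / (MTTF + MTTR)."""
    return mttf_hours / (mttf_hours + mttr_hours)

def annual_downtime_hours(avail):
    """Expected downtime per year for a given availability level."""
    return (1 - avail) * HOURS_PER_YEAR

# Example: a system that fails on average every six months (~4380 h)
# and takes about 4.4 h to recover reaches roughly "three nines",
# i.e. about 8.8 hours of downtime per year, matching the table.
a = availability(4380, 4.4)
```

Running the numbers this way makes the trade-off concrete: each extra nine shrinks the allowable MTTR by an order of magnitude for the same MTTF.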
A disaster recovery concept needs to encompass the whole architecture and all technologies involved and include best practices on functional and non-functional requirements (non-functional refers to software behavior like performance, security, etc.).
To summarize, partners need to define in the SLA towards their customers an RPO (recovery point objective) and RTO (recovery time objective). For this, they are looking for a disaster recovery concept that takes into account:
- full, differential, and transaction log backups (assuming the database uses the full recovery model)
- application backups
- any add-on components of the software
- Hyper-V virtual images of production servers
With that, let's take a detailed look at end-to-end disaster recovery planning for a Microsoft BI solution.
SQL Server databases
To begin with, how do the above concepts apply to the SQL Server databases?
Where would you look first to find out the recovery time of all of your databases? The SQL Server error log, which can be read along a timeline.
To estimate the roll forward rate for a standalone system, one could use a test copy of a database and restore a transaction log from a high-load time period to it. The application design plays an important role as well: Short-running transactions reduce the roll forward time.
Upon failover in an AlwaysOn failover cluster instance, all databases need to be recovered on the new node, which means that transactions committed in the transaction log are rolled forward in the database, whereas uncommitted transactions are rolled back.
Side note: In a Failover Cluster Instance, the time for switchover is further impacted by factors such as storage regrouping or DNS/network name provisioning. On the client side, you can configure the connection timeout to accelerate reestablishing a broken connection.
The new SQL Server 2012 Availability Groups make it easy to observe the RPO and RTO. For details, see "Monitor for RTO and RPO" at http://technet.microsoft.com/en-us/library/dn135338.aspx.
Here are some tips for an efficient backup strategy of your SQL Server databases:
- Use a separate physical location where you store the backups.
- Have a schedule to carry out regular backups, for example nightly full backups, a differential backup every 6 hours, and a transaction log backup every 30 minutes if you need point-in-time recovery.
- Enable CHECKSUM on backups. This is the default with backup compression, which is available in Standard, Business Intelligence and Enterprise Edition.
- Test your backups periodically by restoring them, because you might unknowingly carry forward data corruption, rendering your backups useless. Monitor the suspect_pages table in msdb to determine when a page-level restore is sufficient.
- With regard to long-term archival, it is considered good practice to maintain three different retention periods. If you use three rotation schemes, for example creating full backups daily, weekly, and monthly and storing each onto a different media set, you can regenerate your data from these if necessary. This is called the grandfather-father-son principle and allows media sets to be reused after their retention period: a backup on a Monday overwrites that of some previous Monday, and so on. The screenshot at http://social.msdn.microsoft.com/Forums/en-US/92fbf076-3cd1-4ab2-97d2-1ae6c9e909c7/grandfatherfatherson-backup-scenario depicts these options very well.
- Filegroups for historical partitions can be marked as read-only, hence require only a one-time filegroup backup. A piecemeal restore of read-write filegroups can accelerate recovery.
- Use "SQL Server Backup to Windows Azure" to upload the backup files for offsite storage, optimally with compression and encryption, even for versions earlier than SQL Server 2014. Check out the "Microsoft SQL Server Backup to Microsoft Windows Azure Tool" at http://www.microsoft.com/en-us/download/details.aspx?id=40740.
- While the RBS FILESTREAM provider, which uses local disk storage, is integrated with SQL Server's backup and restore procedures, with a third party RBS provider it will be your responsibility to back up the Remote Blob Storage separately in a consistent manner, cf. "Plan for RBS in SharePoint 2013" http://technet.microsoft.com/en-us/library/ff628583.aspx.
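To make concrete what point-in-time recovery under such a schedule involves, here is a small Python sketch (our own illustration, with hypothetical names) that picks the restore chain for a target time: the latest full backup, the latest differential after it, and the log backups covering the rest of the interval. Note that the first log backup taken after the target is needed too, since it contains the target time and is restored WITH STOPAT:

```python
from datetime import datetime, timedelta

def restore_sequence(fulls, diffs, logs, target):
    """Pick the backup chain needed to recover to `target` from lists
    of full, differential, and log backup timestamps."""
    base = max(t for t in fulls if t <= target)            # latest full before target
    later_diffs = [t for t in diffs if base < t <= target]
    diff = max(later_diffs) if later_diffs else None       # latest differential, if any
    start = diff if diff is not None else base
    chain = [t for t in logs if start < t <= target]       # logs up to the target
    after = [t for t in logs if t > target]
    if after:
        chain.append(min(after))                           # log containing the target time
    return base, diff, chain

# Example: nightly full at 00:00, differentials every 6 hours,
# log backups every 30 minutes; recover to 13:05.
fulls = [datetime(2014, 3, 1, 0, 0)]
diffs = [datetime(2014, 3, 1, h, 0) for h in (6, 12, 18)]
logs = [datetime(2014, 3, 1) + timedelta(minutes=30 * i) for i in range(1, 48)]
base, diff, chain = restore_sequence(fulls, diffs, logs, datetime(2014, 3, 1, 13, 5))
```

In this example the chain is the midnight full, the 12:00 differential, and the 12:30, 13:00, and 13:30 log backups, which illustrates why the frequency of log backups directly bounds your RPO.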
Fortunately, all Microsoft products are built to scale for availability. With SQL Server Availability Groups in SQL Server 2012 and higher you get a highly available set of databases and secondary replicas for failover, disaster recovery, or load-balancing of your read requests. Availability groups are a feature of SQL Server Enterprise Edition, which comes with even more online features than the other editions to allow for higher availability and faster recovery, notably online page and file restore and the Database Recovery Advisor. The latter is helpful for point-in-time restores across sometimes complicated backup chains. For a concise list please see the table at: http://msdn.microsoft.com/en-us/library/cc645993.aspx#High_availability.
With SQL Server Availability Groups spread out to Windows Azure virtual machines it is even possible to host your secondary database replicas in Azure and, for example, run your System Center Data Protection Manager and its agent in the cloud against them.
Marked transactions allow you to restore several databases, for example the MDS database, the SSIS catalog, and your corresponding user databases on the same instance of SQL Server, consistently up to the very same point in time, which can be advantageous around a major event such as a merger of two companies' databases. See "Use Marked Transactions to Recover Related Databases Consistently (Full Recovery Model)" at http://technet.microsoft.com/en-us/library/ms187014.aspx.
Since SQL Server Analysis Services is mainly a read-only system, you can do without transaction logging or differential backups. If the metadata (.xmla files) is available, that is sufficient to recreate and reprocess your cubes. If you also have functional database backups (.abf files), those can be restored and used.
It is possible to run a separate SSAS server, which has the same configuration settings, in a remote location and supply it regularly with the latest data via database synchronization.
- When running SSAS in SharePoint mode (as the POWERPIVOT instance), the SharePoint content and service application databases contain the application and data files.
- If you host your secondary replica for read access in Windows Azure, you will want to place your SSAS instance running in an Azure VM within the same Availability Set.
SQL Server Integration Services since version 2012 offers two deployment modes: package-based for backward compatibility and the new project-based deployment. Backup and restore procedures depend on the storage location of the data. The package store can be folders in the file system or the msdb database. Any files should be copied off, together with a dtutil script that can upload them again, plus any centrally managed configuration files. Starting with SQL Server 2012, it is strongly recommended to use project deployment for the Integration Services server. The SSISDB catalog is a database that stores projects, packages, parameters, environments, and operational history, and as such can be backed up into a .bak file. You also need to back up the master key for the SSISDB database; the resulting file will be encrypted with a password you specify. Unless the key changes, this is a one-time operation.
With SQL Server Reporting Services in native mode being a stateless service, it is the ReportServer database that contains the metadata and report snapshots with data. It can be protected as required by your SLA and RTO via full or differential backups; experience has shown that full backups alone are often fast enough. The ReportServerTempDB database can be recreated at any time. Do not forget to back up the encryption key that protects the database. This should be done at creation time, and suffices unless the service identity or computer name changes. If you use subscriptions, you need to back up the corresponding SQL Server Agent jobs as well, which can be accomplished via a simultaneous backup of the msdb database. For backing up the report server configuration and custom assemblies, please refer to the corresponding links in the final section of this blog post.
Concerning SQL Server Reporting Services in SharePoint mode, the SharePoint 2013 built-in backup does not take care of the SSRS databases (including the additional Reporting Services Alerting database), so the previous paragraph still applies: you must use SQL Server tools for SharePoint Server, or SQL Server (Express) tools for SharePoint Foundation. As for the application part, since SSRS in SharePoint mode is a true SharePoint service application, configuration occurs through Central Administration and SharePoint Server's own backup and recovery applies.
The BI (also called Insights) features of SharePoint Server, such as Excel Services, Access Services, Visio Services, and PerformancePoint Services, benefit from SharePoint Server's backup options for service applications. A restore of a service application database has to be followed by provisioning the service application. Please find further details in the TechNet articles referenced below.
Master Data Services consists of a database wherein all master data as well as MDS system settings are stored plus a Master Data Manager web application. Scheduling daily full backups and more frequent transaction log backups is recommended. MDSModelDeploy.exe is a useful tool for creating packages of your model objects and data.
Side note: In our experience, at high load the bottleneck tends to be the MDS database itself rather than the IIS-hosted website. Hence, a scale-out would not necessarily just mean several MDS websites pointing to the same database, although that does provide redundancy and increased availability while web servers are updated. Rather, it would separate models out into different MDS databases. On the one hand this increases the overhead for security accounts and administration, given that the metadata tables are completely isolated from each other; on the other hand, blocking is avoided and the databases can be managed independently.
Data Quality Services keeps its information in three databases, DQS_MAIN, DQS_PROJECTS, and DQS_STAGING_DATA, and can therefore be integrated neatly into your SQL Server backup and restore processes. With the DQSInstaller.exe command it is even possible to export all of the published knowledge bases from a Data Quality Server to a DQS backup file (.dqsb) in one go.
Cross-cutting best practices
- Using a SQL alias for connections to your SQL Server computer eases the process of moving a database, for example when a SQL virtual cluster name changes. For instructions see "Install & Configure SharePoint 2013 with SQL Client Alias" http://blogs.msdn.com/b/sowmyancs/archive/2012/08/06/install-amp-configure-sharepoint-2013-with-sql-client-alias.aspx. It shows how you gain flexibility over the SQL Server connection string by appropriately populating the SQL Server Network Configuration and the SQL Server Client Network Utility. This procedure has a significant advantage over a DNS A record or CNAME alias in that the SQL alias does not change the Kerberos SPN format for connections: you continue to use the registered DNS host name (A record) in the Service Principal Name when connecting. Furthermore, it allows you to specify more than one alias pointing to the same instance; for example, you can create one alias for your content databases, another for your search databases, and so on, and thereby plan ahead for future scale-out.
- System Center Data Protection Manager can be used for both database backups and application server backups. For a list of protected workloads please see the left-hand navigation bar on the page "Administering and Managing System Center 2012 - Data Protection Manager" http://technet.microsoft.com/en-us/library/hh757851.aspx.
- In the context of private clouds, System Center comes into play with its Operations Manager to monitor SQL Server instances and virtual machines and its Virtual Machine Manager to quickly provision new virtual machines.
SQL Server and SharePoint Server allow for robust disaster recovery routines as part of your business continuity plan. New hybrid and cloud based solutions enhance traditional possibilities greatly.
As has become clear, configuration changes that occur outside of user databases should always happen in a controlled manner, requiring a tight Change Management process.
"Microsoft SQL Server AlwaysOn Solutions Guide for High Availability and Disaster Recovery" http://msdn.microsoft.com/en-us/library/hh781257.aspx.
With some good discussions: "Simple script to backup all SQL Server databases" http://www.mssqltips.com/sqlservertip/1070/simple-script-to-backup-all-sql-server-databases/.
"Back Up and Restore of System Databases (SQL Server)" http://technet.microsoft.com/en-us/library/ms190190.aspx.
"SQL Server AlwaysOn Architecture Guide: Building a High Availability and Disaster Recovery Solution by Using AlwaysOn Availability Groups" http://msdn.microsoft.com/en-us/library/jj191711.aspx.
"SQL Server AlwaysOn Architecture Guide: Building a High Availability and Disaster Recovery Solution by Using Failover Cluster Instances and Availability Groups" http://msdn.microsoft.com/en-us/library/jj215886.aspx.
"Backup and Restore of Analysis Services Databases" http://technet.microsoft.com/en-us/library/ms174874.aspx.
"Disaster Recovery for PowerPivot for SharePoint" http://social.technet.microsoft.com/wiki/contents/articles/22137.disaster-recovery-for-powerpivot-for-sharepoint.aspx.
"Package Backup and Restore (SSIS Service)" http://technet.microsoft.com/en-us/library/ms141699.aspx.
"Backup, Restore, and Move the SSIS Catalog" http://technet.microsoft.com/en-us/library/hh213291.aspx.
"Backup and Restore Operations for Reporting Services" http://technet.microsoft.com/en-us/library/ms155814.aspx.
"Migrate a Reporting Services Installation (Native Mode)" http://technet.microsoft.com/en-us/library/ms143724.aspx.
"Migrate a Reporting Services Installation (SharePoint Mode)" http://technet.microsoft.com/en-us/library/hh270316.aspx.
"Backup and Restore Reporting Services Service Applications" http://technet.microsoft.com/en-us/library/hh270316.aspx.
"Planning Disaster Recovery for Microsoft SQL Server Reporting Services in SharePoint Integrated Mode" http://msdn.microsoft.com/en-us/library/jj856260.aspx.
"Overview of backup and recovery in SharePoint 2013" http://technet.microsoft.com/en-us/library/ee663490.aspx.
"Plan for backup and recovery in SharePoint 2013" http://technet.microsoft.com/en-us/library/cc261687.aspx.
"Backup and restore SharePoint 2013" http://technet.microsoft.com/en-us/library/ee662536.aspx.
"Supported high availability and disaster recovery options for SharePoint databases (SharePoint 2013)" http://technet.microsoft.com/EN-US/library/jj841106.aspx.
"Database Requirements (Master Data Services)" http://technet.microsoft.com/en-us/library/ee633767.aspx.
"Web Application Requirements (Master Data Services)" http://technet.microsoft.com/en-us/library/ee633744.aspx.
"Export and Import DQS Knowledge Bases Using DQSInstaller.exe" http://technet.microsoft.com/en-us/library/hh548693.aspx.
"Using AlwaysOn Availability Groups for High Availability and Disaster Recovery of Data Quality Services (DQS)" http://msdn.microsoft.com/en-us/library/jj874055.aspx.
"Install SQL Server 2012 Business Intelligence Features" http://technet.microsoft.com/en-us/library/hh231681.aspx.
"SQLCAT's Guide to High Availability and Disaster Recovery" and "SQLCAT's Guide to BI and Analytics" http://blogs.msdn.com/b/sqlcat/archive/2013/10/23/sqlcat-com-ebook-downloads.aspx.
Case Study for failover to a standby data center: "High Availability and Disaster Recovery at ServiceU: A SQL Server 2008 Technical Case Study" http://technet.microsoft.com/en-us/library/ee355221.aspx.
"Business Continuity in Windows Azure SQL Database" http://msdn.microsoft.com/en-us/library/hh852669.aspx.
"SQL Server Managed Backup to Windows Azure" http://msdn.microsoft.com/en-us/library/dn449496.aspx.
"SQL Server Deployment in Windows Azure Virtual Machines" http://msdn.microsoft.com/en-us/library/windowsazure/dn133141.aspx.
Hybrid storage appliance "StorSimple cloud integrated storage" http://www.microsoft.com/en-us/server-cloud/products/storsimple.
This posting is provided "AS IS" with no warranties, and confers no rights.
If you are building any kind of Windows app and have not yet used the certification kit to test it for everything Microsoft tests for, take a few minutes to check out this video on how to get started.
You can download the Certification Kit here.