GD Bloggers

This is the blog site for the Microsoft Global Delivery Communities, focused on sharing technical knowledge about devices, apps, and the cloud.

April, 2013

  • SharePoint Group Calendar, Adding default users

    The Problem

    One feature of SharePoint is to create a calendar and make it a group calendar. This allows the user to see the schedules of a list of people at the same time, in the same view, as per the image below.

    image

    To configure the calendar as a group calendar, edit the List settings, click “List name, description and navigation”, and finally select the radio button shown in the image below.

    image

    So all of this is SharePoint out-of-the-box functionality. So where is the problem?

    The problem is that, to see the calendars of everyone in your work group, you have to add each person individually every time you open the calendar view! I could not find any way to persist this information so that the calendar view would open automatically with the people already added, as in the list you see in the first image. So let's get to work.

    The Solution

    The way I fixed this is with JavaScript.

    To do this I used the following script:

    var sel = SP.UI.ApplicationPages.CalendarSelector.instance().getSelector(1, 'WPQ12');
    sel.selectEntities(ret, true);

    You need two things to run the script above: the ID of the calendar web part (WPQ12 above) and the XML string that adds the required individuals. To get them, follow these steps:

    Step 1: Getting the XML to add the required users

    1. Install and start Fiddler
    2. Click on the ribbon on the small arrow below “people” then “Add Person or Group”
      image
    3. This will give you the dialog to select the users. Add all the users that should be added by default to the calendar view, then click “Ok” as in the image below
      image
    4. Check the Fiddler log for the entry that adds the users, as per the below.
      image
    5. Then view the response as text; you will find the Entities string, as per the image below.
      image
    6. Copy this text to be used later in your script.

    Step 2: Getting the web part ID

    1. Open the page with the calendar web part
    2. Open the IE developer tools (F12)
    3. Click on the selector arrow and then on the calendar web part (most top level)
    4. In the developer tools view you should see Id of the web part as below
      image

    Step 3: Create the required script file

    1. Use the information gathered to construct a JS file as below

    function _firstTime() {
        // Select the calendar tab so that we can override the onclick method on some of the buttons.
        SelectRibbonTab('Ribbon.Calendar.Calendar', true);

        // Give the ribbon time to load.
        setTimeout('_doWireUp();', 2000);
    }

    function _doWireUp() {
        // Change the onclick event for the scope buttons to make sure they reload our default group.
        var weekElem = document.getElementById('Ribbon.Calendar.Calendar.Scope.WeekGroup-Large');
        if (weekElem)
            weekElem.onclick = function () { setTimeout('_setDefaultResources();', 1000); return false; };

        var dayElem = document.getElementById('Ribbon.Calendar.Calendar.Scope.DayGroup-Large');
        if (dayElem)
            dayElem.onclick = function () { setTimeout('_setDefaultResources();', 1000); return false; };

        _setDefaultResources();
    }

    function _setDefaultResources() {
        // This is the Entities XML from Step 1.
        var ret = '\u003cEntities c…………………………………………………………………………………………………………..\u002fEntities\u003e';

        // Put here the web part ID from Step 2.
        var sel = SP.UI.ApplicationPages.CalendarSelector.instance().getSelector(1, 'WPQ12');
        sel.selectEntities(ret, true);
    }

    ExecuteOrDelayUntilScriptLoaded(_firstTime, "sp.ribbon.js");

    2. Save the file as cal.js (or any other name)

    Step 4: Configure the web page

    1. Upload the JS file to any accessible location on SharePoint such as the styles library
    2. Check it in and publish it if required
    3. Edit the page with the calendar web part and add a content editor web part to the bottom of the page
    4. Edit the source of the content editor web part and add the following line
      <script src="/sites/<your site name>/Style%20Library/ctrl/cal.js" type="text/javascript"></script>
    5. Save the page and then reload it.
    6. See the magic happen :)
  • Some users unable to create VMs in VMM 2012 SP1: “User or user role not valid” (Error ID 26726)

    This post describes an error that occurred during System Center Virtual Machine Manager 2012 SP1 deployment, with two VMM servers in cluster configuration.

    When using the VMM Console, some users (but not all) were unable to create VMs, even when the user account was a member of the Delegated Administrator role. In this case, when the user selects “Create a new VM”, the following error message is displayed when “Next” is clicked on the “Configure Hardware” page:

    ID 26726: “Either the specified user role or the specified user (%Username) is not valid. User is not a member of the role. Add (%Username) as a member of the user role and try again or provide a different user role or a different user.”

    The same error persists even if the user accounts are made VMM Administrators.

    This error originates from a known issue (http://support.microsoft.com/kb/331951) where the VMM service does not have access to authorization information on user account objects or computer account objects. Specifically, the VMM service cannot read the token-groups-global-and-universal (TGGAU) attribute in AD.

    This issue is resolved by adding the VMM service account to the Windows Authorization Access Group (or the Pre-Windows 2000 Compatible Access group) in AD.
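    The group membership change above can also be scripted. As a minimal sketch (assuming the Active Directory PowerShell module is available on a domain-joined machine, and using a hypothetical service account name "svc-vmm"; substitute your actual VMM service account):

```powershell
# Hedged sketch: add the (hypothetical) VMM service account 'svc-vmm' to the
# Windows Authorization Access Group, which grants read access to the TGGAU attribute.
Import-Module ActiveDirectory
Add-ADGroupMember -Identity "Windows Authorization Access Group" -Members "svc-vmm"
```

    Restarting the VMM service afterwards ensures the new group membership is picked up.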

    In conclusion, if some users are unable to create VMs through the VMM Console due to Error ID 26726, the VMM service is probably unable to verify whether those users are authorized to create VMs, and adding the VMM service account to the Windows Authorization Access Group resolves the issue.

  • Securing Dynamic Data ASP.NET SQL Azure Published Web Site with ACS and Facebook as an Identity Provider

    The Scenario

    I wanted to implement an Azure web site that uses the Azure Access Control Service and integrates with an external identity provider to authenticate and authorize users. At first I thought of using Windows Live ID, but it has a problem: the only claim offered by WLID is the unique identifier, which is simply a number and represents nothing about the user. Then I thought, why not make things more interesting and use Facebook? I think things got more interesting than I expected. :)

    I wanted to implement a dynamic database access web site that generates the views on top of an existing SQL Azure database and lets the end user manipulate and filter the database tables. This is done using LINQ to SQL classes.

    I am using Visual Studio 2012 latest version.

    The Steps highlights

    At a glance, the steps to get this up and running are as follows:

    1. Create your project:

       a. Create your database.

       b. Create a new Dynamic Data ASP.NET project.

       c. Add a LINQ to SQL model of your database.

       d. Change the Framework version of the project.

       e. Set “ScaffoldAllTables” to true.

    2. Download the latest “Identity and Access Tool”.

    3. Create a new Azure web site and download the publishing settings.

    4. Set up your identity provider:

       a. Create a new Azure ACS namespace.

       b. Create a new Facebook application.

       c. Configure your Facebook application.

       d. Add your Facebook application as an identity provider.

       e. Configure your claims rules.

    5. Set the ACS settings in the “Identity and Access Tool”.

    6. Implement your custom claims authorization manager.

    7. Complete the web.config configuration.

    8. Publish your web site.

    Solution Walkthrough and description

    Create your project

    This is the first step and as I described I wanted to create a dynamic ASP.NET site based on a custom database.

    Create the Database

    First I created the database in SQL Azure.

    clip_image002

    I created a new SQL Server, provided the administrator credentials, and allowed Azure services access to this server.

    clip_image004

    Then I allowed my IP address access to this database so that I could manage it.

    clip_image006

    Then I started to design my database (this can be done either online or using Visual Studio)

    clip_image008

    clip_image010

    Or from Visual Studio 2012

    clip_image012

    clip_image014

    Once the database is created and ready to be used, you can go to the next step.

    Create the Dynamic ASP.NET project

    Open Visual Studio and click New Project. You will need to switch to .NET Framework 4.0 to see the Dynamic Data ASP.NET SQL template.

    clip_image016

    clip_image018

    clip_image020

    clip_image022

    Edit the “Global.asax” file and uncomment the line that registers the data context, so that it reads as follows:

    DefaultModel.RegisterContext(typeof(TestDbDataClassesDataContext), new ContextConfiguration() { ScaffoldAllTables = true });

    Then change the .NET Framework version to 4.5 to be able to see the Identity and Access Tool link.

    clip_image024

    Download the Identity and Access tool

    From the Visual Studio Tools menu, click “Extensions and Updates”.

    clip_image025

    Search for the “Identity and Access Tool”, install it, and restart Visual Studio.

    clip_image027

    Create the Azure Web Site

    Open the Azure management portal and click New > Web Site.

    clip_image028

    Click on “Download the publish profile”

    clip_image030

    Save this file to your hard disk. Now, in Visual Studio, right-click the project and then click Publish.

    clip_image031

    Click on “Import”

    clip_image033

    Now select the publish settings file you downloaded earlier.

    Complete the publishing wizard and verify that the application is published and working.

    clip_image034

    The next step is to configure Facebook as an identity provider.

    Setup Facebook as an Identity Provider

    Create the Facebook application

    Logon to your Facebook account and then open the link http://developers.facebook.com

    Register yourself as a developer.

    clip_image036

    Now click on Apps and then create a new App

    clip_image038

    Give your application a name

    clip_image039

    Take note of your application ID and secret.

    Also enter the URL of the ACS namespace you will create on the Azure ACS web site in the next step (it is better to create that namespace first and then return to this step).

    clip_image041

    Click “Save Changes”.

    Create Your ACS Namespace

    Open the Azure portal and click New to create a new ACS namespace.

    clip_image043

    Once created you can click on the Manage link to start managing it.

    clip_image045

    Configure your ACS service

    Start by adding the Facebook application as an identity provider. Click on identity providers and then “Add”

    clip_image046

    clip_image048

    Enter the application ID and secret and click “Save”

    clip_image050

    Configure Your Project to Link ACS Service

    While on the ACS management site, click “Management service” to see all the management links required to communicate with ACS.

    clip_image052

    clip_image054

    clip_image056

    Click on “Show Key” and then copy the symmetric key generated.

    These will be the namespace and the management key of the namespace.

    Right click on your project and then click “Identity and Access tool”

    clip_image057

    After selecting “Use the Windows Azure Access Control Service”, configure the tool with the namespace and the management key you copied earlier.

    clip_image059

    clip_image061

     

    Now select the settings as below

    clip_image063

    Finally, click “Ok”. This will configure your ACS service and the relying party application with the required pass-through rules for all provider claims, as shown below.

    clip_image065

    The next steps are to create and implement the Claims authorization manager and configure your web.config.

    Implement a Custom Claims Authorization Manager

    Since we are using .NET Framework 4.5, this is a little different from what we used to do with WIF 3.5, because WIF is now fully integrated into the framework.

    Add a reference to the System.IdentityModel assembly

    clip_image067

    Add and implement the new class “MyAuthorizationManager”

    Implement the class so that the code of the file is as follows.

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Linq;
    using System.Security.Claims;
    using System.Web;
    using System.Xml;

    namespace TestDbWebApplication
    {
        public class MyAuthorizationManager : ClaimsAuthorizationManager
        {
            private static Dictionary<string, string> m_policies = new Dictionary<string, string>();

            public MyAuthorizationManager()
            {
            }

            public override void LoadCustomConfiguration(XmlNodeList nodes)
            {
                foreach (XmlNode node in nodes)
                {
                    // Find the name claim in each policy element in web.config
                    // and add its value to the module-scope m_policies dictionary.
                    XmlTextReader reader = new XmlTextReader(new StringReader(node.OuterXml));
                    reader.MoveToContent();
                    string resource = reader.GetAttribute("resource");
                    reader.Read();
                    string claimType = reader.GetAttribute("claimType");
                    if (claimType.CompareTo(ClaimTypes.Name) != 0)
                    {
                        throw new ArgumentException("Name authorization is not specified in the policy in web.config");
                    }
                    string name = reader.GetAttribute("Name");
                    m_policies[resource] = name;
                }
            }

            public override bool CheckAccess(AuthorizationContext context)
            {
                // Get the identity and compare it with the policy.
                string allowednames = "";
                string requestingname = "";
                Uri webPage = new Uri(context.Resource[0].Value);
                ClaimsPrincipal principal = (ClaimsPrincipal)HttpContext.Current.User;
                if (principal == null)
                {
                    throw new InvalidOperationException("Principal is not populated in the context - check the configuration");
                }
                ClaimsIdentity identity = (ClaimsIdentity)principal.Identity;
                if (m_policies.ContainsKey(webPage.PathAndQuery))
                {
                    allowednames = m_policies[webPage.PathAndQuery];
                    requestingname = (from c in identity.Claims
                                      where c.Type == ClaimTypes.Name
                                      select c.Value).FirstOrDefault();
                }
                else if (m_policies.ContainsKey("*"))
                {
                    allowednames = m_policies["*"];
                    requestingname = (from c in identity.Claims
                                      where c.Type == ClaimTypes.Name
                                      select c.Value).FirstOrDefault();
                }

                // Guard against a missing name claim before comparing.
                if (!string.IsNullOrEmpty(requestingname) &&
                    allowednames.ToLower().Contains(requestingname.ToLower()))
                {
                    return true;
                }
                return false;
            }
        }
    }

    This would authorize users based on their login name reported by the identity provider (Facebook).

    Configure the Authorization manager in the Web.Config

    The required steps are to add the following lines:

      <system.webServer>
        <modules>
          <add name="ClaimsAuthorizationModule" type="System.IdentityModel.Services.ClaimsAuthorizationModule, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" preCondition="managedHandler" />
        </modules>
      </system.webServer>

    Note the ClaimsAuthorizationModule registration above, as it is key to making this work.

    <system.identityModel>
      <identityConfiguration>
        <securityTokenHandlers>
          <remove type="System.IdentityModel.Tokens.SessionSecurityTokenHandler, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
          <add type="System.IdentityModel.Services.Tokens.MachineKeySessionSecurityTokenHandler, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
        </securityTokenHandlers>
      </identityConfiguration>
    </system.identityModel>


    Note the token handler replacement above; publishing to an Azure web site will not work without these lines.

    Finally the claims authorization manager configuration.

    <system.identityModel>
      <identityConfiguration>
        <claimsAuthorizationManager type="TestDbWebApplication.MyAuthorizationManager, TestDbWebApplication">
          <policy resource="*">
            <claim claimType="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name" Name="Mohamed Malek" />
          </policy>
        </claimsAuthorizationManager>
      </identityConfiguration>
    </system.identityModel>

    Publish and test your site

    Now that you have completed the site configuration, publish it. You should be able to authenticate using Facebook, authorize the application to access your Facebook details, and finally have the user authorized by the application.

    clip_image068

    clip_image070

     

    Logon to Facebook as usual.

    clip_image072

    The app requests the user's permission to share their details with the ACS service.

    clip_image074

    The web site then works as required.

    clip_image076

    Happy coding :)

  • Finding unused fields in an InfoPath Form

    Introduction

    While working on one of my projects, I had a customized InfoPath form with hundreds of fields and multiple views. As I went through it, I suspected that many of these fields were not used. So how do you find which fields are used and which aren't? In this post I will give you some suggestions for tackling this issue.

     

    Suggestion # 1

    The easiest choice is to start with the InfoPath “Rule Inspector”, which will help you go through all the fields. However, the Rule Inspector will not find fields that are not bound in the form (such as fields assigned from code-behind), so if you have custom development the Rule Inspector can be a good start, but it will not be enough.

    Suggestion # 2

    If you have code-behind in your InfoPath form, you need to search through the code as well. A good place to start is to open the “FormCode.cs” file in a development tool such as Visual Studio and start searching.

     

    Suggestion # 3

    Another suggestion is to “Export Source Files” and search through the “manifest.xsf”. To explain this approach, let me start with some background on InfoPath forms: the InfoPath form (.XSN file) actually consists of more than one source file, consolidated into a single .XSN file for convenience. The aim of this approach is to extract these source files and search through them.

    An example using InfoPath 2010,

    • Open the InfoPath Form in Design
    • Click File from the File Menu
    • Click Publish from the Left Side Menu
    • Finally Click Export Source Files from Export

    After that, open the “manifest.xsf” file in a development tool such as Visual Studio and start searching through the field names.

     

    Suggestion # 4

    This is an extension of the previous suggestion, for when you would like to determine which field is used in which view. Start by exporting the form to source files as shown in the previous step, then search through the XSL files; each XSL file represents a view in the InfoPath form.

     

    Hope you found this information useful. 

     

  • Windows Azure – Remove Virtual Disks

    This is a well-known issue when you create many virtual machines on Windows Azure: after deleting the VMs, the virtual disks (VHD files) are still there.

    So far there is no way to remove these VHD files using the GUI, so I started searching for a way to do it through PowerShell. The challenge with Azure PowerShell is that most of the available resources target developers using Visual Studio, and for me anything that starts with Visual Studio is a sort of mystery that should be left to the wise men who can understand it (it is a true story that one of my developer friends defined a SharePoint site collection as “a site that contains collections”) :)

    For Infrastructure guys like me, the following are step by step of how to use PowerShell to connect to Azure and then delete the VHDs:

    Step 1: Management Certificate:

    In simple words, we need a certificate to connect PowerShell to Azure. This certificate should be at least 2048 bits; a self-signed certificate is a good option for testing and labs.

    A self-signed certificate can be created in many ways; the easiest is to use plain PowerShell:

    1. Open PowerShell (as an Administrator).

    2. Type the following cmdlet:

    New-SelfSignedCertificate -DnsName azure -CertStoreLocation cert:\LocalMachine\My

    This cmdlet will create a new self-signed certificate and place it in the local machine store; check the following snapshots:

    image

    Let’s make sure that the certificate is created:

    1. From Run > Type mmc.

    2. Click File > Add/Remove Snap-in.

    3. Select Certificates and click Add.

    4. Select Computer Account > Finish.

    5. Under Personal > Certificates you should find the new certificate “azure”:

    image
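    Alternatively, the same check can be done from PowerShell itself; a small sketch (assuming the certificate was created with the subject “azure”, as above):

```powershell
# List certificates in the local machine store whose subject matches the one we created.
Get-ChildItem cert:\LocalMachine\My | Where-Object { $_.Subject -eq 'CN=azure' }
```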

    The next step is to upload this certificate with its private key (a PFX file) to Windows Azure. To do that, first export the PFX file (the certificate with the private key):

    1. From the same Snap-in.

    2. Right click the certificate > All Tasks > Export.

    3. Click Next > select “Yes, export the private key” > click Next.

    4. On the Export File Format page click Next, then provide a password.

    5. Browse to the location where you will save the certificate.
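    As an alternative to the MMC export wizard, the PFX can also be exported directly from PowerShell; a hedged sketch (the thumbprint placeholder, the file path, and the password are example values to be replaced with your own):

```powershell
# Export the certificate together with its private key to a password-protected PFX file.
$pfxPassword = ConvertTo-SecureString -String 'ExamplePassword!' -Force -AsPlainText
Get-ChildItem cert:\LocalMachine\My\<thumbprint> |
    Export-PfxCertificate -FilePath C:\azure.pfx -Password $pfxPassword
```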

    From the Azure Management Portal, create a new cloud service to use with the certificate:

    image

    Once you have the PFX file, upload it to Windows Azure:

    1. Log into the Management Portal.

    2. In the navigation pane, click Cloud Services, and then click the service for which you want to add a new certificate.

    3. On the ribbon, click Certificates, and then click Upload a Certificate.

    4. In the Upload certificate dialog, click Browse For File, go to the directory containing your certificate, and select the .pfx  file to upload.

    5. Type the password of the private key for the certificate.

    6. Click OK.

    image

    Step 2: Install Azure PowerShell

    To install Windows Azure PowerShell:

    1. Download Windows Azure PowerShell: http://go.microsoft.com/?linkid=9811175&clcid=0x409 
    2. Install Windows Azure PowerShell.

    Step 3: Connect to Azure:

    Before we start using the Windows Azure cmdlets to remove the VHDs (or do anything else), we need to configure connectivity between the workstation and Windows Azure. This is done by downloading the PublishSettings file from Windows Azure (this file sets up the PowerShell environment to use Windows Azure) and importing it. The settings for Windows Azure PowerShell are stored in: <user>\AppData\Roaming\Windows Azure PowerShell.

    1. From Windows Azure PowerShell type the following cmdlet:

      Get-AzurePublishSettingsFile

      A browser window opens at https://windows.azure.com/download/publishprofile.aspx, where you can sign in to Windows Azure.

    2. Sign in to the Windows Azure Management Portal, and then follow the instructions to download your Windows Azure publishing settings. Use your browser to save the file as a .publishsettings file to your local computer. Note the location of the file.
    3. In the Windows Azure PowerShell window, run the following command:

      Import-AzurePublishSettingsFile FileName.publishsettings

    image

    Step 4: Remove the VHDs (Finally)

    Now that we are connected to Azure, we can manage it with PowerShell and do anything we need. If you forgot the purpose of this article, this is the time to scroll up and read it again :)

    Run the following cmdlet to see all the virtual disks (the VHD file):

    Get-AzureDisk

    disk

    The above snapshot shows the disk I want to remove (you will need to make sure it is not attached to any virtual machine). Then run the following cmdlet:

    Remove-AzureDisk <DiskName>

    image
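    If you have many leftover disks, the same idea can be extended into a small cleanup sketch (assuming the Windows Azure PowerShell module is loaded and a subscription has been imported; note that the -DeleteVHD switch also deletes the underlying VHD blob, so use it carefully):

```powershell
# Remove every registered disk that is not attached to a virtual machine,
# deleting the underlying VHD blob as well.
Get-AzureDisk |
    Where-Object { $_.AttachedTo -eq $null } |
    ForEach-Object { Remove-AzureDisk -DiskName $_.DiskName -DeleteVHD }
```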

    Now you will probably decide to leave the VHDs and not delete them, which is easier :) and I don't blame you for that. But you will have this article as a reference in case someone one day asks you: how can I remove an unused VHD file?

    For more details about Azure PowerShell: http://msdn.microsoft.com/en-us/library/windowsazure/jj156055.aspx

    For more information about Azure management certificates: http://msdn.microsoft.com/en-us/library/windowsazure/gg981929.aspx

     

  • Office 365 Arabization – Part III

    In Part II I explained how to create users in bulk using a CSV file, and how to create the CSV file itself.

    In this part we will discuss some of the management tasks including password.

    Users Bulk Import with Password

    In Part II I explained how to prepare the CSV and create the users with randomly generated passwords, which then have to be sent to the users; check the following report generated after creating the users:

    clip_image002

    In large environments it may be required to create users with a predefined password (generated from an employee ID, for example) and, of course, force the user to change it at first logon. Check the following cmdlet:

    Import-Csv D:\arab-users.csv | ForEach-Object -Process {New-MsolUser -UserPrincipalName $_.upn -FirstName $_.firstname -LastName $_.lastname -Department $_.department -Title $_.title -City $_.city -Country $_.country -DisplayName $_.displayname -LicenseAssignment meamcs:ENTERPRISEPACK -UsageLocation EG -Password $_.password -ForceChangePassword $true} | Export-Csv -Path c:\arabusers-result.csv -Encoding unicode

    In the above cmdlet we have completed the following:

    1. Create the users in the CSV file with the mentioned attributes.

    2. Assign license with usage location (which is required).

    3. Set each user’s password according to the value in the CSV file.

    4. The user will be forced to change the password on the next logon ($true is the default value).

    5. Export the result in CSV file.

    Let's first discuss the second point: how I got the LicenseAssignment value. Run the following cmdlet to list the licenses in your tenant:

    05-01
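    The screenshot above shows the license listing; the cmdlet used for this (from the same MSOnline module used throughout this series) is most likely:

```powershell
# Lists the license plans in the tenant; the AccountSkuId column
# (for example "meamcs:ENTERPRISEPACK") is the value passed to -LicenseAssignment.
Get-MsolAccountSku
```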

    The last thing is to make sure that the passwords meet the following conditions:

    - Use 8 to 16 characters.

    - Create a strong password. Office 365 requires at least 3 of the following:

    o Lowercase characters

    o Uppercase characters

    o Numbers (0-9)

    o Symbols, including: ! @ # $ % ^ & * - _ + = [ ] { } | \ : ‘ , . ? / ` ~ “ < > ( ) ;

    - Don’t use the user name (the part of your user ID to the left of the @ symbol).

    - Unicode values are not allowed, so no Arabic passwords :)

    Users Bulk Modify

    After creating users, it may be required to modify user attributes. We will need our CSV file ready as per Part II; the following cmdlet shows an example of modifying some attributes:

    Import-Csv D:\arab-users.csv | ForEach-Object -Process {Set-MsolUser -UserPrincipalName $_.upn -FirstName $_.firstname -LastName $_.lastname -Department $_.department -Title $_.title -City $_.city -Country $_.country -DisplayName $_.displayname -UsageLocation EG -PasswordNeverExpires $true} | Export-Csv -Path d:\arabusers-result.csv -Encoding unicode

    Note:

    1. For each user in the CSV, identified by the UserPrincipalName value ($_.upn), we modified all of the above attributes to the values in the CSV file.

    2. UsageLocation value has been used with the same cmdlet.

    3. PasswordNeverExpires has been set, so the password will not expire (by default, passwords expire every 90 days).

    The following can’t be done from this cmdlet:

    1. Change the User Principal Name, which can be done through the following cmdlet:

    Import-Csv C:\newupn.csv | ForEach-Object -Process {Set-MsolUserPrincipalName -UserPrincipalName $_.upn -NewUserPrincipalName $_.upn1}

    2. Resetting the password, which can be done by the following cmdlet:

    Import-Csv d:\arab-userspassword.csv | ForEach-Object -Process {Set-MsolUserPassword -UserPrincipalName $_.upn -NewPassword $_.password}

    3. Assign license to the users, and this can be done by using the following cmdlet:

    Import-Csv d:\arab-userslicense.csv | ForEach-Object -Process {Set-MsolUserLicense -UserPrincipalName $_.upn -AddLicenses "meamcs:ENTERPRISEPACK"}

    One last nice trick is to do everything in one command by combining the above cmdlets inside the same ForEach-Object block (note that the Set-Msol* cmdlets produce no pipeline output, so they are chained with semicolons rather than pipes):

    Import-Csv -Path C:\arabusers.csv | ForEach-Object -Process {Set-MsolUser -UserPrincipalName $_.upn -UsageLocation EG; Set-MsolUserLicense -UserPrincipalName $_.upn -AddLicenses "meamcs:ENTERPRISEPACK"; Set-MsolUserPassword -UserPrincipalName $_.upn -NewPassword $_.password}

    And don't forget the quotation marks around the license name.

    I hope the above cmdlets were useful and saved you the time of trial and error. In the next part I will explain reports and Dynamic Distribution Groups.

    Other parts:

    Part I: Arabic Problems

    Part II: Users Bulk Creation

    Part III: Users Bulk Modification

    Part IV: Dynamic Distribution Group