One feature of SharePoint is the ability to create a calendar and make it a group calendar. This allows the user to see the schedules of a list of people at the same time, in the same view, as per the image below.
To configure the calendar as a group calendar, edit the List Settings, click “List name, description and navigation”, and select the radio button as per the image below.
All of this is SharePoint out-of-the-box functionality. So where is the problem?
The problem is that to see all the calendars of your work group, you need to add each person individually every time you open the calendar view! And I could not find any way to persist this information so that the calendar view would open automatically with the people already added, as per the list in the first image. So let’s get to work.
The way I fixed this is using JavaScript, with the following script:
var sel = SP.UI.ApplicationPages.CalendarSelector.instance().getSelector(1, 'WPQ12');
sel.selectEntities(ret, true);
You need two things to be able to run the above script: the ID of the calendar web part (WPQ12 above) and the entities XML used to add the required individuals. Once you have both, wire everything up as follows:
function _firstTime() {
    // Select the calendar tab so we can override the onclick method on some of the buttons.
    SelectRibbonTab('Ribbon.Calendar.Calendar', true);
    // Give the ribbon time to load.
    setTimeout('_doWireUp();', 2000);
}

function _doWireUp() {
    // Change the onclick event for the group buttons to make sure they reload our default group.
    var weekElem = document.getElementById('Ribbon.Calendar.Calendar.Scope.WeekGroup-Large');
    if (weekElem) weekElem.onclick = function() { setTimeout('_setDefaultResources();', 1000); return false; };
    var dayElem = document.getElementById('Ribbon.Calendar.Calendar.Scope.DayGroup-Large');
    if (dayElem) dayElem.onclick = function() { setTimeout('_setDefaultResources();', 1000); return false; };
    _setDefaultResources();
}

function _setDefaultResources() {
    // This is the entities XML from step 1.
    var ret = '\u003cEntities c…\u002fEntities\u003e';
    // Put the web part ID from step 2 here.
    var sel = SP.UI.ApplicationPages.CalendarSelector.instance().getSelector(1, 'WPQ12');
    sel.selectEntities(ret, true);
}

ExecuteOrDelayUntilScriptLoaded(_firstTime, "sp.ribbon.js");
This post describes an error that occurred during System Center Virtual Machine Manager 2012 SP1 deployment, with two VMM servers in cluster configuration.
When using the VMM Console, some users (but not all) were unable to create VMs, even though their user accounts were members of the Delegated Administrator role. When such a user selects “Create a new VM”, the following error message is displayed when “Next” is clicked on the “Configure Hardware” page:
ID 26726: “Either the specified user role or the specified user (%Username) is not valid. User is not a member of the role. Add (%Username) as a member of the user role and try again or provide a different user role or a different user.”
The same error persists even if the user accounts are made VMM Administrators.
This error originates from a known issue (http://support.microsoft.com/kb/331951) where the VMM service does not have access to authorization information on user account objects or computer account objects. Specifically, the VMM service cannot read the token-groups-global-and-universal (TGGAU) attribute in AD.
This issue is resolved by adding the VMM Service account to the Windows Authorization Access (Pre-Windows 2000 Compatible Access) group in AD.
In conclusion: if some users are unable to create VMs through the VMM Console due to error ID 26726, the VMM service is probably unable to verify whether those users are authorized to create VMs, and adding the VMM service account to the Windows Authorization Access (Pre-Windows 2000 Compatible Access) group resolves the issue.
I wanted to implement an Azure web site that uses the Azure Access Control Service (ACS) and integrates with an external identity provider to authenticate and authorize users. At first I thought of using Windows Live ID, but it has the problem that the only claim offered by WLID is the unique identifier, which is simply a number and reveals nothing about the user. Then I thought, why not make things more interesting and use Facebook? I think things got more interesting than I expected. :)
I wanted to implement a dynamic data web site that generates its views on top of an existing SQL Azure database and lets the end user manipulate and filter the database tables, using LINQ to SQL classes.
I am using Visual Studio 2012 latest version.
The steps at a glance are as follows:
1- Create your project:
a. Create your database.
b. Create a new Dynamic ASP.NET project.
c. Add a Linq-To-SQL model to your database.
d. Change the Framework version of the project.
e. Set the “ScaffoldAllTables” to true.
2- Download the latest “Identity and access tool”
3- Create a new Azure web site and download the publishing settings.
4- Set up your identity provider:
a. Create a new Azure ACS namespace
b. Create a new Facebook application
c. Configure your Facebook application.
d. Add your Facebook application as an identity provider.
e. Configure your claims rules
5- Set the ACS settings in the “Identity and access tool”
6- Implement your custom claims authorization manager
7- Complete the web.config configuration
8- Publish your web site.
This is the first step: as described, I wanted to create a dynamic ASP.NET site based on a custom database.
First I created the database in SQL Azure.
I created a new SQL Server, provided the administrator credentials, and allowed Azure services to access this server.
Then I allowed my IP address to access this server so that I could manage the database.
Then I started to design my database (this can be done either online or using Visual Studio).
Or from Visual Studio 2012
Once the database is created and ready to be used then you will go to the next step.
Open Visual Studio and click New Project. You will need to switch to .NET Framework 4.0 to see the ASP.NET Dynamic Data LINQ to SQL template.
Change the “Global.asax” file to uncomment the line that registers the LINQ to SQL context, so that it reads as follows:
DefaultModel.RegisterContext(typeof(TestDbDataClassesDataContext), new ContextConfiguration() { ScaffoldAllTables = true });
Change the .NET Framework version to 4.5 to be able to see the Identity and Access Tool link.
From the Visual Studio Tools menu, click “Extensions and Updates”.
Search for the “Identity and Access Tool”, install it, and restart Visual Studio.
Open the Azure management site and click on new web site.
Click on “Download the publish profile”
Save this file on your hard disk. Now, in Visual Studio, click on the project and then Publish.
Click on “Import”
Now select the publish settings file already downloaded.
Complete the publishing and test that the application is now published and working.
The next step is to configure Facebook as an identity provider.
Log on to your Facebook account and then open http://developers.facebook.com
Register yourself as a developer.
Now click on Apps and then create a new App
Give your application a name
Take note of your application ID and secret.
Also enter the URL of the ACS namespace you will create on the Azure ACS web site in the next step (it is better to create that namespace first and then return to this step).
Click save Changes.
Open the Azure portal and click New to create a new ACS namespace.
Once created you can click on the Manage link to start managing it.
Start by adding the Facebook application as an identity provider. Click on identity providers and then “Add”
Enter the application ID and secret and click “Save”
While on the ACS management site, click “Management service” to see all the management links required to be able to communicate with ACS.
Click on “Show Key” and then copy the symmetric key generated.
These will be the namespace and the management key of the namespace.
Right click on your project and then click “Identity and Access tool”
After selecting “Use the Windows Azure Access Control Service”, configure your ACS with the namespace and the key you copied earlier.
Now select the settings as below
Finally, click “OK”. This will configure your ACS service and the relying party application with the required pass-through rules for all provider claims, as shown below.
The next steps are to create and implement the Claims authorization manager and configure your web.config.
Since we are using .NET Framework 4.5, this is a little different from what we used to do in 3.5, because WIF is now fully integrated into the framework.
Add a new class to the project so that the code of the file is as follows.
using System.IO;
using System.Xml;
using System.Collections.Generic;
using System;
using System.Web;
using System.Linq;
using System.Security.Claims;

namespace TestDbWebApplication
{
    public class MyAuthorizationManager : ClaimsAuthorizationManager
    {
        private static Dictionary<string, string> m_policies = new Dictionary<string, string>();

        public MyAuthorizationManager()
        {
        }

        public override void LoadCustomConfiguration(XmlNodeList nodes)
        {
            foreach (XmlNode node in nodes)
            {
                // Find the name claim in each policy in web.config and add its
                // value to the module-scope m_policies dictionary.
                XmlTextReader reader = new XmlTextReader(new StringReader(node.OuterXml));
                reader.MoveToContent();
                string resource = reader.GetAttribute("resource");
                reader.Read();
                string claimType = reader.GetAttribute("claimType");
                if (claimType.CompareTo(ClaimTypes.Name) != 0)
                    throw new ArgumentNullException("Name authorization is not specified in the policy in web.config");
                string name = reader.GetAttribute("Name");
                m_policies[resource] = name;
            }
        }

        public override bool CheckAccess(AuthorizationContext context)
        {
            // Get the identity and compare its name claim with the policy.
            string allowednames = "";
            string requestingname = "";
            Uri webPage = new Uri(context.Resource[0].Value);
            ClaimsPrincipal principal = (ClaimsPrincipal)HttpContext.Current.User;
            if (principal == null)
                throw new InvalidOperationException("Principal is not populated in the context - check configuration");
            ClaimsIdentity identity = (ClaimsIdentity)principal.Identity;
            requestingname = (from c in identity.Claims
                              where c.Type == ClaimTypes.Name
                              select c.Value).FirstOrDefault() ?? "";
            if (m_policies.ContainsKey(webPage.PathAndQuery))
            {
                allowednames = m_policies[webPage.PathAndQuery];
            }
            else if (m_policies.ContainsKey("*"))
            {
                allowednames = m_policies["*"];
            }
            if (requestingname.Length > 0 && allowednames.ToLower().Contains(requestingname.ToLower()))
                return true;
            return false;
        }
    }
}
This would authorize users based on their login name reported by the identity provider (Facebook).
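The authorization decision above reduces to a dictionary lookup with a wildcard fallback and a case-insensitive name match. Here is the same logic as a minimal sketch (Python purely for illustration; the actual implementation is the C# class above):

```python
def check_access(policies, path, user_name):
    """Sketch of the CheckAccess decision: look up the allowed names for
    the requested path, fall back to the "*" wildcard policy, and compare
    the requesting user's name claim case-insensitively."""
    if path in policies:
        allowed_names = policies[path]
    elif "*" in policies:
        allowed_names = policies["*"]
    else:
        return False
    return user_name.lower() in allowed_names.lower()

policies = {"*": "Mohamed Malek"}
print(check_access(policies, "/Default.aspx", "Mohamed Malek"))  # True
print(check_access(policies, "/Default.aspx", "Someone Else"))   # False
```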
The next required change is to add the following lines to web.config:
<system.webServer>
<modules>
<add name="ClaimsAuthorizationModule" type="System.IdentityModel.Services.ClaimsAuthorizationModule, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" preCondition="managedHandler" />
</modules>
</system.webServer>
Please note this module registration, as it is key to making this work.
<system.identityModel>
<identityConfiguration>
<securityTokenHandlers>
<remove type="System.IdentityModel.Tokens.SessionSecurityTokenHandler,System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
<add type="System.IdentityModel.Services.Tokens.MachineKeySessionSecurityTokenHandler,System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
</securityTokenHandlers>
</identityConfiguration>
</system.identityModel>
Please note the MachineKeySessionSecurityTokenHandler registration, as publishing to an Azure web site will not work without these lines.
Finally, the claims authorization manager configuration:
<claimsAuthorizationManager type="TestDbWebApplication.MyAuthorizationManager, TestDbWebApplication">
<policy resource="*">
<claim claimType="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name" Name="Mohamed Malek" />
</policy>
</claimsAuthorizationManager>
Now that you have completed the site configuration, publish it; you should be able to authenticate using Facebook, authorize the application to access your Facebook details, and finally have the user authorized.
Logon to Facebook as usual.
The app will request the user’s permission to release the user’s details to the ACS service.
The web site will then work as required.
Happy coding :)
In Part II I explained how to create bulk users using a CSV file, and how to create the CSV file.
In this part we will discuss some of the management tasks, including passwords.
Users Bulk Import with Password
In Part II I explained how to prepare the CSV and create the users with a randomly generated password, which then has to be sent to the users; check the following report created after the users were added:
In large environments it may be required to create users with a known password (generated from an employee ID, for example) and, of course, force the user to change it at first logon; check the following cmdlet:
Import-Csv D:\arab-users.csv | ForEach-Object -Process {New-MsolUser -UserPrincipalName $_.upn -FirstName $_.firstname -LastName $_.lastname -Department $_.department -Title $_.title -City $_.city -Country $_.country -DisplayName $_.displayname -LicenseAssignment meamcs:ENTERPRISEPACK -UsageLocation EG -Password $_.password -ForceChangePassword $true} | Export-Csv -Path c:\arabusers-result.csv -Encoding unicode
In the above cmdlet we have completed the following:
1. Create the users in the CSV file with the mentioned attributes.
2. Assign license with usage location (which is required).
3. Set each user’s password according to the value in the CSV file.
4. The user will be forced to change the password on the next logon ($true is the default value).
5. Export the result in CSV file.
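For reference, the CSV file consumed by the cmdlet above needs one column per $_.<name> reference in the script block. A minimal example (the domain and all values here are made up):

```
upn,firstname,lastname,department,title,city,country,displayname,password
ahmed.sami@contoso.com,Ahmed,Sami,IT,Engineer,Cairo,Egypt,Ahmed Sami,P@ssw0rd1
```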
Let’s first discuss the second point: how I got the LicenseAssignment value. Run the following cmdlet to list the licenses available in your tenant:
Get-MsolAccountSku
Last thing is to make sure that the password will be according to the below conditions:
- Use 8 to 16 characters.
- Create a strong password. Office 365 requires at least 3 of the following:
o Lowercase characters
o Uppercase characters
o Numbers (0-9)
o Symbols, including: ! @ # $ % ^ & * - _ + = [ ] { } | \ : ' , . ? / ` ~ " < > ( ) ;
- Don’t use the user name (the part of your user ID to the left of the @ symbol).
- Unicode characters are not allowed, so no Arabic passwords :)
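The rules above can be captured in a short validation function, handy for checking the password column of the CSV before running the import. This is an illustrative sketch of the policy as described, not Microsoft's actual validator:

```python
SYMBOLS = set("!@#$%^&*-_+=[]{}|\\:',.?/`~\"<>();")

def is_valid_o365_password(password, username):
    """Check a password against the Office 365 rules described above.

    Illustrative sketch only; `username` is the part of the user ID
    to the left of the @ symbol.
    """
    # 8 to 16 characters
    if not 8 <= len(password) <= 16:
        return False
    # ASCII only (no Unicode, e.g. no Arabic)
    if not all(ord(c) < 128 for c in password):
        return False
    # Must not contain the user name
    if username.lower() in password.lower():
        return False
    # At least 3 of the 4 character categories
    categories = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in SYMBOLS for c in password),
    ]
    return sum(categories) >= 3

print(is_valid_o365_password("P@ssw0rd", "ahmed"))  # True
print(is_valid_o365_password("password", "ahmed"))  # False
```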
Users Bulk Modify
After creating users it may be required to modify their attributes, so we will need to have our CSV file ready as per Part II; the following cmdlet shows an example of modifying some attributes:
Import-Csv D:\arab-users.csv | ForEach-Object -Process {Set-MsolUser -UserPrincipalName $_.upn -FirstName $_.firstname -LastName $_.lastname -Department $_.department -Title $_.title -City $_.city -Country $_.country -DisplayName $_.displayname -UsageLocation EG -PasswordNeverExpires $true} | Export-Csv -Path d:\arabusers-result.csv -Encoding unicode
Note:
1. From the CSV and for each user with this UserPrincipalName value ($_.upn) we have modified all of the above attributes to the values in the CSV file.
2. UsageLocation value has been used with the same cmdlet.
3. PasswordNeverExpires has been set, so the password will not expire (by default the password expires every 90 days).
The following can’t be done from this cmdlet:
1. Change the User Principal Name, which can be done through the following cmdlet:
Import-Csv C:\newupn.csv | ForEach-Object -Process {Set-MsolUserPrincipalName -UserPrincipalName $_.upn -NewUserPrincipalName $_.upn1}
2. Resetting the password, which can be done by the following cmdlet:
Import-Csv d:\arab-userspassword.csv | ForEach-Object -Process {Set-MsolUserPassword -UserPrincipalName $_.upn -NewPassword $_.password}
3. Assign license to the users, and this can be done by using the following cmdlet:
Import-Csv d:\arab-userslicense.csv | ForEach-Object -Process {Set-MsolUserLicense -UserPrincipalName $_.upn -AddLicenses "meamcs:ENTERPRISEPACK"}
One last nice trick: modify everything with a single command by chaining all the above cmdlets inside the ForEach-Object script block, as follows:
Import-Csv -Path C:\arabusers.csv | ForEach-Object -Process {Set-MsolUser -UserPrincipalName $_.upn -UsageLocation EG; Set-MsolUserLicense -UserPrincipalName $_.upn -AddLicenses "meamcs:ENTERPRISEPACK"; Set-MsolUserPassword -UserPrincipalName $_.upn -NewPassword $_.password}
And don’t forget the quotation marks around the license name.
I hope the above cmdlets were useful and saved you the time of trial and error. In the next part I will cover reports and Dynamic Distribution Groups.
Other parts:
Part I: Arabic Problems
Part II: Users Bulk Creation
Part III: Users Bulk Modification
Part IV: Dynamic Distribution Group
This is a well-known issue: when you create many virtual machines on Windows Azure and later delete them, the virtual disks (VHD files) are still there.
So far there is no way to remove these VHD files using the GUI, so I started searching for how to do it through PowerShell. The challenge with Azure PowerShell is that most of the available resources target developers using Visual Studio, and for me anything that starts with Visual Studio is a sort of mystery (one of my developer friends once defined a SharePoint site collection as “a site that contains collections”, true story) best left to the wise men (like this friend) who can understand it.
For infrastructure guys like me, here is a step-by-step guide on how to use PowerShell to connect to Azure and then delete the VHDs:
Step 1: Management Certificate:
In simple words, we need a certificate to connect PowerShell to Azure. This certificate should be at least 2048 bits; a self-signed certificate is a good option for testing and labs.
Creating a self-signed certificate can be done in many ways; the easiest is using plain PowerShell:
1. Open PowerShell (as an Administrator).
2. Type the following cmdlet:
New-SelfSignedCertificate -DnsName azure -CertStoreLocation cert:\LocalMachine\My
This cmdlet will create a new self-signed certificate and place it in the local machine store; check the following snapshots:
Let’s make sure that the certificate is created:
1. From Run > Type mmc.
2. Click File > Add/Remove Snap-in.
3. Select Certificate and Click Add.
4. Select Computer Account > Finish.
5. Under Personal > Certificate > you should find the new certificate “azure”:
The next step is to upload this certificate with its private key (the PFX file) to Windows Azure. In order to do that, you will need to export the PFX file (the certificate with the private key) first:
1. From the same Snap-in.
2. Right click the certificate > All Tasks > Export.
3. Click Next > Select “Yes Export the Private Key” > Click Next.
4. On Export File Format click next > Then provide the password.
5. Browse to the location where you will save the certificate.
From the Azure Management Portal, create a new cloud service to use with the certificate:
After having the PFX file we will need to upload it to Windows Azure:
Log into the Management Portal.
In the navigation pane, click Cloud Services, and then click the service for which you want to add a new certificate.
On the ribbon, click Certificates, and then click Upload a Certificate.
In the Upload certificate dialog, click Browse For File, go to the directory containing your certificate, and select the .pfx file to upload.
Type the password of the private key for the certificate.
Click OK.
Step 2: Install Azure PowerShell
To install Windows Azure PowerShell:
Step 3: Connect to Azure:
Before we start using the Windows Azure cmdlets to remove the VHDs or anything else, we will need to configure connectivity between the workstation and Windows Azure. This can be done by downloading the PublishSettings file (this file will setup the PowerShell environment to use Windows Azure) from Windows Azure and importing it. The settings for Windows Azure PowerShell are stored in: <user>\AppData\Roaming\Windows Azure PowerShell.
Get-AzurePublishSettingsFile
A browser window opens at https://windows.azure.com/download/publishprofile.aspx, where you can sign in to Windows Azure.
Import-AzurePublishSettingsFile FileName.publishsettings
Step 4: Remove the VHDs (Finally)
Now that we are connected to Azure, we can manage it with PowerShell and do anything we need. If you have forgotten the purpose of this article, this is the time to scroll up and read it again.
Run the following cmdlet to see all the virtual disks (the VHD file):
Get-AzureDisk
In the above snapshot, this is the disk that I want to remove (you will need to make sure that it is not attached to any virtual machine); then run the following cmdlet:
Remove-AzureDisk <DiskName>
Now you will probably decide to leave the VHDs and not delete them, which is easier, and I don’t blame you for that. But you will have this article as a reference in case someone one day asks you: how can I remove an unused VHD file?
For more details about Azure PowerShell: http://msdn.microsoft.com/en-us/library/windowsazure/jj156055.aspx
For more information about Azure management certificates: http://msdn.microsoft.com/en-us/library/windowsazure/gg981929.aspx
Introduction
While working on one of my projects I had a customized InfoPath form with hundreds of fields and multiple views. As I went through it, I suspected that many of these fields were not used; so how do you find which fields are used and which aren't? In this post I will give you some suggestions for tackling such an issue.
Suggestion # 1
The easiest choice is to start with the InfoPath "Rule Inspector", which will help you go through all the fields. However, the Rule Inspector will not find fields that are not bound in the form (such as fields assigned using code-behind), so if you have custom development the "Rule Inspector" could be a good start, but it will not be enough.
Suggestion # 2
If you have some code-behind in your InfoPath form, you need to search through the code as well. A good place to start is to open the "FormCode.cs" file using a development tool such as Visual Studio and start searching.
Suggestion # 3
Another suggestion is to "Export Source Files" and search through the "manifest.xsf". To explain this approach, let me start by explaining more about InfoPath forms: the InfoPath form ".XSN file" consists of more than one source file consolidated into a single ".XSN file" for convenience. What we aim to do in this approach is extract these source files and search through them.
An example using InfoPath 2010:
After that, open the "manifest.xsf" file in a development tool such as Visual Studio and start searching through the field names.
Suggestion # 4
This is an extension of the previous step, where you would like to determine which field is used in which view. To do so, start by exporting the form to source files as shown in the previous step and search through the XSL files; each XSL file represents a view in the InfoPath form.
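To automate suggestions 3 and 4, a short script can search every exported source file for each field name and report where it appears. This is a rough sketch (a plain text search, so a field name that is a substring of another name can give false positives), and the helper names here are mine, not part of InfoPath:

```python
import os

def find_field_usage(sources, field_names):
    """Given a mapping of file name -> file content, report which
    source files mention each field name. A field that appears in no
    file is a candidate for removal from the form."""
    usage = {name: [] for name in field_names}
    for fname, content in sources.items():
        for name in field_names:
            if name in content:
                usage[name].append(fname)
    return usage

def load_exported_sources(source_dir):
    """Read manifest.xsf and the per-view .xsl files from the folder
    the form was exported to (see the steps above)."""
    sources = {}
    for fname in os.listdir(source_dir):
        if fname.endswith((".xsf", ".xsl", ".xsd")):
            path = os.path.join(source_dir, fname)
            with open(path, encoding="utf-8", errors="ignore") as f:
                sources[fname] = f.read()
    return sources

# Example with made-up file contents:
sources = {
    "manifest.xsf": "<xsf:field name='EmployeeName'/>",
    "view1.xsl": "select='my:EmployeeName'",
}
print(find_field_usage(sources, ["EmployeeName", "UnusedField"]))
# {'EmployeeName': ['manifest.xsf', 'view1.xsl'], 'UnusedField': []}
```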
Hope you found this information useful.