Service Management Automation and SharePoint - #MVP

Goal: End to End Service Delivery

Ryan Andorfer, a System Center: Cloud and Datacenter MVP, and I have been talking about our experiences with automation and where we can provide more practical guidance than the typical high-level demonstration or complex scalability scenario. Ryan works as a Sr. IT Automation Engineer for General Mills and has developed, and shared, a great many processes and examples. November brought Ryan to the Microsoft campus for the annual MVP summit, and during his time here we discussed how to make more of these solutions available. Below is the interview I did with Ryan about automating with SMA, along with some samples to help get you started.

Background

Ryan’s team has one primary objective: delivering end-to-end services to their consumers of IT services. The services they provide range from self-service re-imaging of end users’ computers, to automated SCCM packaging, to creating Exchange mailboxes and distribution lists. Ryan has been doing this automation work for about three years now, solely on the Microsoft stack. Over that time he and his team have worked with a number of delivery ‘stacks’, each with its own pros and cons (outlined chronologically below).

1. Opalis 6.3 automation engine with custom .NET web front ends

2. System Center Orchestrator 2012 with the Service Manager Self-Service Portal

3. System Center Orchestrator 2012 with SharePoint

Throughout these changes Ryan’s team kept the same ‘automation engine’: the legacy Opalis engine. This engine is based on integration packs and has served them very well; its logging capabilities, graphical design surface, and web service allowed them to automate every request their end users dreamed up. The constant battle was with the user interface. Custom .NET websites let you do anything, but the time required to design and support them made that an unsupportable platform for the number of requests being served. The suggested UI for this purpose, the Service Manager Self-Service Portal, was not flexible enough for them and posed some performance and stability issues. That landed them on their current platform: SharePoint lists with InfoPath forms. For now they believe this is a viable option that provides the stability and flexibility they want. In the current model Ryan’s team uses Orchestrator with Jeff Fanjoy’s SharePoint integration pack (http://orchestrator.codeplex.com/releases/view/75877). The process is to poll each list that has an InfoPath form attached to it, using custom CAML queries to return only the items whose status indicates they should be processed. This has worked very well for them and scales with great performance.
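As a rough illustration of that polling filter (the field name and status value below are placeholders, not taken from Ryan’s environment), a CAML query that returns only unprocessed items might look like this, held in a PowerShell here-string:

```powershell
# Hypothetical CAML query: return only list items whose Status field is 'New'.
# The field and value names are illustrative placeholders.
$caml = @"
<Query>
  <Where>
    <Eq>
      <FieldRef Name='Status' />
      <Value Type='Text'>New</Value>
    </Eq>
  </Where>
</Query>
"@
```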

Swapping out the Engine

Up until this point the changes they have made have all been UI changes. With the 2012 R2 wave of releases from Microsoft, however, they received a new automation engine from the Orchestrator team called Service Management Automation, or SMA for short. This product has a very similar architecture to Orchestrator, which makes sense given that it is designed by the Orchestrator team. To take advantage of the solution they are leveraging the Windows Azure Pack (WAP). For more detailed information on the architecture of SMA, refer to http://technet.microsoft.com/en-us/library/dn469259.aspx
The largest difference is that the Runbook Workers no longer run the Opalis engine; they now execute in the PowerShell Workflow engine. This has a number of large benefits for automation engineers.

Benefits

1. It allows authoring automation in PowerShell (supports v4, 64-bit!)

a. For many this is huge, as the amount of reference material and scripts for PowerShell is much larger than it ever was for Orchestrator

·   Ryan and team believe this means less ‘re-inventing the wheel’ for the team

·   There is more PowerShell experience on other infrastructure teams inside their company than there was Orchestrator experience

“Normal automation tasks usually involved taking an existing PowerShell script from a team, reconfiguring it to work in Orchestrator, and building an input UI. We believe that there will be less re-configuring that our team needs to do going forward, and more of this work can be pushed back out to the external teams,” said Ryan.

b. The number of integration points for PowerShell is huge

·   The number of PowerShell modules from Microsoft and other vendors dwarfs the number of integration packs

·   PowerShell is much more adopted in the industry and is the standard for automating infrastructure. This level of adoption means that it makes sense financially for third parties to make PowerShell modules. This was not necessarily true for Orchestrator Integration Packs.

·   Just like with Orchestrator, if you can’t find a module you can create one. In Ryan’s opinion, creating PowerShell modules is far better documented and more supportable than creating Integration Packs was (and Ryan has created his fair share of both).
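To give a feel for how approachable module authoring is, here is a minimal, hypothetical script module (the file name and function are invented for illustration, not part of Ryan’s tooling):

```powershell
# SharePointHelpers.psm1 -- a hypothetical minimal script module.
# Save this as a .psm1 file and load it with Import-Module.

function Get-ListItemStatus
{
    Param([String]$listURI, [PSCredential]$credential)
    # Read a single SharePoint list item over REST and return its Status property
    $item = Invoke-RestMethod -Uri $listURI -Credential $credential
    $item.Entry.Content.Properties.Status
}

# Only export the functions callers should see
Export-ModuleMember -Function Get-ListItemStatus
```

After `Import-Module .\SharePointHelpers.psm1`, the function is callable like any built-in cmdlet.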

c. If there isn’t a PowerShell module, you can fall back to .NET

·   PowerShell really is just another interface to the .NET runtime. This means that if something cannot be done with native modules, you can drop down to .NET classes and get the work done. This is a large benefit if you have developers available to your team
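A concrete example of this fallback: PowerShell v4 has no native cmdlet for creating zip archives (Compress-Archive arrived in v5), but the .NET Framework class is one Add-Type away. The paths below are illustrative placeholders:

```powershell
# No native zip cmdlet in PowerShell v4, so load the .NET assembly and call it directly.
Add-Type -AssemblyName System.IO.Compression.FileSystem

# Zip a folder (paths are placeholders)
[System.IO.Compression.ZipFile]::CreateFromDirectory('C:\Logs', 'C:\Archive\Logs.zip')
```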

2. Checkpointing becomes a reality

a. You now have the ability to checkpoint your automation. This means you can instrument your automation to save its state once it reaches a certain point

·   This allows you to ‘stop’ running automation and later ‘resume’ it from where it left off

b. Better resiliency: if a Runbook server crashes for any reason, all automation can now restart from the latest checkpoint instead of completely re-running

c. This hugely simplifies building automation that is ‘enterprise grade’ and can handle infrastructure-related failures
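A minimal sketch of what checkpointing looks like in a PowerShell Workflow (the workflow name and steps are invented for illustration):

```powershell
Workflow Invoke-LongTask
{
    # Hypothetical expensive step we never want to repeat
    "Step 1 complete"
    Checkpoint-Workflow   # persist workflow state; a crash or suspension resumes from here

    # If the Runbook worker dies after the checkpoint above,
    # resuming the job re-enters the workflow at this point
    "Step 2 complete"
    Checkpoint-Workflow
}
```

In SMA, a suspended or interrupted job picks up from the most recent checkpoint when resumed, rather than re-running from the top.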

3. Everything that can be done through a UI can be done through code

a. This is huge: there are no ‘hidden’ calls or undocumented COM classes. If it can be done through the WAP UI, it can be coded directly

b. Environmental tasks like change control and code promotion can be automated! In fact, some people from my team at Microsoft have already built us examples of how this can be done (thanks, Jim!).

c. Tasks like nightly Runbook backups to TFS should be very easy to codify

·   Plus, because you are now exporting PowerShell scripts, diffs between versions in TFS will actually be human readable!
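A hedged sketch of what that nightly backup could look like using the SMA PowerShell cmdlets (the endpoint, output path, and object property names are assumptions to verify against your installation):

```powershell
# Hypothetical nightly export of published runbook definitions to .ps1 files
# that can be checked into TFS. Endpoint and path are placeholders.
$endpoint = 'https://sma-server.contoso.com'

foreach($rb in Get-SmaRunbook -WebServiceEndpoint $endpoint)
{
    $def = Get-SmaRunbookDefinition -WebServiceEndpoint $endpoint `
                                    -Id $rb.RunbookID -Type Published
    # Write the runbook body out as plain PowerShell for human-readable diffs
    $def.Content | Out-File -FilePath ("C:\RunbookBackup\{0}.ps1" -f $rb.RunbookName)
}
```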

4. Better load balancing

a. Gone is the ‘spill-over’ load balancing of Orchestrator, which often left resources and Runbook servers unused. Now, when a Runbook server is brought online, a portion of the job cache is assigned to it. This means you should see load distributed across all Runbook servers

b. Note that this balancing happens when the Runbook server is brought online, and some coordination is needed between all Runbook servers for scale-in/scale-out operations. It also means the balancing is static, not responsive to load on any one Runbook server

c. See http://technet.microsoft.com/en-us/library/dn530618(v=sc.20).aspx for more information

5. Easier-to-understand parallel and sequential processing
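As an illustration of point 5, PowerShell Workflow makes the intent explicit with the parallel and sequence keywords (this toy workflow is mine, not from Ryan’s environment):

```powershell
Workflow Invoke-ParallelDemo
{
    parallel
    {
        # These statements start concurrently...
        Get-Process | Select-Object -First 3
        Get-Service | Select-Object -First 3

        sequence
        {
            # ...while everything inside a sequence block runs in order
            Get-Date
            "finished after Get-Date"
        }
    }
}
```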

Drawbacks

When talking with Ryan about his experiences as an MVP who focuses so much of his energy on automation, these were his initial concerns:

1. SMA is a v1.0 product

·         PowerShell Workflow, the engine, has been around for a while, and its underlying engine, Windows Workflow Foundation, has been around even longer; nevertheless, SMA itself is v1

2. SMA has only one role, administrator. If you have access to SMA, you have access to everything

3. SMA has no graphical designer for Runbooks like Orchestrator had

“This isn’t a showstopper for us, and I would challenge anyone who says it is a huge showstopper if they have actually used Orchestrator. I cannot recall one Runbook I have written in the last 3 years that didn’t require me to use PowerShell in some capacity. This means that if you have been doing Orchestrator work successfully, you probably know PowerShell in some capacity. Continuing to invest in that knowledge is a good bet and will pay dividends,” commented Ryan.

Decision

Ryan and his team have decided to investigate what it would look like to stop doing automation work in Orchestrator and begin doing all new work in SMA. The deciding factor was future compatibility.

Next Steps

Given that their current front end is SharePoint, step 1 is determining how to integrate SMA with SharePoint! To that end Ryan has a simple monitor script that checks for new list items and processes them. The example shows how to process items serially and in parallel using PowerShell Workflow and SMA.

Example:

The PowerShell Workflow Script (Not SMA integrated)

Workflow Monitor-Demo
{
    Param([PSCredential] $credential)

    Function Update-SharePointListItem
    {
        Param([String]$listURI, [String]$PropertyName, [String]$PropertyValue, [PSCredential]$credential)
        # Build a minimal JSON body, e.g. {Status: 'Executing'}, and MERGE it into the item
        $body = "{" + [String]::Format("{0}: '{1}'", $PropertyName, $PropertyValue) + "}"
        Invoke-RestMethod -Method Merge -Uri $listURI -Body $body -ContentType "application/json" -Headers @{"If-Match"="*"} -Credential $credential
    }

    # Get the credential to authenticate to the SharePoint list with
    #$credential = (Get-Credential)

    # Time between polling (seconds)
    $delay = 5

    # SharePoint site
    $spSite = "http://portal.contoso.com/sites/selfservice"

    # List name
    $spList = "Demo"

    # Filter -- in this example only process requests in the 'new' state (the default status on the SP list created for this demo)
    $spFilter = "Status eq 'new'"
    #$spFilter = ""

    # Combined URI for the list's REST (listdata.svc) endpoint
    if($spFilter) { $uri = [System.String]::Format("{0}/_vti_bin/listdata.svc/{1}?`$filter={2}", $spSite, $spList, $spFilter) }
    else          { $uri = [System.String]::Format("{0}/_vti_bin/listdata.svc/{1}", $spSite, $spList) }

    while($true)
    {
        # Query the list and collect the URI of each matching item
        $listItemURIs = InlineScript {
            $itemIDs = @()
            $listItems = Invoke-RestMethod -Uri $Using:uri -Credential $Using:credential
            foreach($li in $listItems)
            {
                $itemIDs += $li.id
            }
            $itemIDs
        }

        if($listItemURIs)
        {
            foreach -Parallel ($listItemURI in $listItemURIs)
            {
                # Get the requested user name -- an example of reading item properties
                # ($listItem.Entry.Content.Properties is the base; .UserName reads the UserName property)
                $userName = InlineScript {
                    $listItem = Invoke-RestMethod -Uri $Using:listItemURI -Credential $Using:credential
                    $listItem.Entry.Content.Properties.UserName
                }

                # Change the Status property
                Update-SharePointListItem -listURI $listItemURI -PropertyName "Status" -PropertyValue "Executing" -credential $credential
                Checkpoint-Workflow

                Start-Sleep -Seconds 10

                # Change the Name property
                Update-SharePointListItem -listURI $listItemURI -PropertyName "UserName" -PropertyValue "$userName - New Name" -credential $credential

                # Change the Status property
                Update-SharePointListItem -listURI $listItemURI -PropertyName "Status" -PropertyValue "Completed" -credential $credential
                Checkpoint-Workflow
            }
        }
        Start-Sleep -Seconds $delay
    }
}

Monitor-Demo -credential (Get-Credential)

Changing the Script to be SMA integrated

SMA lets them store credentials in a secure, encrypted fashion (just like Orchestrator did). It also lets them pull configuration items, like the SharePoint site address, out of the script and into variable assets. After those replacements the script looks like this (the SMA-specific changes are the Get-AutomationPSCredential and Get-AutomationVariable calls):

Workflow Monitor-Demo
{
    Function Update-SharePointListItem
    {
        Param([String]$listURI, [String]$PropertyName, [String]$PropertyValue, [PSCredential]$credential)
        $body = "{" + [String]::Format("{0}: '{1}'", $PropertyName, $PropertyValue) + "}"
        # Write the target URI and body to the job output stream (useful for troubleshooting in SMA)
        $listURI
        $body
        Invoke-RestMethod -Method Merge -Uri $listURI -Body $body -ContentType "application/json" -Headers @{"If-Match"="*"} -Credential $credential
    }

    # Get the credential to authenticate to the SharePoint list with (stored securely in SMA)
    $credential = Get-AutomationPSCredential -Name 'CONTOSO\Automation Account'

    # SharePoint site (pulled from an SMA variable asset)
    $spSite = Get-AutomationVariable -Name 'SharePoint Site Base'

    # List name
    $spList = "Demo"

    # Filter -- in this example only process requests in the 'new' state
    $spFilter = "Status eq 'new'"
    #$spFilter = ""

    # Combined URI for the list's REST (listdata.svc) endpoint
    if($spFilter) { $uri = [System.String]::Format("{0}/_vti_bin/listdata.svc/{1}?`$filter={2}", $spSite, $spList, $spFilter) }
    else          { $uri = [System.String]::Format("{0}/_vti_bin/listdata.svc/{1}", $spSite, $spList) }

    while($true)
    {
        # Time between polling (pulled from an SMA variable asset)
        $delay = Get-AutomationVariable -Name 'SharePoint Monitor Delay'

        # Query the list and collect the URI of each matching item
        $listItemURIs = InlineScript {
            $itemIDs = @()
            $listItems = Invoke-RestMethod -Uri $Using:uri -Credential $Using:credential
            foreach($li in $listItems)
            {
                $itemIDs += $li.id
            }
            $itemIDs
        }

        if($listItemURIs)
        {
            foreach -Parallel ($listItemURI in $listItemURIs)
            {
                # Get the requested user name -- an example of reading item properties
                # ($listItem.Entry.Content.Properties is the base; .UserName reads the UserName property)
                $userName = InlineScript {
                    $listItem = Invoke-RestMethod -Uri $Using:listItemURI -Credential $Using:credential
                    $listItem.Entry.Content.Properties.UserName
                }

                # Change the Status property
                Update-SharePointListItem -listURI $listItemURI -PropertyName "Status" -PropertyValue "Executing" -credential $credential
                Checkpoint-Workflow

                Start-Sleep -Seconds 10

                # Change the Name property
                Update-SharePointListItem -listURI $listItemURI -PropertyName "UserName" -PropertyValue "$userName - New Name" -credential $credential

                # Change the Status property
                Update-SharePointListItem -listURI $listItemURI -PropertyName "Status" -PropertyValue "Completed" -credential $credential
                Checkpoint-Workflow
            }
        }
        Start-Sleep -Seconds $delay
    }
}
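Once imported and published in SMA, the monitor can be started from the WAP portal or, consistent with benefit 3 above, from PowerShell (the endpoint below is a placeholder for your SMA web service):

```powershell
# Start the published Monitor-Demo runbook through the SMA web service.
# The endpoint URL is an illustrative placeholder.
Start-SmaRunbook -WebServiceEndpoint 'https://sma-server.contoso.com' -Name 'Monitor-Demo'
```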

Example in Action: SharePoint List


Example in Action: SMA


In Closing

One of our goals as a community team working with our MVPs is to provide practical examples of how organizations just like yours are utilizing the CloudOS technologies. The Building Clouds blog has posts on many other practical solutions you can start to build and use right away. I highly recommend you head over to Building Clouds and start reading through some of them!

About Ryan Andorfer

Ryan is a Sr. IT Automation Engineer working for General Mills. Ryan’s team is responsible for delivering IT services in a cloud (self-service) model, which they accomplish using the entire Microsoft stack, including SharePoint, Service Management Automation, and SQL. The team has a very broad automation portfolio, covering everything from end-user-initiated re-imaging of PCs and automated application packaging to AD and Exchange provisioning.

You can reach Ryan and find out more about him here:

Blog: http://opalis.wordpress.com

Project coordinator for http://scorch.codeplex.com 

MVP Profile: http://mvp.microsoft.com/en-us/mvp/Ryan%20Andorfer-5000029

/Enjoy!

Christian Booth (ChBooth) | Sr. Program Manager | Cloud & Enterprise

Program Lead: System Center: Cloud & Datacenter MVP

 

 

  • A very interesting article. It makes me stop and ask "what is the future of Orchestrator?" Should a greenfield customer be looking at Orchestrator at all or focus on SMA instead?