...building hybrid clouds that can support any device from anywhere
Yes, it’s only been a couple of minutes since Charles published parts 1 to 4 of this series, and maybe you thought we would rest a bit and take a breath before publishing new content… Well, not really!
Today, we also have part 5 of the series available for you to read, and it brings a new set of sample scripts, this time in the context of working with the SQL Server resource provider in Windows Azure Pack (WAP).
In this post, we will demonstrate the following:
And as a bonus, I will include some user and subscription samples at the end. These will be quite short and more of a teaser, as Charles is preparing a more in-depth series on this topic.
Having the ability to run scripts as both an administrator and a tenant is critical for automation, especially in enterprise environments. A few examples:
- Performing actions on behalf of a tenant, after approvals in an ITSM solution
- Collecting data available in the admin and/or tenant APIs, to centralize or reconcile it in a CMDB or elsewhere
Now, there is a fine line between the controlled processes required by ITIL and the expected and promised agility of a cloud… This is where a balance has to be found. At the end of this post, we start covering this topic by introducing what we call the “ITIL dilemma” (see the section “Bonus – Creating users and adding them to plans”).
Yes, the scripts from this page are available as a download!
Note: This is a single script, intended to be run sample-by-sample, using selection mode (F8 in PowerShell ISE). If you run the script straight through, it will likely fail, because some variables need to be edited first, such as the subscription IDs. Please refer to the samples below and to the header of the script to understand what needs to be modified. By default, write actions are commented out anyway, to ensure you do not mistakenly write to your environment.
The SQL Server resource provider in WAP allows an administrator/service provider to delegate to tenants the ability to create, resize, and delete SQL Server databases. It’s a very easy and flexible way to provide databases that can be hosted on standalone servers or on SQL Server AlwaysOn. These databases can then be used with that tenant’s Web Sites or Virtual Machines, for instance (there are many scenarios, depending on how network routing and isolation are designed in your WAP environment). Just like VM Clouds, the SQL Server resource provider relies on a shared fabric (this time made up of SQL Server instances – optionally categorized in groups – whereas VM Clouds rely on a fabric of Hyper-V hosts surfaced through Virtual Machine Manager clouds). You can read more about the SQL Server resource provider here, and we will have a more detailed blog post on this very soon.
This is very similar to VM Clouds, which were the main focus of parts 1 to 4 of this series. We need to authenticate with the WAP APIs (admin or tenant), and we can then execute actions using the WAP cmdlets (from an admin standpoint) or through the tenant API web service (from a tenant standpoint).
We’ll need the WAP admin API PowerShell module loaded, and we’ll set two variables that we will reuse for all these scripts. They correspond to the endpoints for the tenant and admin APIs.
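As a sketch, the two variables could look like this (the server name is a placeholder; ports 30004 and 30005 are the default WAP admin and tenant API ports – adjust them if your deployment uses different ones):

```powershell
# Load the WAP admin API PowerShell module (ships with the WAP admin API installation)
# Import-Module MgmtSvcAdmin

# Placeholder server name -- replace with your own WAP API server
$AdminUri  = "https://wapserver:30004"   # default admin API port
$TenantUri = "https://wapserver:30005"   # default tenant API port
```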
Note: You may need to run these samples from an elevated (administrator) session.
Once again, this has been previously covered by Charles, but let’s look at the variations you may face in different environments. These variations may be due to the following:
- Whether you use self-signed certificates, or certificates signed by an approved certification authority
- Whether you have set up ADFS or not, for the WAP admin interfaces
The important part here is the “-DisableCertificateValidation” switch, but it’s not enough. You will also need to add this to your script before continuing:
Note: Be aware that this snippet disables SSL certificate validation for the rest of your PowerShell session. Of course, having properly signed certificates is recommended in production…
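For reference, a commonly used snippet for this looks like the following – it accepts any server certificate for the remainder of the session:

```powershell
# Accept any server certificate for the rest of this PowerShell session.
# Use only in lab/test environments with self-signed certificates!
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
```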
OK, I am cheating a bit here, as “Get-AdfsToken” is actually calling a function in the script. But this function is provided in the downloadable script sample. It is also adapted from the samples included with any Windows Azure Pack installation – thanks to Shri for his help on this!
When we have a token, displaying it should look like this:
To do this, let’s display existing plans, subscriptions and users:
This is just a matter of crafting the expected web request and using our token as part of the headers – note that the subscription ID can be found either in the WAP admin portal… or in the output of the previous sample.
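As a sketch, the request could be built like this. The server name, token, and subscription ID are placeholders, and the URI path follows the pattern documented on MSDN for the SQL Server resource provider – verify it against your environment:

```powershell
# Placeholders -- replace with your own values
$Token          = "<token obtained earlier>"
$SubscriptionID = "00000000-0000-0000-0000-000000000000"

# The token goes in the Authorization header of every tenant API call
$Headers = @{ Authorization = "Bearer $Token" }

# Tenant API URI for the SQL databases of this subscription
$Uri = "https://wapserver:30005/$SubscriptionID/services/sqlservers/databases"

# Uncomment to execute against your environment:
# $Databases = Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get
# $Databases
```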
Here is the output in PowerShell, and the view in the tenant portal:
If we had wanted to query the VM Role Gallery Items available through that subscription, the last two lines would have been these, with a different URI:
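For example, with the same headers, a hypothetical gallery items request might look like this – the namespace path shown is the one documented on MSDN for VM Clouds, so double-check it before relying on it:

```powershell
# Same subscription and headers as before; only the namespace changes
$SubscriptionID = "00000000-0000-0000-0000-000000000000"   # placeholder
$Uri = "https://wapserver:30005/" + $SubscriptionID +
       '/Gallery/GalleryItems/$/MicrosoftCompute.VMRoleGalleryItem'

# Uncomment to execute against your environment:
# Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get
```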
As you can see, working with a different resource provider only requires understanding the namespace for that provider. And for the resource providers shipping out of the box, MSDN is your friend – for example, this link for the SQL Server resource provider.
The script would look like this – it uses the URI from the previous sample to automatically display the details of the first database found. But you could replace $DBName with any database name that you know exists in the subscription. $SubscriptionID and $Headers also come from the previous sample (both samples are grouped together in the downloadable script).
And the output gives us:
The sample below uses our token to cycle through all subscriptions and, when the SQL Server resource provider is part of the associated plan, queries for existing databases and lists them.
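The logic of that loop can be sketched as a small function. The cmdlet, parameters, and property names below follow the WAP admin API PowerShell module as we used it, and the server name is a placeholder – verify all of them against your own installation:

```powershell
function Get-AllWapSqlDatabases {
    param($AdminUri, $Token, $Headers)

    # List all subscriptions through the admin API
    $Subscriptions = Get-MgmtSvcSubscription -AdminUri $AdminUri -Token $Token -DisableCertificateValidation

    foreach ($Subscription in $Subscriptions) {
        # Only query subscriptions whose plan includes the SQL Server resource provider
        if ($Subscription.Services.Type -contains "sqlservers") {
            $Uri = "https://wapserver:30005/$($Subscription.SubscriptionID)/services/sqlservers/databases"
            $Databases = Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Get
            if ($Databases) {
                # List the databases found in this subscription
                $Databases | Select-Object Name, SqlServerName
            }
        }
    }
}
```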
Reading data is great, but creating may be important once in a while, right?
This script creates a database in the specified subscription, by preparing the JSON body and calling the right web request, this time with a “POST” method, versus the implied “GET” in the previous examples. Notice how we have to specify the “application/json” content type too.
Note: You do not need to use the “-f” syntax in the hashtable. This is just to show that you can work with values in many different ways, as long as you do not forget to convert to JSON later (using ConvertTo-Json).
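Put together, the creation request could be sketched like this. The property names follow the MSDN sample payload for the SQL Server resource provider, and all values are placeholders – verify both against your environment before running anything:

```powershell
# Placeholder values for the new database
$DBName     = "MyNewDB"
$DBUser     = "MyDBAdmin"
$DBPassword = "Pass@word1"

# Property names taken from the MSDN sample JSON payload
$HashTable = @{
    Name          = $DBName
    AdminLogon    = $DBUser
    Password      = $DBPassword
    CollationName = "SQL_Latin1_General_CP1_CI_AS"
    IsContained   = $false
    BaseSizeMB    = 10
    MaxSizeMB     = 10
}
$Body = $HashTable | ConvertTo-Json

# Uncomment to execute against your environment ($Uri and $Headers as before):
# Invoke-RestMethod -Uri $Uri -Headers $Headers -Method Post -Body $Body -ContentType "application/json"
```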
How did we know what properties to insert in the hash table?
Well, as said before, MSDN is your friend, and in particular this link: it provides a sample JSON payload to insert in the request body. You could actually use this in the previous code, inserting the JSON body directly as a PowerShell here-string:
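For example, the here-string form might look like this (same placeholder values as in the hashtable version):

```powershell
# JSON body written directly as a here-string (placeholder values)
$Body = @"
{
  "Name": "MyNewDB",
  "AdminLogon": "MyDBAdmin",
  "Password": "Pass@word1",
  "CollationName": "SQL_Latin1_General_CP1_CI_AS",
  "IsContained": false,
  "BaseSizeMB": 10,
  "MaxSizeMB": 10
}
"@
```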
Using a hash table has the benefit of making it easier to insert variables. When going from the here-string to the hash table, we also removed the “null” properties, keeping only the other ones. The subscription property was not added either, as we are already executing the request in the subscription’s context.
Output of a database creation:
In addition to the previous samples, here are three additional and very simple scripts. We had them handy, so we figured it was worth adding them here as a bonus.
- Listing all users
- Creating a Windows Azure Pack user
- Adding a WAP user to a plan (see also here)
Output for the new user and new subscription creations:
This is just a teaser, since Charles will soon be creating posts on this topic, with more depth and more scenarios. For example, creating the user in WAP is only one of two steps. The second step is to make sure the user actually exists in the authentication system. With ADFS, there is no need for the second step (the user is already an Active Directory user, so what you did earlier is enough in an ADFS-enabled environment). But without ADFS, the user needs to be created in the default ASP.NET membership provider too.
When talking with enterprises, there is often an IT Service Management (ITSM) solution already in use to manage requests and approvals, often through a “service catalog”. Using the samples from this blog post, it is definitely possible to approve requests for new databases (or virtual machines) in the ITSM solution and trigger API calls to actually deploy the resources on behalf of the user. Now, as organizations try to embrace the “IT as a Service” frame of mind, a balance has to be found, so that the traceability and approvals required by ITIL processes do not offset the agility and economics of the cloud. This is where these bonus examples might be interesting: if your processes require it, you could provide controlled/approved access to private plans in WAP, and then let users do whatever they want inside their “delegated sandboxes” (with optional chargeback).
The scenario would be like this:
1. The WAP administrator creates plans but keeps some of them “private”
2. A user goes to a well-known service catalog (System Center Service Manager comes to mind, but you may be using another solution today too)
3. The “private” plans are listed in the service catalog. They could have been added to the CMDB by listing them with the samples here and adding them through automation (assuming your ITSM solution allows creating CMDB items programmatically – hint: Service Manager does)
4. The approvals occur within the realm of the ITSM solution, as for any other enterprise process
5. Once approved, automation kicks in. The automation process (Service Management Automation, Orchestrator, or just a PowerShell script) could either monitor approved requests or be called externally by the ITSM solution. This depends on your design, and on the ITSM solution and automation engine being used
6. The runbook or script can request an admin token, create the user (if not there already), add them to the plan, and notify the user with the URL to the WAP portal
7. Optionally, another runbook could cycle through the resources created (like all databases in all subscriptions) and add/update them in the CMDB. This is a way to reconcile items potentially created outside of the ITSM process, if that’s technically possible in your implementation.
Well, maybe this time we’ll take a breather! Parts 1 to 5 are now out, and we should be back in a few weeks with part 6, hopefully with something around Value Added Services/Offerings and Ongoing Automated Maintenance/Operations. Thanks for reading!