vRealize Automation 6 – Post Provisioning Workflows on AWS
June 15, 2015

To deploy a fully automated server build, we have to look past just deploying a virtual machine OS and configuring an IP address. To get something usable, we also need to configure the server with applications or make other post-provisioning changes. For instance, we might want to install Apache after deploying a Linux machine. In vRealize Automation, deployments invoke a post-provisioning stub that calls vRealize Orchestrator workflows to make these additional changes. This works very well in a vSphere environment, since we can leverage VMware Tools to access the guest OS. But if you’ve ever deployed an instance in Amazon EC2, you’ll know that this isn’t quite as easy: EC2 instances don’t have VMware Tools to let us into the guest OS. To make matters worse, the current version of vRealize Automation doesn’t pass the IP address of the guest operating system to vRO. See this KB article from VMware for more information.
This post goes into more detail about how to use a post-provisioning workflow on an Amazon EC2 instance, specifically customizing a Red Hat Enterprise Linux guest OS.
Overview
To achieve a fully automated deployment, we use vRealize Automation (vRA) as our user portal and for life-cycle management. Layered on top of this, we use vRealize Orchestrator (vRO) to make the vRA servers, Amazon EC2 endpoints, and guest OS all work together. Here is the high-level process of what needs to happen.
vRA deploys the server from a blueprint. Once this finishes, the post-provisioning stub makes a call to run an Orchestrator workflow. The first thing this workflow does is make a SQL call to the vRA database to retrieve the IP address of the EC2 instance. We then pass the IP address to another workflow and run an SSH command on the Linux instance.
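Concretely, the database lookup at the heart of this process is a single query against the vRA IaaS database. The table and column names come from the query built later in this post; the GUID value here is a placeholder:

```sql
-- Look up the EC2 instance's private IP by the machine's GUID (placeholder value)
SELECT PrivateIPAddress
FROM [DynamicOps.AmazonWSModel].Instances
WHERE VirtualMachineID = '11111111-2222-3333-4444-555555555555';
```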
Blueprint Setup
I won’t go into the blueprint setup in too much detail. The important piece is to ensure that the “ExternalWFStubs.MachineProvisioned” custom property is added to the blueprint. Its value should be the workflow ID from vRealize Orchestrator.
The workflow ID is shown on the vRO workflow screen, as pictured below.
vRealize Orchestrator Workflow
Now the real work begins. We build a workflow to take the vRA information and then connect to the EC2 instance to run an SSH command. The picture below lays out the workflow and order.
The first element simply logs the inputs to the workflow. This is a script I commonly use to grab all of the information that vRealize Automation passes over to vRealize Orchestrator. It logs each property so you can see what information is available to tie back to the original blueprint. This script isn’t necessary, but it’s a nice thing to have. The JavaScript for this element is listed below.
System.log("Workflow started from workflow stub " + externalWFStub + " on vCAC host " + vCACHost.displayName);
System.log("Got vCAC virtual machine " + vCACVm.virtualMachineName);
System.log("Matching virtual machine entity " + virtualMachineEntity.keyString);

vmName = vCACVm.virtualMachineName;
System.log("vmName is: " + vmName);

var virtualMachine = virtualMachineEntity.getInventoryObject();
if (virtualMachine != null) {
    var virtualMachinePropertyEntities = virtualMachineEntity.getLink(vCACHost, "VirtualMachineProperties");
    var virtualMachineProperties = new Properties();
    // Loop through all of the VM properties and log them for reference.
    for each (var virtualMachinePropertyEntity in virtualMachinePropertyEntities) {
        var propertyName = virtualMachinePropertyEntity.getProperty("PropertyName");
        var propertyValue = virtualMachinePropertyEntity.getProperty("PropertyValue");
        virtualMachineProperties.put(propertyName, propertyValue);
        System.log("INFO: " + " PropertyName " + propertyName + " propertyValue " + propertyValue);
    }
    // Enter the var name to output, and the property field with the value you're looking to export
    var vmIP = virtualMachineProperties.get("VirtualMachine.Network0.Address");
    // Log the value for troubleshooting purposes
    System.log(vmIP);
}
The next piece of the puzzle is a SQL query. I used another script element to build the query, which is then passed along to the next element in the workflow for execution. The vCAC:Entity is passed into this script element.
The script takes the entity key and merges it into our SQL query. It also removes the “guid” part of the vCAC:Entity string so that the value matches the format in which it is stored in the database.
var guid = virtualMachineEntity.keyString;
vmString = guid.replace("guid", "");
System.log(vmString);

SQLQuery = "select PrivateIPAddress from [DynamicOps.AmazonWSModel].Instances where VirtualMachineID = " + vmString;
System.log(SQLQuery);
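The same string handling can be exercised outside of vRO with plain JavaScript. In this sketch, console.log stands in for System.log, and the keyString value is hypothetical, assuming the entity key takes the form guid'…':

```javascript
// Hypothetical vCAC:Entity keyString -- the GUID itself is made up.
var keyString = "guid'11111111-2222-3333-4444-555555555555'";

// Strip the "guid" token; the value keeps its single quotes,
// which is what makes the concatenated T-SQL below valid.
var vmString = keyString.replace("guid", "");
console.log(vmString);

var SQLQuery = "select PrivateIPAddress from [DynamicOps.AmazonWSModel].Instances " +
    "where VirtualMachineID = " + vmString;
console.log(SQLQuery);
```

Since vmString comes back already wrapped in single quotes, the concatenated query is valid T-SQL; if your entity keys look different, adjust the replace accordingly.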
Now we pass the query we just created over to the element that executes it. This is a standard SQL read query that can be found in the default list of workflows. We map the result of the query to a new attribute called “IPAddress”. That’s what we’re after!
NOTE: the workflow I modified is the “Read a custom query from a database” workflow. Also, before it can be used, the “Add a database” workflow needs to be run to register which database the queries will run against.
Now that we’ve returned the IP address from the SQL database, we take the array record and convert it to a string to be used in our next element. There is likely a cleaner way to clean up this data, or stringify the array value, but this is what I was able to get working quickly.
System.log("IPAddress = " + IPAddress);
ipString = IPAddress.toString();
System.log("ipString = " + ipString);

// Strip the plugin wrapper text that surrounds the value.
var tempIP1 = ipString.replace("DynamicWrapper (Instance) : [SQLActiveRecord]-[class com.vmware.o11n.plugin.database.ActiveRecord] -- VALUE : ActiveRecord: {PrivateIPAddress=", "");
var tempIP2 = tempIP1.replace("}", "");
System.log("New IP = " + tempIP2);
ipString = tempIP2;
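Matching the entire wrapper string verbatim is brittle; a regular expression that captures only the value after PrivateIPAddress= survives small changes in the wrapper text. Here is a plain-JavaScript sketch, with console.log standing in for System.log and a mocked-up record string:

```javascript
// Mocked-up stringified ActiveRecord, shaped like the SQL plugin's output.
var ipString = "DynamicWrapper (Instance) : [SQLActiveRecord]-[class com.vmware.o11n.plugin.database.ActiveRecord] -- VALUE : ActiveRecord: {PrivateIPAddress=10.0.0.5}";

// Capture whatever sits between "PrivateIPAddress=" and the closing brace.
var match = ipString.match(/PrivateIPAddress=([^}]+)}/);
var newIP = match ? match[1] : null;
console.log("New IP = " + newIP);
```

The same match() call drops into the vRO script element in place of the two replace() calls.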
We can now pass the ipString variable over to our SSH workflow, which logs into the EC2 instance and runs the command we chose. In this case I’ve built an SSH workflow to install Apache on RHEL. This does require the SSH keys to be available to vRealize Orchestrator, which is covered in another post.
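The SSH step itself amounts to a single remote command. A minimal sketch, assuming RHEL 6-era tooling, the ec2-user account, and a key file path (all placeholders to adapt to your environment):

```shell
#!/bin/sh
# Placeholder values -- substitute your key path and the IP returned by the workflow.
KEY=/path/to/ec2-key.pem
IP=10.0.0.5
CMD="sudo yum -y install httpd && sudo service httpd start"

# The vRO SSH workflow effectively runs:
#   ssh -i "$KEY" ec2-user@"$IP" "$CMD"
# Printed here instead of executed, since no instance is reachable from this sketch.
echo "ssh -i $KEY ec2-user@$IP $CMD"
```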
Summary
OK, this doesn’t seem like the most straightforward way to run a command after provisioning a server, but it’s the only way I know of to do this with the current version of vRealize Automation. I’m sure the integration between vRA and AWS will get much tighter, but for now, this is how the operation can be achieved.