Collaboration Store, Part XXXVIII

“You don’t have to be good to start … you just have to start to be good!”
Joe Sabah

Last time out we really did not accomplish anything of any consequence, but today let’s see if we can actually make some progress on something of real value. We need to create a process that will run every so often and check to make sure that all of the instances in the community have all of the latest content. We can use the Flow Designer to create and schedule a flow that will run daily on the Host instance and compare the artifacts present on each Client instance with the artifacts present on the Host, and then send over any missing items that never made it over originally for whatever reason. Doing it this way will eliminate the need to track and record errors when they happen, since we will just compare the Client inventory with that of the Host and correct any discrepancies. We don’t really need to know what failed or why.

We could make all of the REST API calls to the Client instances using the Integration Hub, but since that is a collection of optional products, not every instance will have all of those components installed, and we would not want to make that a prerequisite for the product. To make this work on any ServiceNow configuration, we will want to roll our own calls in JavaScript, and we will just use the Flow Designer to schedule a simple flow that loops through all of the Client instance records and calls an Action that will do all of the heavy lifting. Before we build the Action, though, we will want to build a Script Include function that can be called from the Action. Since this will be a significant script, we should probably create a new Script Include specific to this purpose rather than adding a number of new functions to any of our existing Script Includes. We can call our new Script Include InstanceSyncUtils and create a simple function called syncInstance with a single argument: the sys_id of the record for the instance to be synced. For now, we can just stub out the function with a simple logging statement and circle back later to fill in the details. Right now, we just want to have something to call in our Flow Designer Action.

New Script Include function to be called in our new Action
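For those following along at home, the stubbed-out version of the Script Include would look something like this (the exact wording of the logging statement is not important; anything that confirms the function was called will do):

var InstanceSyncUtils = Class.create();
InstanceSyncUtils.prototype = {
	initialize: function() {
	},

	// Stub for now; the actual synchronization logic will come in a future installment
	syncInstance: function(instanceId) {
		gs.info('InstanceSyncUtils.syncInstance: Syncing instance ' + instanceId);
	},

	type: 'InstanceSyncUtils'
};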

With that out of the way, we can now fire up the Flow Designer and navigate to New -> Action. We will call our new Action Sync Instance and configure a single Input called Instance ID.

New Sync Instance Action with a single Input

For the Script step, we will just pass the Input to our new Script Include function:

(function execute(inputs, outputs) {
	var isu = new InstanceSyncUtils();
	isu.syncInstance(inputs.instance_id);
})(inputs, outputs);

There is currently nothing returned by the function, so there is no need to define any Outputs. At this point, all we need to do is Save and Publish the Action before turning our attention to the actual Flow itself.

To build a Flow, go back to the Home Page of the Flow Designer and select New -> Flow. On the resulting pop-up Flow Properties panel, enter all of the details for the Flow and set the Run As value to System User.

New Flow Properties

Since we want this Flow to run on its own every day, we can set the Trigger to Daily, and the Time to 12:30, which should run it in the middle of the lunch hour in the Host time zone, when most instances should be available.

Flow Trigger configuration

Since we only want this Flow to run on the Host instance, the first thing that we need to do is to create a Flow Variable that indicates whether or not this instance is the Host. To set the value of the variable, we add a Set Flow Variables step that uses the following script to determine if this instance is the Host based on System Properties.

return gs.getProperty('instance_name') == gs.getProperty('x_11556_col_store.host_instance');

The next step then will be a conditional that checks the value of the new Flow Variable.

Making sure that this is the Host instance

Failing this test will terminate the Flow, but if this is the Host instance, then the next thing that we need to do is gather up all of the Client instance records and then loop through them and call our new Action for each one in turn.

Find all records where Active is true and Host is false
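For those who prefer to see things in code, the Look Up Records step shown above is roughly equivalent to the following query (the active and host field names are assumed from the condition in the image; the table name comes from the rest of the application):

// Script equivalent of the Look Up Records step: all active instances that are not the Host
var instanceGR = new GlideRecord('x_11556_col_store_member_organization');
instanceGR.addQuery('active', true);
instanceGR.addQuery('host', false);
instanceGR.query();
while (instanceGR.next()) {
	// each of these records will be handed to our new Action by the For Each Item loop
}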

Inside the following For Each Item loop, we can then invoke our new Action, passing in the sys_id of the current record.

Calling our new Action inside the For Each Item loop

That completes the work on the Flow, and all we need to do now is to Save and Activate the Flow. At this point, the Flow will kick off every day at 12:30 local time, but it won’t really do much except put a few entries in the System Log. The real work will be done in the Script Include, and since that’s a major effort in and of itself, we’ll leave that for our next exciting episode.

Collaboration Store, Part XXXII

“There are two ways to write error-free programs; only the third works.”
Alan J. Perlis

Now that we have all of the JavaScript functions needed to push a new application version out to all of the Client instances, we need to create an Action that will call the root function and then create a Subflow that will utilize the new Action. Since we already created a similar Action for the process that shares new Client instances with existing Client instances, the easiest thing to do will be to bring that guy up in the Flow Designer and make a copy. To do that, pull up the Action that you want to copy and select Copy from the ellipsis drop-down menu.

Making a copy of the Publish New Instance Action

Once we have our copy, we need to make the necessary changes to adapt it to our purpose. The publishNewVersion function that we just created takes three arguments (newVersion, targetInstance, and attachmentId), so we will need to modify the Action Inputs to bring in those values. The new Action Inputs now look like this:

Modified Action Inputs

Similarly, we will need to modify the Input Variables of the Script Step to pass along these values:

Modified Script Input Variables

The last thing that we need to do is modify the script itself. The existing script in the cloned Action was pretty simple, as it was just a call to one of our Script Include functions:

(function execute(inputs, outputs) {
	var csu = new CollaborationStoreUtils();
	csu.publishNewInstance(inputs.new_instance, inputs.target_instance);
})(inputs, outputs);

We also want to call a function, and it is in the very same Script Include, so our modification is just a matter of changing the name of the function along with the arguments passed.

(function execute(inputs, outputs) {
	var csu = new CollaborationStoreUtils();
	csu.publishNewVersion(inputs.new_version, inputs.target_instance, inputs.attachment_id);
})(inputs, outputs);

That’s it for the changes, so now all we have to do is Save and Publish the new Action and we are ready to utilize it in our new Subflow. Creating the Subflow will be another Copy operation, this time from our existing New_Collaboration_Store_Instance Subflow. We’ll call our copy Distribute_New_Application_Version, and once again, we will need to make some modifications to adapt it to our new purpose. The original Subflow had a single input, the New Instance. Instead of a new instance, we will be distributing a new application version, so we can replace the New Instance input with a New Version input that will contain the sys_id of the new version record. We will also need the sys_id of the Attachment record, so we will add an Attachment ID input as well.

Modified Inputs for the new Distribute_New_Application_Version Subflow

The target instances for this process will be the same as those for the original Subflow: all instances in the community except for the Host instance and the instance that provided the data (which could also be the Host in this case, since Host instances can publish applications just like any other Client instance). To filter out the data-providing instance, we will need to add a new step that uses the passed version record sys_id to fetch the GlideRecord that will lead us to the instance that produced the application.

New step to fetch the version record

This gives us the information that we need to complete the filter on the query of the instance table so that we can gather up all of the instances except for the Host and the instance that produced the application.

Modified query filter using the data available in the fetched version record
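Expressed in script form (just a sketch to illustrate the logic; the actual work is done by the Flow Designer steps shown above, and the active and host field names are assumptions), the fetch and the modified filter amount to something like this:

// Fetch the version record passed into the Subflow and dot-walk to the providing instance
var versionGR = new GlideRecord('x_11556_col_store_member_application_version');
if (versionGR.get(newVersion)) { // newVersion: the sys_id from the New Version Subflow input
	var providerInstance = versionGR.getDisplayValue('member_application.provider.instance');

	// Gather up every active instance except the Host and the one that produced the application
	var instanceGR = new GlideRecord('x_11556_col_store_member_organization');
	instanceGR.addQuery('active', true);
	instanceGR.addQuery('host', false);
	instanceGR.addQuery('instance', '!=', providerInstance);
	instanceGR.query();
}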

Now all we need to do is to replace the original custom Action inside of the loop with the new Action that we just created.

New Action added to push application artifacts to the target instance

This completes the changes needed to the cloned Subflow, so once again, all we need to do is Save and Publish and we are ready to go. The only thing left now is to come up with a way to launch this Subflow whenever a new application version is sent over to the Host instance. We'll take a look at that next time out, and hopefully release a new Update Set as well so that those of you who have been kind enough to participate in the testing can give this all a trial run.

Collaboration Store, Part XXXI

“Someone’s sitting in the shade today because someone planted a tree a long time ago.”
Warren Buffett

While we wait for additional test results to come in from the corrected Update Set for our latest iteration, it would be a good time to take a look at what we will need to do to complete the last remaining step in the application publishing process. What we have completed so far pushes all of the new application version artifacts to the Host instance. Now we need to build out the process for the Host instance to send out all of those artifacts to all of the other Client instances. Fortunately, we already have in hand most of the code to do that; we just need to clone a few things and make a number of modifications to fit the new purpose.

We have already built a Subflow to push new Client instance information out to all of the existing Client instances during the new instance registration process. Pushing out a new version of an application to those same instances is essentially the same process, so we should be able to clone that guy, and the associated Action, to create our new process. We have also created functions to push over the application record, the version record, and the attached Update Set XML file, and we should be able to copy over those guys as well to complete the process. In fact, we should probably start with the JavaScript functions, as we will need to call the root function in the Action, which will need to be created before it can be referenced in the Subflow.

The functions as written are hard-coded to send everything over to the Host instance. We will want to do basically the same thing, but this time we will be sending things out to various Client instances. Ideally, I should just refactor the code to have the destination instance and credentials passed in so that one function would work in both circumstances, but at this point it was less complicated to just make copies of each function and hack them up for their new purpose. I don’t really like having two copies of essentially the same thing, though, so one day I am going to have to go back and rework the entire mess.
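If and when that day comes, the common pattern in these functions suggests a single shared helper along these lines (purely illustrative; nothing like this exists in the application today):

// Hypothetical shared helper: POST a payload to any instance in the community,
// with the destination and credentials passed in rather than hard-coded
postToInstance: function(targetInstance, token, tableName, payload) {
	var request = new sn_ws.RESTMessageV2();
	request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
	request.setRequestHeader('Accept', 'application/json');
	request.setHttpMethod('post');
	request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/' + tableName);
	request.setRequestBody(JSON.stringify(payload, null, '\t'));
	return request.execute();
},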

But for now, this is what I came up with when I copied what was once called processPhase5 to create the new publishNewVersion function:

publishNewVersion: function(newVersion, targetInstance, attachmentId) {
	var targetGR = new GlideRecord('x_11556_col_store_member_organization');
	if (targetGR.get('instance', targetInstance)) {
		var token = targetGR.getDisplayValue('token');
		var versionGR = new GlideRecord('x_11556_col_store_member_application_version');
		if (versionGR.get(newVersion)) {
			var canContinue = true;
			var targetAppId = '';
			var mbrAppGR = versionGR.member_application.getRefRecord();
			var request  = new sn_ws.RESTMessageV2();
			request.setHttpMethod('get');
			request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
			request.setRequestHeader("Accept", "application/json");
			request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application?sysparm_fields=sys_id&sysparm_query=provider.instance%3D' + mbrAppGR.getDisplayValue('provider.instance') + '%5Ename%3D' + encodeURIComponent(mbrAppGR.getDisplayValue('name')));
			var response = request.execute();
			if (response.haveError()) {
				gs.error('CollaborationStoreUtils.publishNewVersion: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
				canContinue = false;
			} else if (response.getStatusCode() == '200') {
				var jsonString = response.getBody();
				var jsonObject = {};
				try {
					jsonObject = JSON.parse(jsonString);
				} catch (e) {
					gs.error('CollaborationStoreUtils.publishNewVersion: Unparsable JSON string returned from Target instance ' + targetInstance + ': ' + jsonString);
					canContinue = false;
				}
				if (canContinue) {
					var payload = {};
					payload.name = mbrAppGR.getDisplayValue('name');
					payload.description = mbrAppGR.getDisplayValue('description');
					payload.current_version = mbrAppGR.getDisplayValue('current_version');
					payload.active = 'true';
					request  = new sn_ws.RESTMessageV2();
					request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
					request.setRequestHeader("Accept", "application/json");
					if (jsonObject.result && jsonObject.result.length > 0) {
						targetAppId = jsonObject.result[0].sys_id;
						request.setHttpMethod('put');
						request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application/' + targetAppId);
					} else {
						request.setHttpMethod('post');
						request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application');
						payload.provider = mbrAppGR.getDisplayValue('provider.instance');
					}
					request.setRequestBody(JSON.stringify(payload, null, '\t'));
					response = request.execute();
					if (response.haveError()) {
						gs.error('CollaborationStoreUtils.publishNewVersion: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
						canContinue = false;
					} else if (response.getStatusCode() != 200 && response.getStatusCode() != 201) {
						gs.error('CollaborationStoreUtils.publishNewVersion: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
						canContinue = false;
					} else {
						jsonString = response.getBody();
						jsonObject = {};
						try {
							jsonObject = JSON.parse(jsonString);
						} catch (e) {
							gs.error('CollaborationStoreUtils.publishNewVersion: Unparsable JSON string returned from Target instance ' + targetInstance + ': ' + jsonString);
							canContinue = false;
						}
						if (canContinue) {
							targetAppId = jsonObject.result.sys_id;
						}
					}
				}
			} else {
				gs.error('CollaborationStoreUtils.publishNewVersion: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
				canContinue = false;
			}
			if (canContinue) {
				this.publishVersionRecord(targetInstance, token, versionGR, targetAppId, attachmentId);
			}
		} else {
			gs.error('CollaborationStoreUtils.publishNewVersion: Version record not found: ' + newVersion);
		}
	} else {
		gs.error('CollaborationStoreUtils.publishNewVersion: Target instance record not found: ' + targetInstance);
	}
},

Unlike the original application publication process, we are not bouncing back and forth between the client side and the server side, so with the successful completion of one artifact, we can simply move on to pushing over the next artifact. To create the next step in the process, the publishVersionRecord function, I started out with a copy of the original processPhase6 function. Here is the end result:

publishVersionRecord: function(targetInstance, token, versionGR, targetAppId, attachmentId) {
	var canContinue = true;
	var payload = {};
	payload.member_application = targetAppId;
	payload.version = versionGR.getDisplayValue('version');
	var request  = new sn_ws.RESTMessageV2();
	request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
	request.setRequestHeader("Accept", "application/json");
	request.setHttpMethod('post');
	request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application_version');
	request.setRequestBody(JSON.stringify(payload, null, '\t'));
	var response = request.execute();
	if (response.haveError()) {
		gs.error('CollaborationStoreUtils.publishVersionRecord: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
		canContinue = false;
	} else if (response.getStatusCode() != 201) {
		gs.error('CollaborationStoreUtils.publishVersionRecord: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
		canContinue = false;
	} else {
		var jsonString = response.getBody();
		var jsonObject = {};
		try {
			jsonObject = JSON.parse(jsonString);
		} catch (e) {
			gs.error('CollaborationStoreUtils.publishVersionRecord: Unparsable JSON string returned from Target instance ' + targetInstance + ': ' + jsonString);
			canContinue = false;
		}
		if (canContinue) {
			var targetVerId = jsonObject.result.sys_id;
			this.publishVersionAttachment(targetInstance, token, targetVerId, attachmentId);
		}
	}
},

The last step in the process, then, is to send over the Update Set XML file attachment. For that one, I started out with the processPhase7 function and ended up with this:

publishVersionAttachment: function(targetInstance, token, targetVerId, attachmentId) {
	var gsa = new GlideSysAttachment();
	var sysAttGR = new GlideRecord('sys_attachment');
	if (sysAttGR.get(attachmentId)) {
		var url = 'https://';
		url += targetInstance;
		url += '.service-now.com/api/now/attachment/file?table_name=x_11556_col_store_member_application_version&table_sys_id=';
		url += targetVerId;
		url += '&file_name=';
		url += sysAttGR.getDisplayValue('file_name');
		var request  = new sn_ws.RESTMessageV2();
		request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
		request.setRequestHeader('Content-Type', sysAttGR.getDisplayValue('content_type'));
		request.setRequestHeader('Accept', 'application/json');
		request.setHttpMethod('post');
		request.setEndpoint(url);
		request.setRequestBody(gsa.getContent(sysAttGR));
		var response = request.execute();
		if (response.haveError()) {
			gs.error('CollaborationStoreUtils.publishVersionAttachment: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
		} else if (response.getStatusCode() != 201) {
			gs.error('CollaborationStoreUtils.publishVersionAttachment: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
		}
	} else {
		gs.error('CollaborationStoreUtils.publishVersionAttachment: Invalid attachment record sys_id: ' + attachmentId);
	}
},

So that takes care of the functions. Now we have to create an Action in the Flow Designer that will call the function, and then produce a new Subflow that will leverage that action. That sounds like a good subject for our next installment.

Special Note to Testers

If you want to test the set-up process, you can do that with a single instance. You just have to select the Host option during the set-up process. If you want to test the Client set-up process, you will need at least two instances, one to serve as the Host, which you will need to reference when you set up the Client (you cannot be a Client of your own Host). But if you really want to test out the full registration process, including the Subflow that informs all existing instances of the new instance, you will need at least three participating instances: 1) the Host instance, 2) the new Client instance, and 3) an existing Client instance that will be notified of the new member of the community.

The same holds true for the application publishing process. You can test a portion of the publishing process by publishing an app on a single Host instance, but if you want to test out all of the parts and pieces, you will want to publish the app from a Client instance and make sure that it makes its way all the way back to the Host. Once we complete the application distribution process, full testing will require at least three instances to verify that the Host distributes the new version of the application to other Clients.