Collaboration Store, Part XIII

“First, solve the problem. Then, write the code.”
John Johnson

Today we need to build out the Subflow that we referenced in the Script Include function that we built last time out. To create a new Subflow, open up the Flow Designer, click on the New button to bring up the selection list, and select Subflow from the list.

Creating a new Subflow

When the initial form pops up, all you really need to enter is the name of the Subflow, which we already referred to in our Script Include function as New_Collaboration_Store_Instance.

New Subflow properties form

Once you submit the initial properties form, the new Subflow will appear in the list of Subflows, and from there you can bring it up in full edit mode. In edit mode, we can add the one Input to the Subflow, the name of the new instance.

Subflow Inputs and Outputs

Since we will be launching this Subflow to run on its own in the background, there is no need for any Outputs.

Once we configure the Inputs and Outputs, we can move on to the steps of the Subflow. Our first step will be to gather up all of the records in the table of instances except for two: the Host instance, which has already been updated, and the new instance, which already knows about itself.

Database query step

We select the Look Up Records Action, select the Member Organization table, and then define two Conditions to get the records that we want: 1) the host value is false, and 2) the instance is not the instance that we brought in as Input. It really doesn’t matter for our purposes what sequence the records come in, but I went ahead and sorted the records by instance, just so that we will always work through them in the same order.
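
For reference, a rough server-side equivalent of that Look Up Records configuration is sketched below. The table and column names here are just assumptions based on the conditions described above, not necessarily the actual names on the Member Organization table.

// Hypothetical equivalent of the Look Up Records step; the table and column
// names are assumptions based on the conditions described above
var newInstance = 'new_instance_name'; // the instance name passed in as the Subflow Input
var memberGR = new GlideRecord('x_collaboration_store_member_organization');
memberGR.addQuery('host', false);                  // 1) the host value is false
memberGR.addQuery('instance', '!=', newInstance);  // 2) not the new instance from the Input
memberGR.orderBy('instance');                      // keep the processing order consistent
memberGR.query();
while (memberGR.next()) {
	// each record here will be handled inside the For Each Item loop
}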

Our next step, then, is to loop through the records. We do that with a For Each Item Flow Control step, setting the items to the records obtained in the first step.

Looping through the retrieved records

Now we have the basic structure of the Subflow; we just need to perform the tasks necessary to notify each existing instance of the new instance, and to notify the new instance of each existing instance, within the For Each Item loop. This could all be done with additional Subflow steps, but I took the easy way out here and just built another function in our Script Include to handle the REST API calls to the instances. In fact, I built two functions: one to make the call, and another to call that function twice, once to tell the existing instance about the new instance, and then again to inform the new instance of the existing instance. To run that script, I had to create a custom Action, which is just a simple Script step that calls the function, passing in the names of the two instances (the existing instance from the current record, and the new instance from the Subflow Input). Once I built the custom Action, I was able to select it from the list and then configure it.
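
As a rough sketch, the script step of that custom Action would look something like this. The Script Include, function, and input names below are placeholders, since the actual function is the one we will be building in the next installment.

(function execute(inputs, outputs) {
	// Placeholder names: the real Script Include function gets built next time out
	var csu = new CollaborationStoreUtils();
	csu.syncInstances(inputs.existing_instance, inputs.new_instance);
})(inputs, outputs);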

Custom Script Action step

That completes the Subflow, but once again, we have referenced a function in our Script Include that does not exist, so we will have to get into that in our next installment.

Fun with Webhooks, Part VII

“The question isn’t who is going to let me; it’s who is going to stop me.”
Ayn Rand

Now that we have proven that the essential elements of our Subflow all work as intended, it’s time to finish out the remainder of the flow’s activities, which include logging the HTTP POST and Response details as well as reporting any undesired results to Event Management. Let’s start with logging the activity.

The first thing that we will need in order to record our Webhook POSTs is a table in which to store the data. As we did with our original Webhook Registry table, we can navigate to System Definition -> Tables and click on the New button to bring up the Table definition screen.

New Webhook Log table

And once again we will give it a Label and let the system generate the associated Name. We will also want to uncheck the Create module checkbox again to prevent the generation of a number of artifacts for which we have no use. Once the table has been defined, we can start adding fields, and the first field that we will want to add is a Reference to the Webhook Registry table. Every log record will be linked to the registry for which the activity was POSTed, so we will want to establish that relationship with a Reference field that we can label Registry.

The other Reference field that we will want is a link back to the original Incident that is the subject of the POST. Since we set things up in a way that would allow us to support tables other than Incident, we will want to do this with a Document ID field rather than a direct reference to the Incident table. This time, when we configure the Dependent Field, we can dot-walk through the registry reference to get to the table name column in that related table. This saves us from needing a column on the log table for the name of the table that holds the record associated with the Document ID, and it ensures that all of the Document IDs related to each Registry come only from the table associated with that Registry.

Selecting the Registry record’s Table column as the Dependent Field

The rest of the fields in the log table are just what we sent over, and what we received in response:

  • Payload – The data that we will be POSTing
  • URL – The URL to which we will be POSTing our Payload
  • Status – The HTTP Response Code returned by the target server
  • Body – The body of the message returned by the target server
  • Error – The error flag
  • Error Code – The error code
  • Error Message – The error message
  • Parse Error – Any error that occurred while parsing the body of the response

After defining all of the fields on the table, I brought up the table’s form and arranged all of the fields on the screen in a manner that I thought was most appropriate.

Webhook Log form layout

Those of you who are paying close attention will also have noticed that I added the JSON View Dictionary Attribute to both the Payload and Body fields, just to make reading the JSON content a little easier.

Now that we have a table, we can start putting records into it. We will do this in our Subflow, right after we POST the payload. This is just a simple, out-of-the-box Create Record action that we can configure using data pills from various other steps.
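
Just to make the field mapping explicit, here is a rough script equivalent of what that Create Record step does; the log table name and column names are assumptions based on the fields we just defined, and the whrGR, documentId, payload, and result arguments stand in for the data pills from the earlier steps.

// Hypothetical script equivalent of the Create Record step; the table and
// column names are assumptions based on the Webhook Log table defined above
logWebhookPost: function(whrGR, documentId, payload, result) {
	var logGR = new GlideRecord('x_webhook_log');
	logGR.initialize();
	logGR.setValue('registry', whrGR.getUniqueValue());      // Reference to the Webhook Registry record
	logGR.setValue('document_id', documentId);               // Document ID of the source record
	logGR.setValue('payload', payload);                      // the data that was POSTed
	logGR.setValue('url', whrGR.getValue('url'));            // the target URL
	logGR.setValue('status', result.status);                 // HTTP Response Code
	logGR.setValue('body', result.body);                     // response body
	logGR.setValue('error', result.error);                   // error flag
	logGR.setValue('error_code', result.error_code);         // error code
	logGR.setValue('error_message', result.error_message);   // error message
	logGR.setValue('parse_error', result.parse_error || ''); // any JSON parse error
	return logGR.insert();
},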

Logging the Webhook POST and Response

In addition to capturing everything related to each POST, the other thing that we wanted to do was to capture any issues that might come up during this process. We are already aware of two possible issues, one being a bad response and the other being a good response, but with an unparsable response body. After we do the POST and log the result, we can throw in a few more conditionals to pick those up, and then add a Log Event Action to the flow for each.

Complete Subflow with Event logging for error conditions

Whenever we log an Event, we will want to capture as much information as we can about what went wrong. In this case, however, we have already logged everything about the transaction to the Webhook Log table, so really all we will need to provide is a way to find that record. Putting the sys_id in the additional_info field should do the trick. Here is how I populated all of the data for the Event:

Event Log data
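
For the additional_info value specifically, a JSON String along the lines of the example below is all that is needed, built with the Record data pill from the Create Record step; the webhook_log property name is just an illustration, not a required key.

{
	"webhook_log": "<sys_id of the Webhook Log record>"
}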

That should complete the Subflow, at least for now. We may end up adding some other features in the future, but for now, this accomplishes everything that we set out to do. We still need to do a lot more testing to verify that all of these various branches in the tree work out as we are hoping, but the building part should be done now, at least for this portion.

As far as the remaining development goes, we still have to build out the My Webhooks portal widget, and we also need to go back into the Script Include and add support for Basic Authentication. We also need to add code, and possibly additional fields in the registry record, for any other authentication protocols that we would like to support. So once again we find ourselves at a crossroads: we can either jump into the Service Portal world and start working on our widget, or we can turn our attention to authentication and finish things up in that area. There is no need to make any decision on that today, though. We’ll figure all of that out when we meet again.

Fun with Webhooks, Part V

“Change is easy. Improvement is far more difficult.”
Ferdinand Porsche

Now that we have a Business Rule to launch our Subflow, it’s time to get busy and actually create the Subflow. To create a new Subflow, open up the Flow Designer, click on the New button and select Subflow from the drop-down menu. I initially called mine Webhook Poster, but the Business Rule couldn’t find it to launch it for some reason. I finally got it to work when I replaced the space with an underscore, so now the name is Webhook_Poster. I’m not really sure why I had to do that, but it works, so we will leave it at that.

With the Subflow named, the next thing to do is to define all of the inputs. In my mind, there were only two, the current object and the previous object. Unfortunately, the Flow Designer will not recognize any of the properties of those objects unless you explicitly define them, so I had to specify every element of both objects. That seemed like a rather tedious waste of time, but eventually I got through it and now it’s done.

Subflow inputs defined

That takes care of the set-up, so now it’s on to the flow itself. The first thing that we are going to want to do is to gather up all of the registration records that would be triggered by this Incident. Since we gave our users several options for selecting records, we will have to test them all. Essentially, we will need a query filter for each possible Type, which we can then OR together to create one huge query filter. Here is how that looks in practice:

Webhook Registry selection criteria

As a second step, I threw in an IF statement to see if the query returned any records. We are basically done at this point if there are no records, but if we do have records to process, then we will need to build the payload that we will be posting to all of the target URLs. For that job, I needed to create a new Flow Designer Action, so I saved my Subflow and closed the Subflow tab for now.

The payload is basically a generic JSON object containing the relevant information about the things that have changed about the record since the last time that it was saved. Pulling that all together will basically be a lot of tedious scripting, so that sounds like a job for yet another function in our Script Include. Here is what I came up with:

// Build a JSON payload describing what changed on the record since the last save
buildPayload: function(current, previous) {
	var payload = {};

	payload.id = current.number;
	payload.short_description = current.short_description;
	payload.message = '';
	var separator = '';
	if (current.state != previous.state) {
		payload.state = current.state;
		if (previous.state) {
			payload.message += separator + 'State changed from ' + previous.state + ' to ' + current.state + ' on ' + payload.id + '.';
		} else {
			payload.message += separator + 'State set to ' + current.state + ' on ' + payload.id + '.';
		}
		separator = '\n\n';
	}
	if (current.assignment_group != previous.assignment_group) {
		payload.assignment_group = current.assignment_group;
		if (previous.assignment_group) {
			payload.message += separator + 'Assignment Group changed from ' + previous.assignment_group + ' to ' + current.assignment_group + ' on ' + payload.id + '.';
		} else {
			payload.message += separator + 'Assignment Group set to ' + current.assignment_group + ' on ' + payload.id + '.';
		}
		separator = '\n\n';
	}
	if (current.assigned_to != previous.assigned_to) {
		payload.assigned_to = current.assigned_to;
		if (previous.assigned_to) {
			payload.message += separator + 'Assigned To changed from ' + previous.assigned_to + ' to ' + current.assigned_to + ' on ' + payload.id + '.';
		} else {
			payload.message += separator + 'Assigned To set to ' + current.assigned_to + ' on ' + payload.id + '.';
		}
		separator = '\n\n';
	}
	if (current.comments) {
		payload.comments = current.comments;
		payload.message += separator + current.operator + ' has added a new comment to ' + payload.id + ':\n' + current.comments;
		separator = '\n\n';
	}
	if (current.work_notes) {
		payload.work_notes = current.work_notes;
		payload.message += separator + current.operator + ' has added a new work note to ' + payload.id + ':\n' + current.work_notes;
		separator = '\n\n';
	}

	return JSON.stringify(payload, null, '\t');
},

We still need a Flow Designer Action to call this function, but the function does almost all of the heavy lifting and our Action should now be quite simple to put together. Let’s open up the Flow Designer again, click on the New button, and select Action from the drop-down selection list. We’ll call our new Action Build Webhook Payload since that’s what it does, and we will define two inputs, the current and previous objects. Since we won’t be referencing any of the properties of those objects directly, this time we won’t have to invest the time in specifying all of the elements of the objects.

All we will need for our Action is to add a Script step with the same two inputs and the payload as an output. The script itself is just a call to the function that we just added to our Script Include.

(function execute(inputs, outputs) {
	var wru = new WebhookRegistryUtils();
	outputs.payload = wru.buildPayload(inputs.current, inputs.previous);
})(inputs, outputs);

We also need to define the payload as an output of the Action itself, and map the Action inputs to the Script step inputs and the Script step output to the Action output. That should complete our new Action.

Build Webhook Payload Action

Now we can go back into our Subflow and select this Action as a third step, nested under the second-step conditional that checks whether the query in the first step returned any records. For our fourth step, we will add a flow logic step that loops through all of the records from step 1, and inside of that loop, our fifth step will make the POST of the payload. I looked for an out-of-the-box simple HTTP POST Action already included on the platform, but I couldn’t find anything, so that’s another Action that we are going to have to produce ourselves. We already built much of the script for that when we built our testURL function; if we rework that code just a little bit, we can make it work just fine for both purposes.

// Send a simple test payload to the registry's URL
testURL: function(whrGR) {
	var jsonObj = {message: 'This is a test posting.'};
	return this.postWebhook(whrGR, JSON.stringify(jsonObj, null, 4));
},

// POST the payload to the registry's URL and return the response details
postWebhook: function(whrGR, payload) {
	var result = {};

	var request = new sn_ws.RESTMessageV2();
	request.setEndpoint(whrGR.getValue('url'));
	request.setHttpMethod('POST');
	request.setRequestHeader('Content-Type', 'application/json');
	request.setRequestHeader('Accept', 'application/json');
	request.setRequestBody(payload);
	var response = request.execute();
	result.status = response.getStatusCode();
	result.body = response.getBody();
	if (result.body) {
		try {
			result.obj = JSON.parse(result.body);
		} catch (e) {
			result.parse_error = e.toString();
		}
	}
	result.error = response.haveError();
	result.error_code = response.getErrorCode();
	result.error_message = response.getErrorMessage();

	return result;
},

Now the testURL function is just a call to our new postWebhook function, which holds most of the code previously contained in testURL. The testURL function will pass in our simple test payload, and when we create our new Action, we can call the very same postWebhook function, passing in a real payload. The script for our new Action will then simply be this:

(function execute(inputs, outputs) {
	var wru = new WebhookRegistryUtils();
	var result = wru.postWebhook(inputs.webhook_registry, inputs.payload);
	for (var key in result) {
		outputs[key] = result[key];
	}
})(inputs, outputs);

Stopping right here should give us enough to test out the process so far. This isn’t all that we want to do here, because we still want to record both the attempt and the response in our log table. However, since we have enough built out to test the POSTing of payloads, I think we should stop and do that. Afterwards, we can circle back and build out the activity logging and any Event logging that we might want to do for any kind of bad responses. But right now, we have enough built out that we can create a few Webhook Registrations and then go update some qualifying Incidents to see what actually happens.

Setting all of that up and verifying the results seems worthy of its own episode, so let’s wrap things up for today and pick that up next time out.

Event Management for ServiceNow, Revisited

“True prevention is not waiting for bad things to happen, it’s preventing things from happening in the first place.”
Don McPherson

Some time ago we built some utility functions to support reporting Events within the ServiceNow Platform. That was before the Flow Designer, though, so that effort did not include any support for that environment. We already have the script to do all of the heavy lifting from our earlier work, so it wouldn’t take much to create a Flow Designer Action that called that script to report an Event that occurred during Flow processing. We can call our new Action Log Event, and set up Action Inputs for all of the usual suspects.

Log Event Action Inputs

For our script step, we will basically set up the same inputs and then source them directly from the primary Action Inputs.

Script step inputs mapped to Action Inputs

Those of you who are paying attention will notice that we defined the additional_info field as a String even though it needs to be an Object when we make the call to our existing script. The assumption here is that the caller will provide a valid JSON String, which we can then turn into an Object in our script before we make the call. Here is the script to convert the String and then make the call.

(function execute(inputs, outputs) {
	if (inputs.additional_info) {
		try {
			inputs.additional_info = JSON.parse(inputs.additional_info);
		} catch(e) {
			//
		}
	}
	var seu = new ServerEventUtil();
	seu.logEvent(inputs.source, inputs.resource, inputs.metric_name, inputs.severity, inputs.description, inputs.additional_info);
})(inputs, outputs);

There are no outputs from this process, so this is the entire Action. Once we Save and Publish it, it will be available from the Action selection list, and then we can add Log Event steps anywhere in our Flows and Subflows where we want to report an Event. That was fairly quick, easy, and relatively painless. For those of you who would like to try it out on your own, here is an Update Set.

Flow Designer Counter

“If you have built castles in the air, your work need not be lost; that is where they should be. Now put the foundations under them.”
Henry David Thoreau

Since I first threw together my Flow Designer Array Iterator, I have had a number of occasions to put it to good use, but recently I needed an iterating index and did not have a String Array to use as the basis for the iteration. I thought about creating one just to have something to pass in to the Action, but then it occurred to me that it might be useful to have some kind of simple index action that wasn’t dependent on the presence of a String Array. Basically, it could be very similar to my Array Iterator, but without the array.

I thought about making a new Script Include just for these new Actions, but I decided to just add them to my existing SNHStringArrayUtils Script Include, mainly because I wanted to copy the existing functions from there anyway, so it was easier to drop the copies right into the same script. To start with, I copied the createIterator function and then hacked it up to create a new createCounter function.

createCounter: function(scratchpadId, counterId) {
	var response = {};

	var snhspu = new SNHScratchpadUtils();
	response = snhspu.setScratchpadProperty(scratchpadId, counterId, 0);
	if (response.success) {
		response.message = 'Counter ' + counterId + ' successfully created';
	}

	return response;
}

For the most part, that was just a matter of removing all of the code related to the String Array, which we are not using here, and then setting the value of our new counter to zero. Once that was done, I copied the existing iteratorNext function and hacked that up to create the counterNext function.

counterNext: function(scratchpadId, counterId) {
	var response = {};

	var snhspu = new SNHScratchpadUtils();
	response = snhspu.getScratchpadProperty(scratchpadId, counterId);
	if (response.success) {
		var counter = parseInt(response.property_value);
		var previous = counter;
		counter++;
		var current = counter;
		counter++;
		var next = counter;
		response = snhspu.setScratchpadProperty(scratchpadId, counterId, current);
		if (response.success) {
			response.previous_value = previous;
			response.current_value = current;
			response.next_value = next;
			response.message = 'The current value of the counter is ' + current;
		}
	}

	return response;
}

That was pretty much it for the scripting changes. With that all put to bed, I popped over to the Flow Designer and basically did the same thing with my array Actions that I did with the array functions: copied them and then modified them for my new purpose. I used my Create Array Iterator Action as the starting point for my new Create Counter Action, and then used my Array Iterator Next Action as the basis of my new Increment Counter Action. Once again, I spent more time removing things than I did adding things, and it all went relatively quickly. The only thing left to do now was to test it all out.
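
For reference, the script steps of those two copied Actions end up as thin wrappers around the new functions, along the lines of the sketch below; the input and output names shown here are assumptions rather than an exact copy of the Actions.

// Create Counter script step (input/output names are assumptions)
(function execute(inputs, outputs) {
	var sau = new SNHStringArrayUtils();
	var response = sau.createCounter(inputs.scratchpad_id, inputs.counter_id);
	outputs.success = response.success;
	outputs.message = response.message;
})(inputs, outputs);

// Increment Counter script step (input/output names are assumptions)
(function execute(inputs, outputs) {
	var sau = new SNHStringArrayUtils();
	var response = sau.counterNext(inputs.scratchpad_id, inputs.counter_id);
	outputs.success = response.success;
	outputs.message = response.message;
	outputs.previous_value = response.previous_value;
	outputs.current_value = response.current_value;
	outputs.next_value = response.next_value;
})(inputs, outputs);

Given the counterNext logic above, each run of Increment Counter should step the previous/current/next values from 0/1/2 to 1/2/3 and so on.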

As with the array Actions, you have to first have a Scratchpad on which to store the values, so I ran a quick test on my Create Scratchpad Action to get a fresh Scratchpad ID. With that in hand, I pulled up my new Create Counter Action and hit the Test button, entered my Scratchpad ID and Counter ID and ran the test.

Testing the Create Counter Action

That all seemed to work out OK, so I pulled up my new Increment Counter Action and gave that one a whirl as well. In fact, I ran the test a few times, just to run up the number.

Increment Counter Action test results

Well, everything seems to work. Obviously, there is a lot more testing to do in order to check out all of the error handling built into the processes, but the main purpose of the exercise seems to be satisfied, so that’s always a good thing. If you want to play around with all of the various parts and pieces, here’s an Update Set that should contain everything that you would need.

Note: With the introduction of Flow Variables, this component is no longer necessary.