Collaboration Store, Part XXXIII

“The only impossible journey is the one you never begin.”
Tony Robbins

Last time, we completed the construction of the process that will send a new version of an application out to all of the instances in the community. Now we need to come up with a mechanism to kick this process off whenever a new version of an application is published to the Host instance. The simplest way to do that would be to create a new Business Rule on one of the tables housing the artifacts of the new application. To make sure that all of the application artifacts have made their way to the Host, the safest table to target would be the Attachment table, since that is the last artifact to be saved or sent to the Host. Unfortunately, that table is used for all kinds of things, so we will need to make sure that only records attached to the application version table trigger the rule.

To create a new Business Rule, navigate to System Definition > Business Rules and click the New button to bring up the data entry form.

New Business Rule input form

We’ll call our new rule Distribute New Version, associate it with the sys_attachment table, run it async on insert, and check the Advanced checkbox so that we can include a script to run when the rule is triggered.

New Distribute New Version Business Rule

To limit the impact of the rule to just the inserts that we want, we need to add a couple of Filter Conditions, one for the Table and another for the Content Type.

Business Rule filter conditions

On the Advanced tab, we will want to add a Condition that will ensure that this rule will only run on a Host instance. The easiest way to do that is to simply compare the stock instance_name property with the scoped host_instance property. Those two values will only be the same on a Host instance.

gs.getProperty('instance_name') == gs.getProperty('x_11556_col_store.host_instance')

The other thing that we need to do on the Advanced tab is to enter the script that will launch our Subflow. The original way to do that was to utilize the FlowAPI, which is the way that I had coded it initially.

sn_fd.FlowAPI.startSubflow('Distribute_New_Application_Version', {new_version: current.getDisplayValue('table_sys_id'), attachment_id: current.getUniqueValue()});

However, later I decided that it was high time to start embracing the new, preferred approach, which is to use the ScriptableFlowRunner. The code for that is a little more complicated, but at some point the older way of doing this will undoubtedly go away, so we might as well start transitioning things in that direction. Here is the script for the Business Rule using this newer technique:

(function executeRule(current, previous) {
	try {
		// Launch the Distribute_New_Application_Version Subflow in the background,
		// passing it the sys_id of the version record and the sys_id of the attachment
		var result = sn_fd.FlowAPI.getRunner()
			.subflow('x_11556_col_store.Distribute_New_Application_Version')
			.inBackground()
			.withInputs({new_version: current.getDisplayValue('table_sys_id'), attachment_id: current.getUniqueValue()})
			.run();
	} catch (e) {
		gs.error('Business Rule Distribute New Version - Error running subflow Distribute_New_Application_Version: ' + e.getMessage());
	}
})(current, previous);

That completes the Business Rule, so all we need to do at this point is to save it. Now, the next time an XML attachment is created for a version record, our new Subflow will kick off, distributing the new version of the application to all instances in the community (except for the Host and the instance that produced the application). This also completes the application distribution process, so now it is time to release yet another Update Set so that those of you who have been nice enough to do a little testing can give it a go.

If you have just jumped into this and have not been following along, here is what you will need in order to install the app and try it out:

  • If you have not done so already, you will need to install the latest version of snh-form-fields, which you can find here.
  • You will also need to install this additional global component, which is not included in the scoped application.
  • Once you have successfully installed the prerequisites, you will then need to install the scoped application from this Update Set.
  • After the scoped application has been installed, using an admin account, navigate to Collaboration Store -> Collaboration Store Set-up and complete the set-up process for either a Host or Client instance. The Host instance will need to be set up first, as the set-up for a Client instance requires successful contact with a functioning Host.

And again, as I mentioned last time, in order to test out the distribution process, you will need three or more instances in your community so that one of the Client instances can publish the application to the Host instance and then the Host instance can distribute the new version out to all of the other Client instances. As always, all feedback is welcome and appreciated, and if you are able to get everything to actually work, please let me know in the comments. Thanks!

Next time, we will take a look at where we go from here. Thanks again to all of you who have taken the time to help me out and have let me know what you have found.

Automatically Link Referenced Tasks, Improved

“The secret of change is to focus all of your energy not on fighting the old, but on building the new”
Socrates

After running my LinkReferencedTasks Business Rule for a while, it has become apparent that there was a flaw in my approach. Whenever someone separates different task numbers with a special character, neither one gets picked up in the process. For example, if you enter something like REQ0010005/RITM0010009, the process that converts all special characters to an empty string ends up converting that to REQ0010005RITM0010009, which starts with REQ, but is not a valid Task number. A little Corrective Maintenance should solve that problem. Let’s convert this:

text = text.replace(/\n/g, ' ');
text = text.replace(/[^A-Z0-9 ]/g, '');

… to this:

text = text.replace(/[^A-Z0-9 ]/g, ' ');

Originally, I had converted all line feeds to spaces and then all characters that were not letters, numbers, or spaces to an empty string. Once I decided to change all characters that were not letters, numbers, or spaces to a space, I no longer needed the line above, as line feeds fall into that same category. That solved that problem.
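
To see the difference in action, here is a quick sketch that you can run in a background script; the sample sentence and task numbers are just made-up values for illustration:

var text = 'Please check REQ0010005/RITM0010009 for status'.toUpperCase();

// Old approach: line feeds become spaces, then every other character that is not a
// letter, number, or space is removed, fusing the two numbers into one invalid word
// Result: PLEASE CHECK REQ0010005RITM0010009 FOR STATUS
gs.info(text.replace(/\n/g, ' ').replace(/[^A-Z0-9 ]/g, ''));

// New approach: anything that is not a letter, number, or space (including line feeds)
// becomes a space, so both task numbers survive as separate words
// Result: PLEASE CHECK REQ0010005 RITM0010009 FOR STATUS
gs.info(text.replace(/[^A-Z0-9 ]/g, ' '));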

However, while I was in there, I decided to do a little Perfective Maintenance as well. By changing all of the special characters to single spaces, I thought that I might end up with several spaces in a row, which would result in one or more empty strings in my array of words. To screen those out, I thought about discarding words with a length of zero, but then it occurred to me that short words of any kind will not be task numbers, so I settled on an arbitrary minimum length of 6 instead. Now the code that unduplicates the list of words looks like this:

var unduplicated = {};
for (var i=0; i<words.length; i++) {
	var thisWord = words[i];
	if (thisWord.length > 5) {
		unduplicated[thisWord] = thisWord;
	}
}

This should reduce the clutter quite a bit and minimize the number of words that need to be checked to see if they start with one of the specified prefixes.

And finally, I added an Enhancement, which I had actually thought about earlier, but didn’t implement. This enhancement allows you to specify the relationship type instead of just accepting the hard-coded Investigates relationship that I had put in the original. For backward compatibility, I did not want to make that mandatory, so I kept that as a default for those implementations that did not want to specify their own. The new version of the Script Include now looks like this:

var TaskReferenceUtils = Class.create();
TaskReferenceUtils.prototype = {
    initialize: function() {
    },

	linkReferencedTasks: function(taskGR, prefix, relationship) {
		if (!relationship) {
			relationship = 'd80dc65b0a25810200fe91a7c64e9cac';
		}
		var text = taskGR.short_description + ' ' + taskGR.description;
		text = text.toUpperCase();
		text = text.replace(/[^A-Z0-9 ]/g, ' ');
		var words = text.split(' ');
		var unduplicated = {};
		for (var i=0; i<words.length; i++) {
			var thisWord = words[i];
			if (thisWord.length > 5) {
				unduplicated[thisWord] = thisWord;
			}
		}
		for (var word in unduplicated) {
			for (var x in prefix) {
				if (word.startsWith(prefix[x])) {
					this._findTask(word, taskGR.getUniqueValue(), relationship);
				}
			}
		}
	},

	_findTask: function(number, child, relationship) {
		var taskGR = new GlideRecord('task');
		if (taskGR.get('number', number)) {
			this._documentRelationship(taskGR.getUniqueValue(), child, relationship);
		}
	},

	_documentRelationship: function(parent, child, relationship) {
		var relGR = new GlideRecord('task_rel_task');
		relGR.addQuery('parent', parent);
		relGR.addQuery('child', child);
		relGR.query();
		if (!relGR.next()) {
			relGR.initialize();
			relGR.parent = parent;
			relGR.child = child;
			relGR.type = relationship;
			relGR.insert();
		}
	},

    type: 'TaskReferenceUtils'
};
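
If you want to take advantage of the new argument, the calling Business Rule just needs to pass in a third value. Here is a minimal sketch using the same prefixes as the original rule; the final argument is only a placeholder for the sys_id of whichever task_rel_task relationship type you want to use:

(function executeRule(current, previous) {
	// The third argument is optional; leave it off to fall back to the default Investigates relationship
	new TaskReferenceUtils().linkReferencedTasks(current, ['REQ','RITM','SCTASK','CHG'], 'your_relationship_type_sys_id');
})(current, previous);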

So, a little fix here, a little improvement there, and a brand new feature over there, and suddenly we have a new version better than the last. Stuff’s getting better. Stuff’s getting better every day. Here’s the improved Update Set.

Automatically Link Referenced Tasks

“It is necessity and not pleasure that compels us.”
Dante Alighieri

Occasionally, we will get an Incident Ticket that references another task present in the system. This is usually a status request on a Service Catalog Request or a problem with a recent Change. It would be nice to be able to just click on those recognizable task numbers, but both the Short Description and the Description are plain text fields where that is not an option. There is, however, a handy out-of-the-box UI Formatter that you can include on your Incident Form that provides the capability to link other tasks to your Incident. The name of the Formatter is Task Relations, and I like to drag it onto the Incident Form in the Forms Designer right underneath the Formatter for Contextual Search Results. Once you have that Formatter present on your Incident Form, you can click on the green plus sign and link other tasks of various types to your Incident.

That’s a really cool feature that allows you to click on the related tasks to open them up, but being the lazy developer that I am, I would prefer not to have to go through all of the manual work to set up all of the links to the things mentioned in the text of the ticket. In my ideal world, the system would be smart enough to read the text, recognize a task number, and then build the link for me so that I don’t have to do all of that work by hand, and also to make sure that I did not miss anything. How hard could that be?

My thought was that I could create a Business Rule linked to the Incident table that would examine the two primary text fields (Short Description and Description), look for anything that appeared to be a task number, search the Task table to see if it really was a task number, and if so, build the link for me. It seemed like a relatively simple thing to do, so I went to work.

It felt as if there might be a lot of code involved, so instead of putting all of that in the Business Rule itself, I decided to build a Script Include that I could call from the Business Rule. I thought that if I could make the function generic enough, I might be able to reuse it for other Business Rules linked to different Task tables. Plus, putting all of the code in the Script Include keeps the Business Rule a lot cleaner. Here are the things that I thought that I needed to do in order to get this all to work:

  • Combine the two text fields on the Incident into a single string variable for examination,
  • Convert the text to upper case,
  • Convert all line feeds to spaces,
  • Remove all characters that are not letters, numbers, or spaces,
  • Split the string by spaces creating an array of words,
  • Unduplicate the array of words so that we only looked at each unique word once,
  • Examine each word to see if it started with a known task number prefix,
  • Read the Task table for every word that started with a known task number prefix, and
  • Build a relationship record for every Task record that was found on the Task table.

All we need to do now is turn that into code.

Combine the two text fields on the Incident into a single string variable for examination:

var text = taskGR.short_description + ' ' + taskGR.description;

Convert the text to upper case:

text = text.toUpperCase();

Convert all line feeds to spaces:

text = text.replace(/\n/g, ' ');

Remove all characters that are not letters, numbers, or spaces:

text = text.replace(/[^A-Z0-9 ]/g, '');

Split the string by spaces creating an array of words:

var words = text.split(' ');

Unduplicate the array of words so that we only looked at each unique word once:

var unduplicated = {};
for (var i=0; i<words.length; i++) {
	var thisWord = words[i];
	unduplicated[thisWord] = thisWord;
}

Examine each word to see if it started with a known task number prefix:

for (var word in unduplicated) {
	for (var x in prefix) {
		if (word.startsWith(prefix[x])) {
			this._findTask(word, taskGR.getUniqueValue());
		}
	}
}

Read the Task table for every word that started with a known task number prefix:

_findTask: function(number, child) {
	var taskGR = new GlideRecord('task');
	if (taskGR.get('number', number)) {
		this._documentRelationship(taskGR.getUniqueValue(), child);
	}
},

Build a relationship record for every Task record that was found on the Task table:

_documentRelationship: function(parent, child) {
	var relGR = new GlideRecord('task_rel_task');
	relGR.addQuery('parent', parent);
	relGR.addQuery('child', child);
	relGR.query();
	if (!relGR.next()) {
		relGR.initialize();
		relGR.parent = parent;
		relGR.child = child;
		relGR.type = 'd80dc65b0a25810200fe91a7c64e9cac';
		relGR.insert();
	}
},

Before I create a relationship record, I want to make sure that there isn’t already a relationship record out there, so I do a quick query just to check before I commit to inserting a new one. The two records don’t need to be linked more than once. I also hard-coded the relationship type, which works for my current purpose, but if I ever want to expand this out to other use cases, I may eventually want to pass that in as an argument as I did with the task number prefixes. Here is the whole thing, all put together:

var TaskReferenceUtils = Class.create();
TaskReferenceUtils.prototype = {
    initialize: function() {
    },

	linkReferencedTasks: function(taskGR, prefix) {
		var text = taskGR.short_description + ' ' + taskGR.description;
		text = text.toUpperCase();
		text = text.replace(/\n/g, ' ');
		text = text.replace(/[^A-Z0-9 ]/g, '');
		var words = text.split(' ');
		var unduplicated = {};
		for (var i=0; i<words.length; i++) {
			var thisWord = words[i];
			unduplicated[thisWord] = thisWord;
		}
		for (var word in unduplicated) {
			for (var x in prefix) {
				if (word.startsWith(prefix[x])) {
					this._findTask(word, taskGR.getUniqueValue());
				}
			}
		}
	},

	_findTask: function(number, child) {
		var taskGR = new GlideRecord('task');
		if (taskGR.get('number', number)) {
			this._documentRelationship(taskGR.getUniqueValue(), child);
		}
	},

	_documentRelationship: function(parent, child) {
		var relGR = new GlideRecord('task_rel_task');
		relGR.addQuery('parent', parent);
		relGR.addQuery('child', child);
		relGR.query();
		if (!relGR.next()) {
			relGR.initialize();
			relGR.parent = parent;
			relGR.child = child;
			relGR.type = 'd80dc65b0a25810200fe91a7c64e9cac';
			relGR.insert();
		}
	},

    type: 'TaskReferenceUtils'
};

Now that we have our Script Include, we need to build the Business Rule that calls it. For my purpose, I added a Business Rule to the Incident table and called it LinkReferencedTasks. I checked the Advanced checkbox and made it active async on Insert whenever there was text in either of the two description fields. I could have triggered it on Update as well, but in my experience, the Incident description is usually captured when the Incident is created and rarely updated after that.
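
If you would rather express that trigger as a script condition instead of filter conditions, something along these lines should be equivalent (just a sketch; the original rule most likely uses the condition builder for this):

!current.short_description.nil() || !current.description.nil()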

LinkReferencedTasks Business Rule

Under the Advanced tab, I entered the following script:

(function executeRule(current, previous) {
	new TaskReferenceUtils().linkReferencedTasks(current, ['REQ','RITM','SCTASK','CHG']);
})(current, previous);

In addition to the current GlideRecord, you also need to pass in a string array of task table prefixes which will be used somewhat like a filter to only link tasks that start with those values. If you want a different mix of task types, you can just update that list. That’s it. We are done. Well, maybe we had better test it out first, but the building part is done anyway. Testing should be simple enough: we just need to find an existing task that we want to reference and then include that task number in the text of a new incident. Let’s do that now.

New test Incident with references to other tasks in the Short Description field

Now all we have to do is hit that Submit button and see if those tasks referenced in the text are automatically linked to the Incident. Seems as if a little drum-roll would be appropriate here …

Test results for the new Business Rule and Script Include

Well, nothing ever goes right the very first time! It looks like we managed to create a link to the RITM, but not to the Request. Fortunately, it turns out that this is a tester error and not a developer error. When I typed in the Request number, I missed a zero. The actual Request number is REQ0010005, not REQ001005. Of course, my explanation for that is that I did that on purpose to demonstrate that it will only link real requests, and if your request number is not a real request, then it won’t bother to attempt to create a link to it. That’s it — I did it on purpose — it was all part of the plan. You know that you have to test for failure as well as success — I just did it all in one test because I am so efficient. Sometimes I even amaze myself!

Anyway, it all seems to work, so for those of you who like to play along at home, here’s an Update Set with all of the parts.

Update: There is a better (corrected) version here.

Fun with Webhooks, Part VI

“The most exciting phrase to hear in science, the one that heralds discoveries, is not ‘Eureka!’ but ‘Now that’s funny …'”
Isaac Asimov

Even though we are not quite finished building out our new Subflow just yet, we have enough elements in place that we can try out what we have so far, and see if it actually works. The simplest way to do that would be to create a Webhook Registry that watches a single Incident, and then go make a qualifying change to that Incident and see what happens. Let’s go find an open Incident and then we can set up our registry.

In my instance, INC0000002 seemed like a likely candidate, so I pulled up the list of Webhook Registrations, clicked on the New button, and created the following new Webhook Registry entry to track changes to that specific Incident.

New Webhook Registry entry

For the URL, I used that same webhook.site address that I had used before so that I could see the result on the other end of the transaction. Once I had saved my new Webhook Registry entry, all I needed to do in order to test it out was to make a change to the Incident. For my first test, I simply changed the Assignment Group from Network to Hardware, which also then removed the Assigned to field value, since the original person specified in that field was not a member of the new Assignment Group. After making the change, I went over to webhook.site to see if anything showed up as a result of that modification. Sure enough, a new POST had just been received by that site shortly after I saved the record.

New Webhook posting after updating an Incident

Although that proves that the process does indeed work as intended, I did notice one thing that is going to require a little bit of rework. Take a look at the second line of the message property:

Assigned To changed from Howard Johnson to  on INC0000002.

The Assigned to field changed from having a value to not having a value. The code that we have in our buildPayload function doesn’t really handle that circumstance in a way that produces the appropriate text to explain that in plain English. Here is the logic that creates that portion of the message:

if (current.assigned_to != previous.assigned_to) {
	payload.assigned_to = current.assigned_to;
	if (previous.assigned_to) {
		payload.message += separator + 'Assigned To changed from ' + previous.assigned_to + ' to ' + current.assigned_to + ' on ' + payload.id + '.';
	} else {
		payload.message += separator + 'Assigned To set to ' + current.assigned_to + ' on ' + payload.id + '.';
	}
	separator = '\n\n';
}

There is a check to see if the previous value was blank, but there is no check to see if the new value is blank. It wouldn’t take much to throw that in there, though, so let’s go ahead and do that now while we are thinking about it.

if (current.assigned_to != previous.assigned_to) {
	payload.assigned_to = current.assigned_to;
	if (previous.assigned_to) {
		if (current.assigned_to) {
			payload.message += separator + 'Assigned To changed from ' + previous.assigned_to + ' to ' + current.assigned_to + ' on ' + payload.id + '.';
		} else {
			payload.message += separator + 'Assigned To ' + previous.assigned_to + ' removed.';
		}
	} else {
		payload.message += separator + 'Assigned To set to ' + current.assigned_to + ' on ' + payload.id + '.';
	}
	separator = '\n\n';
}

That’s better. While we are at it, we might as well make the same change to the code for the Assignment group, as that is pretty much a line for line copy of the code for the Assigned to field. Everything else looks to be OK. The State is mandatory, so it will never have a new blank value, and the Comments and Work Notes fields are based on new entries and not on changes to existing entries, so if either of those are ever blank, we won’t be doing anything with them at all.
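
For reference, here is what that same adjustment might look like for the Assignment group block; this is just a sketch based on the assumption that the original really is a line for line copy of the Assigned to logic:

if (current.assignment_group != previous.assignment_group) {
	payload.assignment_group = current.assignment_group;
	if (previous.assignment_group) {
		if (current.assignment_group) {
			payload.message += separator + 'Assignment Group changed from ' + previous.assignment_group + ' to ' + current.assignment_group + ' on ' + payload.id + '.';
		} else {
			payload.message += separator + 'Assignment Group ' + previous.assignment_group + ' removed.';
		}
	} else {
		payload.message += separator + 'Assignment Group set to ' + current.assignment_group + ' on ' + payload.id + '.';
	}
	separator = '\n\n';
}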

Still, other than that one little annoying language problem, the test was actually quite successful. Our Business Rule fired, the change was detected, our new Subflow was launched, our payload was generated, and the payload was successfully POSTed to the appropriate URL. All in all, I would have to say that that was a pretty good first test of the process. Obviously, we need to do a lot more testing, but this was a great start.

At this point, all I really wanted to do was to make sure that everything was working so far, which I think we accomplished. Before we do too much more testing, though, I think we need to go back and finish up the rest of the Subflow. We still have to add in the step that logs the activity, and we also have to add any error handling for when things don’t go as planned. Once we build all of that out, then we can do a lot more testing to validate the entire process from end to end. It was good to take a quick break and make sure that everything was working up to this point, but now that we know that it is, we need to get back to work on finishing up our development efforts.

Fun with Webhooks, Part IV

“Try not to become a man of success, but rather try to become a man of value.”
Albert Einstein

Last time, we came to a fork in the road, not knowing whether it would be better to jump into the My Webhooks Service Portal widget, or to start working out the details of the process that actually POSTs the Webhooks. At this point, I am thinking that the My Webhooks widget will end up being a fairly simple clone of my old My Delegates widget, so that may not be all that interesting. POSTing the Webhooks, on the other hand, does sound like it might be rather challenging, so let’s start there.

When I first considered attempting this, my main question was whether it would be better to handle this operation in a Business Rule or by building something out in the Flow Designer. After doing a little experimenting with each, I later came to realize that the best alternative involved a little bit of both. In a Business Rule, you have access to both the current and previous values of every field in the record; that information is not available in the Flow Designer. In fact, it’s not available in a Business Rule, either, if you attempt to run it async. But if you don’t run it async, then you are holding everything up on the browser while you wait for everything to get posted. That’s not very good, either. What I ultimately decided to do was to start out with a Business Rule running before the update, gather up all of the needed current and previous values, and then pass them to a Subflow, which runs async in the background.

My first attempt was to just pass the current and previous objects straight into my Subflow, but that failed miserably. Apparently, when you pass a GlideRecord into a Subflow, you are just passing a reference, not the entire record, and then when the Subflow starts up, it uses that reference to fetch the data from the database. That’s not the data that I wanted, though. I want the data as it was before the database was updated. I had to take a different route with that part of it, but the basic rule remained the same.

Business Rule to POST webhooks

The rule is attached to the Incident table, runs on both Insert and Update, and is triggered whenever there is a change to any of the following fields: State, Assignment Group, Assigned to, Work notes, or Additional comments.

To get around my GlideRecord issue, I wound up creating two objects, one for current and one for previous, which I populated with data that I pulled out of the original GlideRecords. When I passed those two objects to my Subflow, everything was there so that I could do what I wanted to do. Populating the objects turned out to be quite a bit of code, so rather than put that all in my Business Rule, I created a function in my Script Include called buildSubflowInputs and I passed it current and previous as arguments. That simplified the Business Rule script quite a bit.

(function executeRule(current, previous) {
	var wru = new WebhookRegistryUtils();
	sn_fd.FlowAPI.startSubflow('Webhook_Poster', wru.buildSubflowInputs(current, previous));
})(current, previous);

In the Script Include, things got a lot more complicated. Since I was going to be turning both the current and previous GlideRecords into objects, I thought it would be helpful to create a function that did that, which I could call once for each. That function would pull out the values for the Sys ID, Number, State, Caller, Assignment group, and Assigned to fields and return an object populated with those values.

getIncidentValues: function(incidentGR) {
	var values = {};

	values.sys_id = incidentGR.getValue('sys_id');
	values.number = incidentGR.getDisplayValue('number');
	if (!incidentGR.short_description.nil()) {
		values.short_description = incidentGR.getDisplayValue('short_description');
	}
	if (!incidentGR.state.nil()) {
		values.state = incidentGR.getDisplayValue('state');
	}
	if (!incidentGR.caller_id.nil()) {
		values.caller = incidentGR.getDisplayValue('caller_id');
		values.caller_id = incidentGR.getValue('caller_id');
	}
	if (!incidentGR.assignment_group.nil()) {
		values.assignment_group = incidentGR.getDisplayValue('assignment_group');
		values.assignment_group_id = incidentGR.getValue('assignment_group');
	}
	if (!incidentGR.assigned_to.nil()) {
		values.assigned_to = incidentGR.getDisplayValue('assigned_to');
		values.assigned_to_id = incidentGR.getValue('assigned_to');
	}

	return values;
},

Since my Business Rule could be fired by either an Update or an Insert, I had to allow for the possibility that there was no previous GlideRecord. I could arbitrarily call the above function for current, but I needed to check to make sure that there actually was a previous before making that call. I also wanted to add some additional data to the current object, including the name of the person making the changes. The field sys_updated_by contains the username of that person, so to get the actual name I had to use that in a query of the sys_user table to access that data.

buildSubflowInputs: function(currentGR, previousGR) {
	var inputs = {};

	inputs.current = this.getIncidentValues(currentGR);
	if (currentGR.isNewRecord()) {
		inputs.previous = {};
	} else {
		inputs.previous = this.getIncidentValues(previousGR);
	}
	inputs.current.operator = currentGR.getDisplayValue('sys_updated_by');
	var userGR = new GlideRecord('sys_user');
	if (userGR.get('user_name', inputs.current.operator)) {
		inputs.current.operator = userGR.getDisplayValue('name');
	}

	return inputs;
},

One other thing that I wanted to track was any new comments or work_notes, and that turned out to be the most complicated code of all. If you try the normal getValue or getDisplayValue methods on any of these Journal Fields, you end up with all of the comments ever made. I just wanted the last one. I had to do a little searching around to find it, but there is a function called getJournalEntry that you can use on these fields, and if you pass it a 1 as an argument, it will return the most recent entry. However, what you get back is not just the entry; it also includes a collection of metadata about the entry, which I did not want either. To get rid of that, I had to find the first line feed and then pick off all of the text that followed it. Putting all of that together, you end up with the following code:

if (!currentGR.comments.nil()) {
	var c = currentGR.comments.getJournalEntry(1);
	inputs.current.comments = c.substring(c.indexOf('\n') + 1);
}
if (!currentGR.work_notes.nil()) {
	var w = currentGR.work_notes.getJournalEntry(1);
	inputs.current.work_notes = w.substring(w.indexOf('\n') + 1);
}
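
For reference, the raw value that comes back from getJournalEntry(1) looks something like the example below; the timestamp, name, and field label shown here are representative values, not actual data. Everything up to and including that first line feed is the metadata that the substring call strips off.

2023-04-01 14:22:10 - Beth Anglin (Additional comments)
Customer called to ask for an update on this ticket.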

That’s pretty much all that there is to the new Business Rule and the needed additions to the Script Include. Now we need to start tackling the Subflow that will actually take these current and previous objects and do something with them. That’s a bit much to jump into at this point, so we’ll wrap things up here for now and save all of that for our next time out.

Fun with Outbound REST Events, Part V

“The best preparation for good work tomorrow is to do good work today.”
Elbert Hubbard

At the end of the last installment in this series, I mentioned two possible options for the next item on which to focus our energies. At the time, I wasn’t really sure which direction would be the preferable choice, but now that it is time to fish or cut bait, a decision needs to be made. Given that the entire purpose of this exercise is to demonstrate the use of Event Management practices on internal ServiceNow processes, I believe it would be good to go ahead and wrap up our example Use Case at this point so that we can devote the remainder of our time exclusively to the Event Management aspects. All that is really left to do in order to complete our address verification scenario is to add the form validation to the User Profile form. That process will leverage the new Script Include and Outbound REST Message that we have already completed.

There are two different ways that we can go about this: we can use an onSubmit Business Rule on the server side, or we can use an onSubmit Client Script on the client side. The Business Rule route is actually a little easier, as you have access to both the current and previous versions of the GlideRecord, and you can call the Script Include directly, without the need for GlideAjax. My problem with that, though, from a User Experience perspective, is that you have to submit the entire form to the server for processing, which then gets reloaded if you have validation issues. My preference is to validate the form right where it sits on the client side, before the form ever gets submitted to the server. For that, we need to build a Client Script.

In fact, for our purpose, we will need two Client Scripts, one an onLoad script and the other an onSubmit script. The reason that we have to have an onLoad script is because you do not have access to the previous field values on the client side like you do with a server-side Business Rule. We will need to snag those values and stuff them somewhere for safekeeping as soon as the form loads. The ServiceNow platform provides a place for just that sort of thing called the g_scratchpad. You can pretty much throw whatever you want in there and it will be available for use until the form is reloaded. The entire onLoad script is just a few short lines of code.

function onLoad() {
	g_scratchpad.originalStreet = g_form.getValue('street');
	g_scratchpad.originalCity = g_form.getValue('city');
	g_scratchpad.originalState = g_form.getValue('state');
	g_scratchpad.originalZip = g_form.getValue('zip');
}

That’s all there is to that. With those initial values safely tucked away, we can then pull them back out later on and refer to them in our onSubmit script to see if anything has changed. Before we start in on our onSubmit script, however, we need to go back to our Script Include and add just a bit of code. When we first built our Script Include, we did not set it up to be client callable, but we are going to need to do that now so that we can access it via GlideAjax. It’s a simple checkbox on the Script Include form, which triggers a slight change in the prototype for the script. That change is handled automatically for any new script, but since we have been working on ours for a while, checking the box may not actually alter the existing code. In that case, you just need to modify the prototype line yourself to look like this:

AddressValidationUtils.prototype = Object.extendsObject(AbstractAjaxProcessor, {

Also, you will need to go down to the very bottom of the script and insert a closing paren in between the final curly brace and the terminating semi-colon. Once that’s done, we can add a new function that we can call from the client side.
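
Assuming your Script Include ends with the usual type property, the bottom of the script should wind up looking something like this once that closing paren is in place:

    type: 'AddressValidationUtils'
});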

validateAddressViaClient: function() {
	var street = this.getParameter('sysparm_street');
	var city = this.getParameter('sysparm_city');
	var state = this.getParameter('sysparm_state');
	var zip = this.getParameter('sysparm_zip');
	return JSON.stringify(this.validateAddress(street, city, state, zip));
},

This function just grabs all of the parameters passed via GlideAjax and passes them to our existing function, and then turns the response object into a string that can be returned in the XML Ajax response. Now we are all set up to receive calls from the client side.

Being a validation script, our onSubmit script will need to prevent the submission of the form in the event that any validation errors are detected. Without an Ajax call back to the server, that’s easily accomplished by returning false from the onSubmit function. However, making an asynchronous Ajax call means you are leaving your function without the answer in hand, so you don’t know whether or not any issues were detected until the response comes back, which will be in a completely different thread. Although you could utilize the getXMLWait() method to simply wait for the answer, that approach is quite frowned upon in Client Scripts for a variety of reasons, so we do not want to go down that road. Instead, we will use a modified version of this technique.

The approach is to cancel submission of the form before making the Ajax call, and then when the response comes in, submit the form a second time if all is well. Of course, submitting the form again will launch the onSubmit script again, so we need to make it smart enough to know that this is the second submit and to not start the whole process all over again. To accomplish that, we use yet another g_scratchpad property to let the script know that validation has already taken place. Here is the main onSubmit script:

var actionName; // declared outside onSubmit() so that processXMLAnswer() can reference it when resubmitting

function onSubmit() {
	var submitForm = true;

	actionName = g_form.getActionName();
	if (!g_scratchpad.isFormValid) {
		submitForm = checkAddress();
	}

	return submitForm;
}

On the first submit, we default the submitForm variable to true, capture the name of the button that was pushed to submit the form so that we can use it when we submit again later, and then check to see if validation has already taken place and passed by testing the isFormValid scratchpad property. In the case of the first submit, isFormValid has not been set to true, so we check the address to see if it needs to be validated. If it does, then the submitForm variable will be set to false, and the form will not be submitted.

Assuming that it does need to be validated, the checkAddress function will make the Ajax call and cancel the original form submission by returning false. When the Ajax response comes in, if the address is valid, the response function will then submit the form a second time using the saved actionName after setting the isFormValid scratchpad property to true. When the onSubmit function then starts again due to the second submit, it will bypass the address check and simply allow the form to submit due to the isFormValid scratchpad property being set to true.

It’s all a little convoluted, but it works. Here is the checkAddress function:

function checkAddress() {
	var submitForm = true;

	var street = g_form.getValue('street');
	var city = g_form.getValue('city');
	var state = g_form.getValue('state');
	var zip = g_form.getValue('zip');
	if (street || city || state || zip) {
		if (street != g_scratchpad.originalStreet || city != g_scratchpad.originalCity || state != g_scratchpad.originalState || zip != g_scratchpad.originalZip) {
			GlideUI.get().clearOutputMessages();
			var ga = new GlideAjax("AddressValidationUtils");
			ga.addParam('sysparm_name', 'validateAddressViaClient');
			ga.addParam('sysparm_street', street);
			ga.addParam('sysparm_city', city);
			ga.addParam('sysparm_state', state);
			ga.addParam('sysparm_zip', zip);
			ga.getXMLAnswer(processXMLAnswer);
			submitForm = false;
		}
	}

	return submitForm;
}

Just like in the main onSubmit function, we first default the submitForm variable to true, and then we grab all of the values for the four address components off of the g_form object. The first thing that we check is if there is even any data in any of the four fields. If they are all empty, then there is nothing to validate. If there is some data there, then the next thing that we check is if it is any different than the values that we squirreled away in our onLoad script. If there are no changes, then again there is no need for validation. But if there is data there and it has changed in any way, now we are going to be making that GlideAjax call. Most of that is pretty standard GlideAjax stuff, but we also clear out any error messages on the screen from previous attempts and set the submitForm variable to false to kill the original form submission.

To process the Ajax response, we have yet another function, processXMLAnswer:

function processXMLAnswer(answer) {
	var response = JSON.parse(answer);
	if (response.result == 'invalid') {
		g_form.addErrorMessage('Unable to verify address entered using the US Address validation service');
		g_form.showFieldMsg('street', 'Unverifiable address', 'error');
		g_form.showFieldMsg('city', 'Unverifiable address', 'error');
		g_form.showFieldMsg('state', 'Unverifiable address', 'error');
		g_form.showFieldMsg('zip', 'Unverifiable address', 'error');
	} else {
		if (response.result == 'valid') {
			g_form.setValue('street', response.street);
			g_form.setValue('city', response.city);
			g_form.setValue('state', response.state);
			g_form.setValue('zip', response.zip);
		}
		g_scratchpad.isFormValid = true;
		g_form.submit(actionName);
	}
}

The response comes back in the form of a string, so the first thing that we have to do is convert it back to an object. If the result property of that object is ‘invalid’, then we leave the form unsubmitted and throw a few error messages up on the screen; otherwise, we are going to submit the form a second time. But before we do that, we check to see if the result property is ‘valid’, in which case we overlay the user’s input with the clean address values returned by the service. After that, we set our isFormValid scratchpad property to true and resubmit the form using the saved actionName.

It’s all a little complicated, having to pass through the onSubmit script twice due to the double submit, but it all makes sense if you really think about it. Of course, we won’t really know if it works until we try it, so let’s pull up a user’s profile and give it a shot.

For our first test, let’s see if we can get a validation error. We can use the test address that we have been using up to this point, but let’s change the state from FL to TX and let’s drop the zip code entirely. Also, let’s put everything in lower case, just for fun. OK, let’s see what happens.

Test using invalid address

Having both form-level and field-level error messages is probably a little overkill, but everything seems to have worked. Now, let’s change the state back to FL and see if that is enough to consider it a valid address.

User profile after address validation

Not only did it pass validation, it also corrected our capitalization and added the missing zip code. That’s actually pretty slick. This wasn’t really the point of this series, but I like this address validation feature. I was originally just looking for something that might have a failure to showcase the Event Management stuff, but this turned out to be a pretty handy little addition that could be useful in a number of other places, such as the Building and Location forms.

This now completes the example working feature that has the potential for failure. From this point on, we can focus on what we came here for, which is to log and process Events that originate in ServiceNow. Next time out, we will start in on the logging.

Generic Feedback Widget, Part V

“If at first you don’t succeed, you are running about average.”
M.H. Alderson

I looked at several different ways to solve my problem with the Generic Feedback Widget, but I couldn’t come up with anything that didn’t involve inactivating or altering the ACL that was at the heart of the issue. Finally, I settled on a plan that would at least involve minimally invasive alterations to the ACL. The plan was pretty simple: create an obscure User Preference and set it to true just before accessing the live_group_profile record, and then delete the preference as soon as the record was obtained. The alteration to the ACL, then, would be to check for that preference before applying the ACL. The updated version of the ACL script now looked like this:

if (gs.getPreference('snh.live.group.read.authorization') == 'true') {
	answer = true;
} else {
	var gr = new GlideRecord('live_group_member');
	gr.addQuery('member', GlideappLiveProfile().getID());
	gr.addQuery('group', current.sys_id);
	gr.addQuery('state', 'admin').addOrCondition('state', 'active');
	gr.query();
	answer = gr.next();
}

The first thing that we do is check for the preference, and if it’s there, then we bypass the original code; otherwise, things proceed as they always have. I don’t really like tinkering with stock components if I can avoid it, mainly because of the subsequent issues with patches and upgrades potentially skipping an upgrade of anything that you have touched. Still, this one seemed to be unavoidable if I wanted to salvage the original intent and still do what I wanted to do.

The next thing that I needed to do was to set the preference just before attempting the read operation, and then remove it as soon as I was done. That code turned out to look like this:

gs.getUser().setPreference('snh.live.group.read.authorization', 'true');
grp = new GlideRecord('live_group_profile');
grp.addQuery('table', table);
grp.addQuery('document', sys_id);
grp.query();
if (grp.next()) {
	groupId = grp.getValue('sys_id');
}
gs.getUser().setPreference('snh.live.group.read.authorization', null);

I ended up pulling that out of the widget and putting it into its own Script Include, mainly to tuck the specialized code away and out of sight. Anyway, it all sounded like a great plan and all I needed to do now was to test it out, so I did. And it failed. So much for my great plan.

It took a little digging, but I finally figured out that the ACL was not the only thing keeping people from outside the group from reading the group profile record. There are also a number of Business Rules that do pretty much the same thing. I spent a little time combing through all of those looking for ways to hack around them, and then finally decided that, for my purposes anyway, I really didn’t need to be running any Business Rules at all. So I added one more line to my read script to turn off all of the Business Rules.

gs.getUser().setPreference('snh.live.group.read.authorization', 'true');
grp = new GlideRecord('live_group_profile');
grp.setWorkflow(false);
grp.addQuery('table', table);
grp.addQuery('document', sys_id);
grp.query();
if (grp.next()) {
	groupId = grp.getValue('sys_id');
}
gs.getUser().setPreference('snh.live.group.read.authorization', null);
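
Tucked away in its own Script Include, the whole read operation might look something like this; the class and function names here are hypothetical, since the originals are not shown, and the sketch just wraps the same read logic, setWorkflow(false) call and all:

var LiveGroupProfileUtils = Class.create();
LiveGroupProfileUtils.prototype = {
	initialize: function() {
	},

	// Returns the sys_id of the live_group_profile record for the given table/document,
	// temporarily setting the preference that the modified ACL checks and skipping Business Rules
	getGroupId: function(table, sys_id) {
		var groupId = '';
		gs.getUser().setPreference('snh.live.group.read.authorization', 'true');
		var grp = new GlideRecord('live_group_profile');
		grp.setWorkflow(false);
		grp.addQuery('table', table);
		grp.addQuery('document', sys_id);
		grp.query();
		if (grp.next()) {
			groupId = grp.getValue('sys_id');
		}
		gs.getUser().setPreference('snh.live.group.read.authorization', null);
		return groupId;
	},

	type: 'LiveGroupProfileUtils'
};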

That did it. Now, people who are not in the group can still read the group profile record, which is good, because you need the sys_id of that record to read all of the messages in the group, which is what we are using as feedback. The only thing that I have not accommodated at this point is situations where a group profile record does not exist at all, and I have to create one.

But that’s an entirely different adventure.

Scoped Application Properties, Part II

“Any daily work task that takes 5 minutes will cost over 20 hours a year, or over half of a work week. Even if it takes 20 hours to automate that daily 5 minute task, the automation will break even in a year.”
Breaking into Information Security: Crafting a Custom Career Path to Get the Job You Really Want by Josh More, Anthony J Stieber, & Chris Liu

So far, we have created the UI Action to produce our Setup Properties button, configured the conditions under which the button would appear, and built the Business Rule to ensure that all properties created for an application are placed in a System Properties Category of the same name. Before we jump into the business of building out all of the things that we want to automate through the push of that button, there are just a couple more little odds and ends that we will want to take care of first. One thing that will be helpful in maintaining System Properties for the application will be to have the properties listed out as a Related List. That’s accomplished fairly easily by selecting Configure -> Related Lists from the hamburger menu on the main Application configuration page:

Configuring Related Lists on the main Application page

This will bring up the Available and Selected Related Lists for an Application, and you just need to find the System Property->Application entry in the Available bucket, highlight it, and then use the right arrow button to push it over into the Selected bucket. Oh, and don’t forget to click on that Save button once you’re done.

Activating the System Properties Related List

One other little thing that would be handy would be a link to the property admin page somewhere on the main Application configuration page. Even though our setup automation will be creating a link to that page somewhere on the main navigation menu, it would still be handy to have a link to that same page right here where we are configuring the app. The format of the URL for that link in both instances is the address of the page, plus a couple of parameters, one for the page title and the other for the name of the Category:

/system_properties_ui.do?sysparm_title=<appName> Properties&sysparm_category=<appName>
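
For example, for a hypothetical application named My Custom App, the link would come out looking like this once the spaces are encoded:

/system_properties_ui.do?sysparm_title=My%20Custom%20App%20Properties&sysparm_category=My%20Custom%20App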

Building a link to that page on the main Application configuration page can be easily accomplished with another UI Action similar to the first one that we built for our button, but this time we will select the Form link checkbox instead of the Form button checkbox. The code itself is just constructing the URL and then navigating to that location:

function openPage() {
	var appname = document.getElementById('sys_app.name').value;
	var url = '/system_properties_ui.do?sysparm_title=' + encodeURIComponent(appname) + '%20Properties&sysparm_category=' + encodeURIComponent(appname);
	window.location.href = url;
}

There may be more elegant ways to do that, but this works, so that’s good enough for now. This link should show after the Application Properties have been configured, so the display rules are basically the opposite of those for our button: the button will show until you use it and the link will only show after you use the button. We can pretty much steal the condition from the other UI Action, then, and just flip the last portion that checks for the presence of the Category:

gs.getCurrentApplicationId() == current.sys_id && (new GlideRecord('sys_properties_category').get('name',current.name)) && (new GlideRecord('sys_properties_category_m2m').get('category.name',current.name))

Those of you who are paying close attention will notice that we also add yet one more check to our condition, and that was just to make sure that there was at least one System Property defined for the app. There is no point in bringing up the property value maintenance page if there aren’t any properties. Assuming that you have set up application properties and you have defined at least one property, you should see the new link appear down at the bottom of the main Application configuration page:

New link to the property value maintenance page

That should take care of all of those other little odds and ends … now all that is left is to build out the code that will handle all of those initial setup tasks. Last time, I said that we would take care of that here, but as it turns out, that was a lie. We’ll have to take care of that next time.

Scoped Application Properties

“I have been impressed with the urgency of doing. Knowing is not enough; we must apply. Being willing is not enough; we must do.”
Leonardo da Vinci

Many times when I build a scoped application in ServiceNow I will end up creating one or more System Properties that are specific to the application. To allow application administrators to maintain the values for those properties, I will usually create a menu item for that purpose, either under the application’s menu section if there is one, or under the general System Properties menu item if there is not. To set all of that up requires a certain amount of work, and being a person who likes to avoid work whenever possible, I decided that it might be worth looking into possible ways to automate all of that work so that I could accomplish all of those tasks with minimal effort. It’s not that I’m afraid of work — I can lay right down next to a huge pile of work and sleep like a babe — it’s just that if there is a way to automate something to save myself some of that time and effort, I’m all in.

So what exactly is it that I seem to keep doing over and over again whenever I start out on a new app? I guess the main thing is the menu option that brings up the System Properties related to the application. If the application happens to have a menu section of its own, then I usually add this down at the bottom of that section and label it Application Properties. For applications that do not have a menu section of their own, I will use the name of the application (plus the word Properties) for the label, and stick it in the existing System Properties section of the menu. Regardless of where the menu item lives, it will point to the stock ServiceNow System Properties value entry page, which is not driven by Scope, but by Category. In order to bring up just the properties for this particular application, I also need to create a Category, usually using the name of the application as the category name, and then put all of application’s properties in that Category. Of course, the easiest way to put all of an application’s properties into a specific category is to create a Business Rule that does just that whenever a new property is created. Now, you don’t want just anyone to be tinkering with these important properties, so yet another thing that I always end up having to create is a Role that can be used to secure access to the properties.

None of these things are complicated, but it all takes time, and the time adds up. If you could do it all with just the push of a button, that would be pretty sweet. So what would we need to do that? Well, right off the bat, I guess the first thing that you would need would be that button. Let’s create a UI Action tied to the application form. It doesn’t have to do anything at this point; we just want to get the button out there to start things off.

UI Action to Set Up Application Property Support

Open up the New UI Action form by clicking on the New button at the top of the UI Action list, and enter Setup Properties in both the Name and Action Name fields. Check the Form button checkbox and the Active checkbox and that should be enough to save the record and create the UI Action. Pull up any Scoped Application at this point, and you should see the new button at the top of the screen.

Setup Properties button on Custom Application page

You really only want to see that button on apps that haven’t already been set up for application-level properties. Once you push that button and everything gets set up, the button should go away and never be seen or heard from again. Pushing the button will initiate a number of different things, so there will be a number of potential things that you could check to see if its work has already been accomplished. One of the easiest is probably the presence of a System Properties Category with the same name as the application. To configure the UI Action to only appear if such a category does not yet exist, you can add this line to the Condition property of the UI Action:

gs.getCurrentApplicationId() == current.sys_id && !(new GlideRecord('sys_properties_category').get('name',current.name))

The above line also ensures that you are in the right scope to be editing the app, as you wouldn’t want anyone setting everything up under the wrong scope. So now we have the button and we have it set up so that it only shows up when it is needed. We also have to figure out what, exactly, is going to happen when we click on the button, but that’s a little more complicated, so let’s just set that aside for now. A considerably easier task would be to create that Business Rule that makes sure that all System Properties created for this app get placed into the Category of the same name. That one is pretty straightforward.

(function executeRule(current, previous) {
	var sc = new GlideRecord('sys_scope');
	sc.get(current.sys_scope);  
	if (sc.name != "Global" && sc.name != "global") {
		var category = new GlideRecord('sys_properties_category');
		if (category.get('name', sc.name)) {
			var prop = new GlideRecord('sys_properties');
			prop.addQuery('sys_scope', current.sys_scope);
			prop.orderBy('suffix');
			prop.query();
			while (prop.next()) {
			    var propertyCategory = new GlideRecord('sys_properties_category_m2m');	
				propertyCategory.addQuery('property', prop.sys_id);
				propertyCategory.addQuery('category', category.sys_id);
				propertyCategory.query();
				if (!propertyCategory.next()) {
					propertyCategory.initialize();
					propertyCategory.category = category.sys_id;
					propertyCategory.application = current.sys_scope;
					propertyCategory.order = 100;
					propertyCategory.property = prop.sys_id;
					propertyCategory.insert();	
				}
			}
		}
	}
})(current, previous);

This is a fairly simple script that implements fairly simple rules: get the scope of the current property, make sure it isn’t global, see if there is a Category on file with the same name as the Scope, and if so, make sure that every property in that scope, including the current one, is assigned to that Category. That’s it. Once this is active, whenever you create a non-global System Property, it will automatically be assigned to the Category created for that application.

This is probably a good place to stop for now. Next time out, we will get into the details of just what happens when you push that new button we created.