Periodic Review, Part V

“I find that the best way to do things is to constantly move forward and to never doubt anything and keep moving forward.”
John Frusciante

Last time, we wrapped up the initial testing of our first table and related form, so now it is time to move on to the other tables on our list. The next table that we will want to deal with is the table that contains the details for each run of the review process, which we will call Review Execution.

New Review Execution table

The first field that we will want to have is a reference back to the associated configuration record. We will call that one configuration. Then we will need short and long description fields, the run date, start and end times, some counters, a current state, and a completion code.

Fields for the new Review Execution table

Once we save all of the fields, we can pull up the form and arrange everything to suit our needs.

Form layout for the new Review Execution table

One last thing that we will want to do with this table is to set up a couple of state values, one for running and one for completed, and set the default value to running so that whenever a new record is created, it is automatically set to the running state.

Choices and default for the state field
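
Just to illustrate how that default will come into play, here is a rough sketch of what creating one of these execution records in script might eventually look like; the table and field names below are placeholders based on the fields above, not the actual generated names:

// create a new execution record for this run of the review process (names are placeholders)
var executionGR = new GlideRecord('x_<scope>_review_execution');
executionGR.configuration = configurationId;
executionGR.run_date = new GlideDate();
executionGR.insert();
// no need to set the state; the dictionary default leaves the new record in the 'running' state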

That should take care of the Review Execution table. Now we will need to build tables for every notice that comes out of the review process and every item that appears on each notice. Let’s start with the notice table first, which we can call Review Notice.

Review Notice table

This table will have several reference fields, the first one being a link back to the associated execution. Other references will link to the email that was sent out, the recipient of the email, and the user who responded to the notice.

Fields for the new Review Notice table

With that completed, we can then lay out the form in the way that we would like it to appear.

Form layout for the Review Notice table

The last table that we will need to build before we can start looking at the actual process of sending out the notices is the table of items associated with each notice. We will call that one Review Notice Item.

Review Notice Item table

This one will contain reference fields as well, including a reference to the associated notice, but the reference to the actual item being reviewed will be a little bit different. Because we are setting this up as a generic process that can review virtually any item, for the link to the item we will be using a Document ID field for the reference. More on that a little later, but for now, here are the fields for this table.

Fields for the new Review Notice Item table

Both Reference type fields and Document ID type fields contain sys_ids, but on a Reference field, the table containing the record with that sys_id is defined as part of the field definition. On a Document ID field, the record could potentially be on any table in the system, so you need a second, dependent field of type Table to specify which table contains the referenced record. Fortunately for our purposes, that dependent field does not need to be on the same table as the one that contains the Document ID. The Table field that we need is actually part of the Review Configuration record, where we define which items are to be reviewed. To set that up, we need to go into the field definition and select that field in the Dependent field section of the form.

Document ID dependent field configuration
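
To make that relationship a little more concrete, here is a rough sketch of how a script might populate one of these notice item records once we get to building the actual process; again, the table and column names are just placeholders for illustration:

// 'itemGR' represents a record from whatever table the Review Configuration points to
var noticeItemGR = new GlideRecord('x_<scope>_review_notice_item'); // placeholder name
noticeItemGR.notice = noticeId;                     // reference to the parent Review Notice
noticeItemGR.document_id = itemGR.getUniqueValue(); // just the sys_id of the item under review;
noticeItemGR.insert();                              // the table comes from the configuration's Table field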

With that out of the way, we can once again lay out the fields as we like on the associated form.

Form Layout for the new Review Notice Item table

That should be all we need to get started on building the actual process that performs the review. There will eventually be one or more tables yet to define, but we can save that until they are needed. For now, let’s set that aside so that next time we can start working on the actual review process itself.

Periodic Review, Part III

“We conquer by continuing.”
George Matheson

Last time, we created our app, added a few properties, and built our first table. Today we are going to work with that table to configure the layout of both the list and the form, and then maybe do a few other things before we move on to the rest of the tables. Let’s start with the list.

To edit the fields that will show up on the list view, we can bring up the list view and then select Configure -> List Layout from the context menu.

Configuring the list layout

Using the slush bucket, we can select Number, Short description, Item label and Description from the available fields on the table.

Selecting the fields to appear on the list view

That will give us a list view that looks like this.

Newly configured list view

Using basically the same method, we can arrange the fields on the form view.

Configuring the form layout

The form is a bit more complicated than the list, so to help organize things, we can divide the form into sections. After we lay out the main section of the form, we can scroll down to the Section list and click on the New… option, which brings up a small dialog box where we can give our new section a name.

Creating a new section on the form

Once we have created the new section, we can drag in and arrange all of the fields that we would like to see in that section of the form.

Populating the fields in our new section

Once that has been completed, we can take a look at our new form.

The new form layout

We are still not quite done with the form just yet, though. All of the columns that reference fields on the selected table should have a selection list that is limited to just the fields on that table. To accommodate that, we need to pull up the dictionary record for each of those fields and set up a dependency. To do that, right click on the field label and select Configure Dictionary from the resulting context menu.

Editing the dictionary record from the form

Using the Advanced view, go into the Dependent Field tab and check the Use dependent field checkbox and select Table from the list of fields.

Setting up the dependent field

This process will need to be repeated for all of the columns that represent fields on the configured table.

The last thing that we need to add to this form, at least for now, is the ability to test the Filter against the specified table. It would probably be more user-friendly if our Filter field were some kind of query builder, but since it is just a simple String field, the least we can do is to provide some mechanism to test out the query string once it has been entered. The easiest way to do that would be to create a UI Action called Test Filter that uses the Table and Filter fields to branch to the List view of that table. Building a link to the List view in script would look something like this:

current.table + '_list.do?sysparm_query=' + encodeURIComponent(current.filter)

Branching to that page in a UI Action script would then just be this:

action.setRedirectURL(current.table + '_list.do?sysparm_query=' + encodeURIComponent(current.filter));
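
Wrapped up with a little guard against an empty filter, the entire script for the action might end up looking something like this:

if (current.getValue('filter')) {
	action.setRedirectURL(current.table + '_list.do?sysparm_query=' + encodeURIComponent(current.filter));
} else {
	gs.addErrorMessage('Please enter a Filter value before attempting to test it');
}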

Clicking on the button would then take you to the list where you could see what records would be selected using that filter. To create the UI Action, we can use the context menu on the form and select Configure -> UI Actions and then click on the New button to create a new UI Action for that form.

Creating a new UI Action to test out the entered query filter

Once our action has been configured and saved, the button should appear at the top of the form.

New form button from new UI Action configuration

That should just be about it for our first table and all of the associated fields, forms, and views. Next time, we can use our Service Account Management app as a potential first user of this app and see if we can set up the configuration for that app before we move on to creating other tables.

Collaboration Store, Part LIX

“When something you make doesn’t work, it didn’t work, not you. You, you work. You keep trying.”
Zach Klein

Last time, we created a couple of new shared functions to send over a logo image and associate that image with its base record. Unfortunately, the function that sends over the image file doesn’t actually work. Yes, it creates an attachment record on the target system, and yes, that attachment gets linked to its base record, but the image itself does not come across correctly, and the resulting file is not a valid image file. Yes, I should have tested that before I stuck the code out there, but it all seemed as if it should work, so I just threw it out there without first giving it a try.

I tried a few things to get it to go, but none of them did the trick. I went back to the getContent method instead of getContentBase64, but that didn’t work, so I tried getContentStream, but that didn’t do it, either. Then I tried adding a Content-Transfer-Encoding: base64 header, but that didn’t help, no matter what method I used to snag the content. So, it’s back to the drawing board on that one to see if we can’t figure out how to get that working correctly.
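
For the record, the basic shape of what I was attempting looked roughly like this; the variable names are simplified stand-ins for the actual function arguments, and as noted above, none of the variations on grabbing the content produced a usable image on the other end:

// grab the attachment content and POST it to the target instance's Attachment API
var sysAttachment = new GlideSysAttachment();
var content = sysAttachment.getContentBase64(attachmentGR); // also tried getContent and getContentStream
var request = new sn_ws.RESTMessageV2();
request.setEndpoint(targetInstanceURL + '/api/now/attachment/file?table_name=' + tableName +
	'&table_sys_id=' + targetSysId + '&file_name=' + encodeURIComponent(attachmentGR.getValue('file_name')));
request.setHttpMethod('post');
request.setBasicAuth(userName, password);
request.setRequestHeader('Content-Type', attachmentGR.getValue('content_type'));
request.setRequestHeader('Content-Transfer-Encoding', 'base64'); // one of the things that did not help
request.setRequestBody(content);
var response = request.execute();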

In the meantime, I decided to start logging all of this REST API activity so that I would have some record of what’s been happening between the instances. I have long thought that there should be some form of activity log tracking all of the important things going on with the records, and I even built a table for that early on, but that table was never used. This time, though, I was looking for something specific to the REST API activity, which has a number of specific data points. So, I created a new table called REST API Log to start tracking every request and response.

New REST API Log table

Then I added the following function to create records in this new table.

logRESTCall: function (targetGR, result, payload) {
	// create a new REST API Log record capturing both the request and the response
	var logGR = new GlideRecord('x_11556_col_store_rest_api_log');
	logGR.instance = targetGR.getUniqueValue();
	logGR.url = result.url;
	logGR.method = result.method;
	if (payload) {
		logGR.request_body = JSON.stringify(payload, null, '\t');
	}
	logGR.response_code = result.status;
	if (result.obj) {
		// if the response body parsed successfully, store the formatted version
		logGR.response_body = JSON.stringify(result.obj, null, '\t');
	} else {
		logGR.response_body = result.body;
	}
	logGR.error = result.error;
	logGR.error_code = result.error_code;
	logGR.error_message = result.error_message;
	logGR.parse_error = result.parse_error;
	logGR.insert();
}

Then, at the end of each common REST API function, I added this line right before the final return statement:

this.logRESTCall(targetGR, result, payload);
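
To give you an idea of where that call lands, here is a hypothetical stand-in shaped like those common functions; the real ones vary in their details, but they all assemble a result object along these lines and now log it just before returning:

sendMessage: function (targetGR, url, method, payload) {
	var result = {url: url, method: method};
	var request = new sn_ws.RESTMessageV2();
	request.setEndpoint(url);
	request.setHttpMethod(method);
	if (payload) {
		request.setRequestBody(JSON.stringify(payload));
	}
	var response = request.execute();
	result.status = response.getStatusCode();
	result.body = response.getBody();
	result.error = response.haveError();
	if (result.error) {
		result.error_code = response.getErrorCode();
		result.error_message = response.getErrorMessage();
	}
	try {
		result.obj = JSON.parse(result.body);
	} catch (e) {
		result.parse_error = e.toString();
	}
	this.logRESTCall(targetGR, result, payload);
	return result;
}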

Now, not every REST API call in the system uses these common functions, but my intent is to go back and correct that wherever appropriate, so eventually that should cover most of them, and then I can see what I need to do with the rest to get that activity logged as well. But it’s a start, anyway.

So now I have to get busy figuring out how to get my logo image over to another instance successfully. I’m sure that there is a way to do that; I just haven’t figured it out yet. Hopefully, we can explain how that is done next time out.

Collaboration Store, Part XX

“It’s always too soon to quit!”
Norman Vincent Peale

Now that we have the needed global Script Include out of the way, we can turn our attention to building the Publish to Collaboration Store UI Action. Before we do that, though, we are going to need a couple more tables, one for the published applications, and another for the application versions. I won’t waste a lot of time here on the process of building a new table; that’s pretty basic Now Platform stuff that you can find just about anywhere. I also did not spend a lot of time creating the tables. At this point, they are very basic and just enough to get things going. I am sure at some point I will be coming back around and adding more valuable fields, but for right now, I just included the bare necessities to create this initial publication process.

Here is the form for the base application table:

Base Member Application table input form

… and here is the form for the application versions:

Member Application Version table input form

There is an obvious one-to-many relationship between the application record and all of the various version records for an application. In practice, all of the versions would be listed under an application as a Related List on the application form. We can show examples of that once we build up some data.

Now that that is done, we can get back to our UI Action. Since our action is going to be very similar to the stock Publish to Update Set… action, the easiest way to create ours is to pull up the stock action and use the context menu to select Insert and Stay to clone the action.

Select Insert and Stay from the context menu to clone the UI Action

Now we just have to rename the action and update the script, which actually turned out to be more modifications than I had anticipated, primarily because our action is not in the global scope. First, we had to rename the function to avoid a conflict with the original action, with which we will be sharing the page. Also, the gel function and the window object are not available to scoped actions, and of course, we needed to point to our own UI Page instead of the original and put in our own title. Here is the modified script:

function publishToCollaborationStore() {
	var sysId = g_form.getUniqueValue();
	var dialogClass = (typeof GlideModal != 'undefined') ? GlideModal : GlideDialogWindow;
	var dd = new dialogClass("x_11556_col_store_publish_app_dialog_cs");
	dd.setTitle(new GwtMessage().getMessage('Publish to Collaboration Store'));
	dd.setPreference('sysparm_sys_id', sysId);
	dd.setWidth(500);
	dd.render();
}

I also removed all of the conditionals for now, since some of that code was not available to a scoped UI Action, and we will end up having our own conditions, anyway, with which we can deal later on. Right now, I am still just focused on seeing if I can get all of this to work.

The next thing that we need to do is to build our own UI Page, on which we can also get a head start by cloning the original using the same Insert and Stay method. The modifications here are a little more extensive, as we want to do much more than just create an Update Set. One thing that we will want to do is check to make sure that the version number is not one that has already been published. To support that, I added some code at the top to pull all of the current versions so that I can stick them into a hidden field.

var existingVersions = '';
var mbrAppId = 'new';
var appSysId = jelly.sysparm_sys_id;
var appGR = new GlideRecord('sys_app');
appGR.get(appSysId);
var appName = appGR.getDisplayValue();
var appVersion = appGR.version || '';
var appDescription = appGR.short_description;
var mbrAppGR = new GlideRecord('x_11556_col_store_member_application');
if (mbrAppGR.get('application', appSysId)) {
	mbrAppId = mbrAppGR.getUniqueValue();
	var versionGR = new GlideRecord('x_11556_col_store_member_application_version');
	versionGR.addQuery('member_application', mbrAppId);
	versionGR.query();
	while (versionGR.next()) {
		existingVersions += '|' + versionGR.getDisplayValue('version');
	}
	existingVersions += '|';
}

To pass this value to the client side for validation, I added a hidden field to the HTML for the page:

<input id="existing_versions" type="hidden" value="$[existingVersions]"/>

To check the validity on the client side, I retained the original version validity check and then added my own check of the existing versions right underneath using the same alert approach to delivering the bad news:

this.existingVersions = gel('existing_versions').value;
...
var vd = g_form.validators.version;
if (!vd)
	console.log("WARNING.  Cannot find validator for version field in parent form.");
else {
	var answer = vd.call(g_form, this.versionField.value);
	if (answer != true) {
		alert(answer);
		return null;
	}
}
if (this.existingVersions.indexOf('|' + this.versionField.value + '|') != -1) {
	alert('Version ' + this.versionField.value + ' has already been published to the Collaboration Store');
	return null;
}

The last thing that we have to change is what happens once the Update Set has been created. In the original version that we cloned, once the Update Set has been produced, the operator is then taken to the Update Set form to see the newly created Update Set. We don’t want to do that, as we still have much work to do, including the following:

  • Update or Create an Application record
  • Create a new Version record linked to the Application record
  • Convert the Update Set to XML
  • Attach the XML to the Version Record
  • Send all of the updated/created records over to the Host instance, including the attached XML

We will need some kind of vehicle in which to perform all of these tasks and keep the operator updated as to the progress and the success or failure of each of the steps. That sounds like quite a bit of work, so I think that will be a good topic for our next installment.

Collaboration Store, Part II

“Tell me and I forget. Teach me and I remember. Involve me and I learn.”
Benjamin Franklin

Now that I have gone and thrown the idea out there for all to see, it’s time to get to work and see if I can actually pull this off. To begin, I need to set up the Scoped Application, which is basically a repeat of what I went through to set up the Scoped Application for my little Webhooks project, so there is no need to repeat all of that here. Here is how it came out:

Initial Collaboration Store Scoped Application

With that out of the way, the next order of business is to create that first table in which to store all of the instances. Again, that is pretty standard stuff and not really worthy of a step-by-step walk-through of the process, but here is the associated form, which will give you an idea of the columns that I have selected at this point in the process:

Member Organization table input form

Now we have a place to store the information on the participating instances, so it’s time to build the initial set-up process that will populate this table. Before we dive into that, though, I should mention that when I set up the application, I also set up a few System Properties using the UI Action that I created for that purpose a couple of years ago.

UI Action to set up System Properties for a Scoped Application

That’s been a handy little tool that does a number of things under the hood, but we don’t need to get into all of that here. If you are interested in that for any reason, you can grab an Update Set from here. For this phase of the project, I came up with three properties that I think will be needed in order to do what I would like to do. That may change over time as I get more into the weeds, but for now, here is the list:

Initial System Properties for the Collaboration Store application

That should give us all of the artifacts that we should need to start working on the initial set-up process. As you might have noticed in the above screen shot, I have already created a menu item to launch the initial set-up process from the navigation side bar. Right now, you can also see some of the other menu options for the app, but sometime before things are ready for prime time, my plan is to make all other menu items inactive, and then once the set-up process has been completed, a final step in the process would activate all of the others and inactivate the set-up menu item. For now, though, you can see everything, and it will probably be that way for some time until we get much closer to the end of things.
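
When that time comes, the final step will probably amount to little more than a quick pass through the application’s modules, something along these lines (just a sketch of the idea; the 'Set-up' title and the menu sys_id are placeholders):

// flip all of the application's menu items on, except for the set-up item itself
var moduleGR = new GlideRecord('sys_app_module');
moduleGR.addQuery('application', appMenuSysId); // sys_id of the application menu record
moduleGR.query();
while (moduleGR.next()) {
	moduleGR.active = (moduleGR.getDisplayValue('title') != 'Set-up');
	moduleGR.update();
}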

As for the set-up process itself, there are a number of different ways to go here. I could build something in the main UI, where the primary technology is Apache Jelly. I could also build a Service Portal widget, where the primary technology is AngularJS. Both of those are considered Old School at this point, though, and all of the cool kids are now using the Now® Experience UI Framework and the ui-component extension for application development. While that seems like the appropriate way to go, my personal skill set does not yet include mastery of that particular technology, and I don’t really feel like this project would be a good place to address that particular shortcoming in my technical expertise. Since the initial set-up is just one small part of this effort, I am going to take the easy way out and just build a simple widget.

Building a brand new widget starting with a blank canvas is a little bit of a project, though, so that seems like something worthy of its own dedicated installment. Rather than start on that here, let’s take that up next time out.

Fun with Webhooks, Part VII

“The question isn’t who is going to let me; it’s who is going to stop me.”
Ayn Rand

Now that we have proven that the essential elements of our Subflow all work as intended, it’s time to finish out the remainder of the flow’s activities, which include logging the HTTP POST and Response details as well as reporting any undesired results to Event Management. Let’s start with logging the activity.

The first thing that we will need in order to record our Webhook POSTs is a table in which to store the data. As we did with our original Webhook Registry table, we can navigate to System Definition -> Tables and click on the New button to bring up the Table definition screen.

New Webhook Log table

And once again we will give it a Label and let the system generate the associated Name. We will also want to uncheck the Create module checkbox again to prevent the generation of a number of artifacts for which we have no use. Once the table has been defined, we can start adding fields, and the first field that we will want to add is a Reference to the Webhook Registry table. Every log record will be linked to the registry for which the activity was POSTed, so we will want to establish that relationship with a Reference field that we can label Registry.

The other Reference field that we will want is a link back to the original Incident that is the subject of the POST. Since we set things up in a way that would allow us to support tables other than Incident, we will want to do this with a Document ID field rather than a direct reference to the Incident table. This time, when we configure the Dependent Field, we can dot walk through the registry reference to get to the table name column in that related table. This saves us from needing a column on the log table for the name of the table that holds the record associated with the Document ID, and it ensures that all of the Document IDs related to each Registry come only from the table associated with that Registry.

Selecting the Registry record’s Table column as the Dependent Field
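
One nice side effect of setting it up this way is that any script that needs to chase down the original record can get everything that it needs from the log record and its Registry; a quick sketch (with guessed-at column names) would look something like this:

var logGR = new GlideRecord('x_<scope>_webhook_log'); // placeholder table name
if (logGR.get(logSysId)) {
	// dot walk through the Registry reference to find out which table the Document ID points to
	var sourceGR = new GlideRecord(logGR.registry.table.toString());
	if (sourceGR.get(logGR.getValue('document_id'))) {
		gs.info('This POST was triggered by ' + sourceGR.getDisplayValue());
	}
}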

The rest of the fields in the log table are just what we sent over, and what we received in response:

  • Payload – The data that we will be POSTing
  • URL – The URL to which we will be POSTing our Payload
  • Status – The HTTP Response Code returned by the target server
  • Body – The body of the message returned by the target server
  • Error – The error flag
  • Error Code – The error code
  • Error Message – The error message
  • Parse Error – Any error that occurred while parsing the body of the response

After defining all of the fields on the table, I brought up the table’s form and arranged all of the fields on the screen in a manner that I thought was most appropriate.

Webhook Log form layout

Those of you who are paying close attention will also have noticed that I added the JSON View Dictionary Attribute to both the Payload and Body fields, just to make reading the JSON content a little easier.

Now that we have a table, we can start putting records into it. We will do this in our Subflow, right after we POST the payload. This is just a simple, out-of-the-box Create Record action that we can configure using data pills from various other steps.

Logging the Webhook POST and Response

In addition to capturing everything related to each POST, the other thing that we wanted to do was to capture any issues that might come up during this process. We are already aware of two possible issues, one being a bad response and the other being a good response, but with an unparsable response body. After we do the POST and log the result, we can throw in a few more conditionals to pick those up, and then add a Log Event Action to the flow for each.

Complete Subflow with Event logging for error conditions

Whenever we log an Event, we will want to capture as much information as we can about what went wrong. In this case, however, we have already logged everything about the transaction to the Webhook Log table, so really all we will need to provide is a way to find that record. Putting the sys_id in the additional_info field should do the trick. Here is how I populated all of the data for the Event:

Event Log data
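
For those who would rather see that in script form, the rough equivalent of what the Log Event action is doing would be an insert into the em_event table along these lines; the source, type, and severity values shown here are illustrative, and the actual values come from the flow’s data pills:

var eventGR = new GlideRecord('em_event');
eventGR.source = 'Simple Webhook';
eventGR.type = 'Webhook POST failure';
eventGR.severity = 3;
eventGR.description = 'Webhook POST did not return a successful response';
eventGR.additional_info = JSON.stringify({webhook_log: logGR.getUniqueValue()}); // sys_id of the log record
eventGR.insert();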

That should complete the Subflow, at least for now. We may end up adding some other features in the future, but for now, this accomplishes everything that we set out to do. We still need to do a lot more testing to verify that all of these various branches in the tree work out as we are hoping, but the building part should be done now, at least for this portion.

As far as the remaining development goes, we still have to build out the My Webhooks portal widget and we also need to go back into the Script Include and add support for Basic Authentication. We also need to add code, and possibly additional fields in the registry record, for any other authentication protocols that we would like to support. So once again we find ourselves at a crossroads: we can either jump into the Service Portal world and start working on our widget, or we can turn our attention to authentication and finish things up in that area. There is no need to make any decision on that today, though. We’ll figure all of that out when we meet again.

Fun with Webhooks, Part II

“Well done is better than well said.”
Benjamin Franklin

Now that we have the idea fairly well formulated, it’s time to get to work. The first thing that we need to do is create our Scoped Application, which you can do by navigating to My Company Applications and then clicking on the Create new button in the upper right-hand corner. I still use the classic UI rather than the Studio, so you may approach this task a little differently, but the end result should be the same.

New Simple Webhook Application

At this point, I have just created the most basic of empty shells. There are no modules or tables or roles or any other artifacts … it’s basically just the bare application record itself. Creating the app does create a scope, and also puts you into that scope, so as long as you don’t change that, everything that you do from this point on will be built in that scope. The first thing that we will want to build is the database table for our registry, which we can do by navigating to System Definition -> Tables and clicking on the New button.

New Webhook Registry table

Once you give it a Label it will generate the appropriate name, and the next thing that you are going to want to do is to uncheck the Create module option, as we only want the table at this point and we don’t need all of those other artifacts generated. We want to give our registrations an ID using the platform’s built-in tools, so we will want to create a field labeled Number of type String and put the following in the Default value:

javascript:getNextObjNumberPadded();

We have a number of other columns to define, but let’s save this for now and go set up our auto-numbering before we forget. To do that, navigate to System Definition -> Number Maintenance and click on the New button. Select our new table from the list and then set the Prefix to WHR for Webhook Registry.

Setting up auto-numbering for our new Webhook Registry table

With that out of the way, we can go back to our table and add the rest of the fields. Here are the ones that I added to get things started:

  • Active – True/False, with default of True
  • Table – Table Name, with default of incident
  • Owner – Reference to the sys_user table
  • Type – String, with four initial choices (Single Item, Caller / Requester, Assignment Group, and Assignee)
  • URL – URL
  • Authentication – String, with two initial choices (None and Basic)
  • User Name – String
  • Password – Password
  • Document ID – Document ID
  • Person – Reference to the sys_user table
  • Group – Reference to the sys_user_group table

We may add a few more later on as we add new features, but this will get us by for now. Even though my intent is to focus solely on the Incident table for now, I went ahead and added the Table column and just defaulted it to Incident. That was mainly so that I could use it as the Dependent field on the Document ID field. If you have never worked with Document ID fields before, you can just think of them as a special form of Reference field where you don’t have to specify the table that is being referenced. Instead, you point to another field on the record that contains the name of the table. This way, one of your records can reference one table and another record in that same table can reference a different table, all based on the table specified in the Dependent field. To set up the Dependent field, use the Dependent Field tab on the Dictionary Entry for the Document ID field.

Document ID Dependent field specification

After entering all of the fields, I used the Show form link at the bottom of the page to bring up the form for the new table. Using the Configure -> Form Layout option, I rearranged the fields on the screen to my liking, and then removed the Table field completely, as that will default to Incident for now and we don’t want anyone changing it to anything else at this point.

After getting everything laid out just right, the next thing that I did was to add a few UI Policies to control certain fields based on the values in other fields. For example, I hide the User Name and Password fields unless you set the Authentication to Basic. Similarly, the presence of the Document ID, Group, and Person fields depends on the value of the Type field. Basically, this just hides fields that are not needed and reveals them when they are.

The other things that I wanted to do on this form was to give the user the ability to test their URL. That seemed like a good use for a new UI Action, but rather than putting all of the code for that process in the Action itself, I decided to start a Script Include to house these types of utilitarian functions. Putting all of that together seems like a good exercise for our next installment in this series, so this looks like a good stopping point for now.