Collaboration Store, Part LXXXII

“It’s really complex to make something simple.”
Jack Dorsey

Last time, we wrapped up an initial version of the storefront and released a new Update Set in the hopes that a few folks out there would pull it down and take it for a spin. While we wait patiently to hear back from anyone who might have been willing to download the new version and run it through its paces, let’s see if we can’t add a little more functionality to our shopping experience. The page that we laid out had a place for two widgets, but so far we have only completed the primary display widget. The other, smaller portion of the screen was reserved for some kind of search/filter widget to help narrow down the results for installations where a large number of applications have been shared with the community. Adding that second widget to the page would give us something like this:

Storefront with new search/filter widget added

Rather than have the two widgets speak directly to one another, my thought was that we could use the same technique that was used with the Configurable Data Table Widget Content Selector bundled with the SNH Data Table Widgets. Instead of having one widget broadcast messages and the other widget listen for those messages, the Content Selector communicates with the Data Table widget via the URL. Whenever a selection is made, a URL is constructed from the selections, and then the Content Selector branches to that URL where both the Content Selector and the Data Table widget pull their control information from the shared URL query parameters. We can do exactly the same thing here for the same reasons.

For this initial attempt, I laid out a generic search box and a number of checkboxes for various states of applications on the local instance. To support that, we can use the following URL parameters: search, local, current, upgrade, and notins. In the widget, we can also use the same names for the variables used to store the information, and we can populate the variables from the URL. In fact, that turns out to be the entirety of the server-side code.

(function() {
	data.search = $sp.getParameter('search') ? decodeURIComponent($sp.getParameter('search')) : '';
	data.local = $sp.getParameter('local') == 'true';
	data.current = $sp.getParameter('current') == 'true';
	data.upgrade = $sp.getParameter('upgrade') == 'true';
	data.notins = $sp.getParameter('notins') == 'true';
})();

And here is the HTML that I put together to display the information.

<div class="panel">
  <div class="row">
    <div class="col">
      <p>&nbsp;</p>
    </div>
  </div>
  <div class="row">
    <div class="col">
      <input ng-model="c.data.search" name="search" id="search" type="text" class="form-control" placeholder="&#x1F50D;"/>
    </div>
  </div>
  <div class="row">
    <div class="col">
      <input ng-model="c.data.local" name="local" id="local" type="checkbox"/>
      <label for="local">${Local}</label>
    </div>
  </div>
  <div class="row">
    <div class="col">
      <input ng-model="c.data.current" name="current" id="current" type="checkbox"/>
      <label for="current">${Current}</label>
    </div>
  </div>
  <div class="row">
    <div class="col">
      <input ng-model="c.data.upgrade" name="upgrade" id="upgrade" type="checkbox"/>
      <label for="upgrade">${Upgrade Available}</label>
    </div>
  </div>
  <div class="row">
    <div class="col">
      <input ng-model="c.data.notins" name="new" id="new" type="checkbox"/>
      <label for="notins">${Not Installed}</label>
    </div>
  </div>
  <div class="row">
    <div class="col text-center">
      <button ng-click="search()" class="btn btn-primary" title="${Click to search the Collaboration Store}">${Search}</button>
    </div>
  </div>
</div>

To format things a little nicer, and to kick that little magnifying glass search icon over to the right, we also need to throw in some CSS.

.col {
  padding: .5vw;
}

label {
  margin-left: .5vw;
}

::placeholder {
  text-align: right;
}

There is one client-side function referenced in the HTML for the button click, and that’s pretty much all there is to the widget’s Client script.

api.controller = function($scope, $location) {
	var c = this;

	$scope.search = function() {
		var search = '?id=' + $location.search()['id'];
		if (c.data.search) {
			search += '&search=' + encodeURIComponent(c.data.search);
		}
		if (c.data.local) {
			search += '&local=true';
		}
		if (c.data.current) {
			search += '&current=true';
		}
		if (c.data.upgrade) {
			search += '&upgrade=true';
		}
		if (c.data.notins) {
			search += '&notins=true';
		}
		window.location.search = search;
	};
};

The search() function builds up a URL query string based on the operator’s input and then updates the current location with the new query string, essentially branching to the new location. When the new page loads, both the search widget and the storefront widget can then pull their information from the current URL. We can test all of this out now by saving the new widget, pulling up our collaboration_store page in the Service Portal Designer, and then dragging our new widget onto the page in the space already reserved for that purpose.

Dragging the new widget onto the existing page

With that completed, we can now try out the page and see that entering some data and clicking on the Search button actually does reload the page with a new URL. However, at this point the content of the primary page never changes because we have yet to add code to the main widget to pull in the URL parameters and then use that data to adjust the database query. That sounds like a good subject for our next installment.
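
Just to sketch out where that is headed, the main widget’s server script should be able to read those same parameters and fold them into its query, something along these lines (a rough sketch only; the table and field names are assumptions, and the four checkbox flags will need to map to whatever state logic the storefront ends up using):

(function() {
	// read the search term the same way the search widget does
	var search = $sp.getParameter('search') ? decodeURIComponent($sp.getParameter('search')) : '';
	var appGR = new GlideRecord('x_11556_col_store_member_application');
	if (search) {
		// limit the results to applications whose name or description contains the term
		var qc = appGR.addQuery('name', 'CONTAINS', search);
		qc.addOrCondition('description', 'CONTAINS', search);
	}
	// the local, current, upgrade, and notins flags would add further conditions here
	appGR.orderBy('name');
	appGR.query();
})();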

Scripted Value Columns, Enhanced, Part III

“It’s not about ideas. It’s about making ideas happen.”
Scott Belsky

Last time, we played around with the Customer column in our new example data table, and today we will jump back over to the Labor Hours column and see if we can create that mouse-over breakdown of who put in the hours. The first thing that we will need to do is to gather up the data, so we will need to add a groupBy to our GlideAggregate so that we can obtain the hours per technician.

var timeGA = new GlideAggregate('time_card');
timeGA.addQuery('task', item.sys_id);
timeGA.addAggregate('SUM', 'total');
timeGA.groupBy('user');
timeGA.orderBy('user');
timeGA.query();

Once we have the data, we will need to format it for display. There are a number of different ways that we can approach this, including straight CSS and many creative variations, but the easiest thing to do is to just use the title attribute of a span, which seems to accomplish the same thing.

Mouseover breakdown of labor hours

To create the text, we can establish some variables before we go into the while loop, and then inside the loop we can build up the text, one person’s hours at a time.

var hours = 0;
var tooltip = '';
var separator = '';
var timeGA = new GlideAggregate('time_card');
timeGA.addQuery('task', item.sys_id);
timeGA.addAggregate('SUM', 'total');
timeGA.groupBy('user');
timeGA.orderBy('user');
timeGA.query();
while (timeGA.next()) {
	var total = timeGA.getAggregate('SUM', 'total') * 1;
	hours += total;
	tooltip += separator + timeGA.getDisplayValue('user') + ' - ' + total.toFixed(2);
	separator = '\n';
}

Once we accumulate the total hours and build up the tooltip text, we can then format the HTML we want, but only when there are hours charged to the task.

if (hours > 0) {
	response = '<span style="text-align: right; width: 100%;" title="' + tooltip + '">' + hours.toFixed(2) + '</span>';
}

Putting it all together, our new function to provide right-justified total hours with a mouseover breakdown looks like this:

getScriptedValue: function(item, config) {
	var response = '';

	var hours = 0;
	var tooltip = '';
	var separator = '';
	var timeGA = new GlideAggregate('time_card');
	timeGA.addQuery('task', item.sys_id);
	timeGA.addAggregate('SUM', 'total');
	timeGA.groupBy('user');
	timeGA.orderBy('user');
	timeGA.query();
	while (timeGA.next()) {
		var total = timeGA.getAggregate('SUM', 'total') * 1;
		hours += total;
		tooltip += separator + timeGA.getDisplayValue('user') + ' - ' + total.toFixed(2);
		separator = '\n';
	}
	if (hours > 0) {
		response = '<span style="text-align: right; width: 100%;" title="' + tooltip + '">' + hours.toFixed(2) + '</span>';
	}

	return response;
}
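
For reference, hooking this function up to a data table is just a matter of adding a scripted value column specification that points at the Script Include containing it, along the lines of the svcarray entries used elsewhere in this series (the ScriptedLaborHoursProvider name here is just an assumed example; this particular function doesn’t actually look at the column name):

svcarray: [{
	name: 'labor_hours',
	label: 'Labor Hours',
	heading: 'Labor Hours',
	script: 'global.ScriptedLaborHoursProvider'
}],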

I think that is enough examples for folks to get the basic idea. Adding the ability to include HTML with a scripted value opens up a number of possibilities. We have just explored a couple of them here, but I am sure that specific requirements will drive many other variations from those willing to give it a try and see what they can do with it.

Here is an Update Set with the modifications, including this new example page. Feedback can be left here in the comments, or in the discussion area where it has been posted out on Share. If you have been able to utilize this feature for anything interesting, a screenshot would definitely be something in which folks would have an interest, so please let us all in on what you were able to accomplish.

Collaboration Store, Part LXXIV

“Mistakes should be examined, learned from, and discarded; not dwelled upon and stored.”
Tim Fargo

Last time, we attempted to solve the problem of the instance logo image not being captured during the initial set-up process. Although the modifications that we made resolved the problem for a Host instance set-up, there is still a problem with the Client instances, as there is still no code in the set-up process that sends the logo image over to the Host during registration. The periodic instance sync process won’t resolve that issue, either, as that compares what the Host has with what the Clients have, and the Host was never sent the image. We need to add some additional logic to send over the image when the Client is first registered with the Host. Here is the relevant code from the set-up widget:

if (data.instance_type == 'host') {
	csu.createUpdateWorker(mbrGR.getUniqueValue());
} else {
	var resp = csu.registerWithHost(mbrGR);
	if (resp.status == '202') {
		mbrGR.initialize();
		mbrGR.instance = input.store_info.instance;
		mbrGR.accepted = input.store_info.accepted;
		mbrGR.description = input.store_info.description;
		mbrGR.name = input.store_info.name;
		mbrGR.email = input.store_info.email;
		mbrGR.token = input.store_info.sys_id;
		mbrGR.active = true;
		mbrGR.host = true;
		mbrGR.insert();
		fixLogRecords(mbrGR);
	} else {
		mbrGR.deleteRecord();
		var errMsg = resp.error_message;
		if (resp.obj && resp.obj.result && resp.obj.result.error) {
			errMsg = resp.obj.result.error.message + ': ' + resp.obj.result.error.detail;
		}
		gs.addErrorMessage(errMsg);
		data.validationError = true;
	}
}

All we are doing here is sending over the basic information about the Client for the instance record on the Host, and then creating a record in our own instance table for the Host instance. There is no attempt to send over the associated logo image. We should be able to add a little something right after we fix the REST API log records that were created before the Host instance record was built.

if (logoId) {
	var attachmentGR = new GlideRecord('sys_attachment');
	if (attachmentGR.get(logoId)) {
		csu.pushImageAttachment(attachmentGR, mbrGR, 'x_11556_col_store_member_organization', resp.obj.result.info.sys_id);
	}
}

There are a couple of issues with this code, however. The first problem is that we are reusing the instance GlideRecord for both the Client instance as well as the Host instance, so if we want the logo sys_id from the Client instance, we need to grab that and save it before we initialize the record and start building the record for the Host instance.

var logoId = mbrGR.getValue('logo');
mbrGR.initialize();
mbrGR.instance = input.store_info.instance;
mbrGR.accepted = input.store_info.accepted;
mbrGR.description = input.store_info.description;
mbrGR.name = input.store_info.name;
mbrGR.email = input.store_info.email;
mbrGR.token = input.store_info.sys_id;
mbrGR.active = true;
mbrGR.host = true;
mbrGR.insert();
fixLogRecords(mbrGR);
if (logoId) {
	var attachmentGR = new GlideRecord('sys_attachment');
	if (attachmentGR.get(logoId)) {
		csu.pushImageAttachment(attachmentGR, mbrGR, 'x_11556_col_store_member_organization', resp.obj.result.info.sys_id);
	}
}

The other problem is that we need the sys_id of the Client instance record on the Host system, and the current registration process does not send back the sys_id of the instance record created during registration. We have to have that so that we can attach the logo image to that record, so we will need to go into the function that processes the registration request and add that data point to the response.

processRegistrationRequest: function(data) {
	var result = {body: {error: {}, status: 'failure'}};

	var mbrGR = new GlideRecord('x_11556_col_store_member_organization');
	if (mbrGR.get('instance', data.instance)) {
		result.status = 400;
		result.body.error.message = 'Duplicate registration error';
		result.body.error.detail = 'This instance has already been registered with this store.';
	} else {
		mbrGR.initialize();
		mbrGR.name = data.name;
		mbrGR.instance = data.instance;
		mbrGR.email = data.email;
		mbrGR.description = data.description;
		mbrGR.token = data.sys_id;
		mbrGR.active = true;
		mbrGR.host = false;
		mbrGR.accepted = new GlideDateTime();
		if (mbrGR.insert()) {
			result.status = 202;
			delete result.body.error;
			result.body.info = {};
			result.body.info.message = 'Registration complete';
			result.body.info.detail = 'This instance has been successfully registered with this store.';
			result.body.info.sys_id = mbrGR.getUniqueValue();
			result.body.status = 'success';
			sn_fd.FlowAPI.startSubflow('New_Collaboration_Store_Instance', {new_instance: data.instance});
		} else {
			result.status = 500;
			result.body.error.message = 'Internal server error';
			result.body.error.detail = 'There was a problem processing this registration request.';
		}
	}

	return result;
}

That should do it. Now, not only are we able to capture the logo image during the initial set-up process, but if the instance is a Client instance, we also send that logo image over to the Host so that it can be distributed to all of the other Client instances. Of course, now we need a new Update Set that includes all of these changes, so here you go:

Same rules apply as before; this is a drop-in replacement for any of the previous 0.7.x versions. More information on previewing, committing, and testing can be found here and here and here. And as always, feedback of any kind in the comments section is welcome, encouraged, and very much appreciated. Any and all information on your experiences, positive, negative, or otherwise, would be very welcome, and will give us a little something to review next time out.

Note to testers: On this version, it might be worthwhile to delete all of the instance records on your Host instance and have all of your Client instances re-register using a logo image to make sure all of this works. If all goes well, the logo images for all instances and all apps should appear on all instances. If you run into any issues, please report them in the comments, and if everything works out, please let us know that as well — thanks!

Collaboration Store, Part LXXIII

“A little more persistence, a little more effort, and what seemed hopeless failure may turn to glorious success.”
Elbert Hubbard

Last time, we went over some of the remaining problems with this iteration of the software, and stopped short of providing a remedy for one of the issues, the fact that there is no capacity to include an instance logo image during the initial set-up process. Today we will take a look at correcting that design flaw.

Not one to create anything from scratch when someone else might have already put in all of that effort, I looked around for a portal page that had an image upload feature already in place. It turns out that the stock user_profile page, which uses the User Profile widget, has just the sort of thing that I was thinking of.

Stock User Profile page with existing image upload feature

Starting with the HTML, I dug into the code for the page and identified this section as pertaining to the image and upload button.

<div class="row">
  <div class="avatar-extra-large avatar-container" style="cursor:default;">
    <div class="avatar soloAvatar bottom">
      <div class="sub-avatar mia" ng-style="avatarPicture"><i class="fa fa-user"></i></div>
    </div>
  </div>
</div>
<div class="row">
  <button ng-if="::connectEnabled()" ng-click="openConnectConversation()" type="button"
          class="btn btn-primary send-message"><span class="glyphicon glyphicon-comment pad-right"></span>${Message}</button>
  <!-- file upload -->
  <span ng-if="::data.isLoggedInUsersProfile">
    <input ng-show="false" type="file" accept="image/jpeg,image/png,image/bmp,image/x-windows-bmp,image/gif,image/x-icon,image/svg+xml" ng-file-select="attachFiles({files: $files})" />
    <button ng-click="uploadNewProfilePicture($event)"
            ng-keypress="uploadNewProfilePicture($event)" type="button"
            class="btn btn-default send-message">${Upload Picture}</button>
  </span>
</div>

It was all pretty close to what I wanted, but I did make a few little tweaks here and there and finally ended up with this.

<div class="row" style="padding: 15px;">
  <div class="avatar-extra-large avatar-container" style="cursor:default;">
    <div class="avatar soloAvatar bottom">
      <div class="sub-avatar mia" ng-style="instanceLogoImage"><i class="fa fa-image"></i></div>
    </div>
  </div>
</div>
<div class="row" style="padding: 15px;">
  <input ng-show="false" type="file" accept="image/jpeg,image/png,image/bmp,image/x-windows-bmp,image/gif,image/x-icon,image/svg+xml" ng-file-select="attachFiles({files: $files})" />
  <button ng-click="uploadInstanceLogoImage($event)"
          ng-keypress="uploadInstanceLogoImage($event)" type="button"
          class="btn btn-default send-message">${Upload Instance Logo Image}</button>
</div>

Without bothering to build out the referenced functions to make it all actually work, I could still pull the page up and see how it rendered out at this point, just to get an idea of how it was going to look.

New initial set-up page with added logo image feature

So that looks pretty good. Now we need to add the client-side code needed to make it work. As with the HTML, we will turn to the User Profile widget to find some code to start out with.

$scope.uploadNewProfilePicture = function($event) {
	$event.stopPropagation();
	if($event.keyCode === 9) {
		return;
	}
	var $el = $element.find('input[type=file]');
	$el.click();
}

$scope.attachFiles = function(files) {
	var file = files.files[0];

	var validImage = false;

	switch(file.type) {
		case 'image/jpeg':
		case 'image/png':
		case 'image/bmp':
		case 'image/x-windows-bmp':
		case 'image/gif':
		case 'image/x-icon':
		case 'image/svg+xml':
			validImage = true;
			break;
		default:
			break;
	}

	if(!validImage) {
		alert(file.name + " " + i18n.getMessage("isn't a recognized image file format"));
		return;
	}

	snAttachmentHandler.create("live_profile", $scope.data.liveProfileID).uploadAttachment(file, {
		sysparm_fieldname: "photo"
	}).then(function(response) {
		var obj = {};
		obj.newAvatarId = response.sys_id;
		$scope.avatarPicture = {
			'background-image': "url('" + response.sys_id + ".iix')",
			'color': 'transparent'
		};
		$rootScope.$broadcast("sp.avatar_changed", obj);
		// timeout is required for screen reader to pick up the message once file upload prompt is dismissed
		$timeout(function() {
		   spAriaUtil.sendLiveMessage(i18n.getMessage('Profile picture updated successfully'));
		}, 500);
	});
}

$scope.avatarPicture = "";

$http.get('/api/now/live/profiles/sys_user.' + $scope.data.sysUserID).then(function (response) {
	if (response.data.result && response.data.result.avatar){
		var avatarPicture = response.data.result.avatar.replace("?t=small", "");
		$scope.avatarPicture = {
			'background-image': "url('" + avatarPicture + "')",
			'color': 'transparent'
		};
	}
});

Once again, I went through the code and made a few minimal modifications to make things work for our purpose and ended up with this.

$scope.uploadInstanceLogoImage = function($event) {
	$event.stopPropagation();
	if($event.keyCode === 9) {
		return;
	}
	var $el = $element.find('input[type=file]');
	$el.click();
};

$scope.attachFiles = function(files) {
	var file = files.files[0];

	var validImage = false;

	switch(file.type) {
		case 'image/jpeg':
		case 'image/png':
		case 'image/bmp':
		case 'image/x-windows-bmp':
		case 'image/gif':
		case 'image/x-icon':
		case 'image/svg+xml':
			validImage = true;
			break;
		default:
			break;
	}

	if(!validImage) {
		alert(file.name + " " + i18n.getMessage("isn't a recognized image file format"));
		return;
	}

	snAttachmentHandler.create('x_11556_col_store_member_organization', $scope.data.sys_id).uploadAttachment(file, {
		sysparm_fieldname: "logo"
	}).then(function(response) {
		$scope.instanceLogoImage = {
			'background-image': "url('" + response.sys_id + ".iix')",
			'color': 'transparent'
		};
	});
};

$scope.instanceLogoImage = "";

All of that would work great just as it is except for one thing: the snAttachmentHandler.create function wants to attach the file to an existing record, and the current design of the process doesn’t actually create the record in the database until after the Host instance accepts the registration request. We had that same problem with the Update Set XML file during the application publishing process, and I ended up attaching the file to the system application record, and then copying it over to the version record later, once the version record was created. Here, though, there really isn’t a record to which it could be attached temporarily, so in order for this to work, I am going to have to go ahead and create the record up front, and then do an update instead of an insert once the Host has granted access to the community. That took a little bit of re-engineering, but once that was done, everything seemed to work out OK. This is something that really needs to be tested thoroughly, but just from my cursory check-out it appears to be functioning as desired.
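
For those curious about the general shape of that re-engineering, a rough sketch of the server-side portion might look something like the following; the actual widget code differs in the details, and the query and field assignments here are assumptions.

// create (or find) the instance record up front so that the logo upload
// has an existing record to attach to
var mbrGR = new GlideRecord('x_11556_col_store_member_organization');
mbrGR.addQuery('instance', gs.getProperty('instance_name'));
mbrGR.query();
if (!mbrGR.next()) {
	mbrGR.initialize();
	mbrGR.instance = gs.getProperty('instance_name');
	mbrGR.insert();
}
data.sys_id = mbrGR.getUniqueValue(); // used by snAttachmentHandler.create on the client side

// ... later, once the Host has accepted the registration, update the
// existing record instead of inserting a new one
mbrGR.name = data.name;
mbrGR.email = data.email;
mbrGR.active = true;
mbrGR.update();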

Image upload function in operation

With that up and running, it is time to release yet another Update Set for testing purposes. As with the 0.7.1 release, this should be a straight drop-in replacement for the 0.7/0.7.1 Update Set, and all of the other artifacts should be OK just as they are.

More information on previewing, committing, and testing can be found here and here and here. As always, feedback of any kind in the comments section is welcome, encouraged, and very much appreciated. Any information on your experiences is always a treat, and will give us a little something to review next time out.

Collaboration Store, Part LXXII

“Sometimes when you innovate, you make mistakes. It is best to admit them quickly and get on with improving your other innovations.”
Steve Jobs

Last time, we put out a new version of the Scoped Application Update Set that addressed a few of the reported issues. There are still a few unresolved issues, however, and some that have not even been discussed as yet. One that was reported last time, which I have been able to replicate, is a Commit failure of an attempted insert on the sys_scope_privilege table.

Duplicate entry ‘5b9c19242f6030104425fcecf699b6ec-global-sys_app_module-sys_db…’ for key ‘source_scope_2’

This causes the Update Set commit to be reported as a failure, even though everything else was installed without issue and everything works just fine despite this particular problem. I searched through the Update Set hoping to find an update that was in there twice for this item, but it only appears once, so I am not sure why we would be getting a duplicate entry error. The entry says ‘INSERT OR UPDATE’, so you would think that it would know if it was already there and do an UPDATE instead of an INSERT, but that’s obviously not what is happening. I’m going to have to do a little more research to see if I can find the cause of this one. Even though it doesn’t seem to hurt anything, I don’t like it and I would like to figure out how to prevent that from happening.

The other thing that seems to happen to folks during the Preview phase is that they run into a number of issues that need to be addressed and the Preview never runs clean. Up to this point, my recommendation has been to just accept all remote updates and then do the Commit. There are two areas, though, where I now believe that it would be better to skip the updates rather than accept them. One is any update related to System Properties and the other is any update related to the left navigation menu items. Once you have gone through the set-up process, the property values have all been established based on the information entered during set-up. You do not want to overlay those with an Update Set or you will have to go through the set-up process all over again. Also, once you complete the set-up process, the set-up menu item is deactivated and all of the other menu items are revealed. You don’t want that to be reverted either, so if you are installing an update over an existing installation that has already been set up, you should skip any update related to these two items. All of the rest, you should still go ahead and accept. I don’t like having to have these kinds of specialized instructions, though, so I am going to be looking at ways to restructure things so that this is not an issue. The goal would be to always have a clean Preview and a clean Commit, regardless of whether it was a fresh install or an update of an existing install.

One issue that I did attempt to correct in the most recent version was the attachment copy problem, a problem that resulted in the attachment of the wrong file to the version record. It turns out that this bad code was replicated a number of times and was only fixed in one place. To correct that problem, and to make future maintenance a little easier, I built a common attachment copy function that contained the corrected code, and then called that function from every place that previously had a copy of the incorrect logic.

copyAttachment: function(fromTable, fromId, toTable, toId, attachmentId) {
	var response = {success: false};

	var gsa = new GlideSysAttachment();
	var values = gsa.copy(fromTable, fromId, toTable, toId);
	if (values.length > 0) {
		for (var i=0; i<values.length; i++) {
			var ids = values[i].split(',');
			if (ids[0] == attachmentId) {
				response.success = true;
				response.newId = ids[1];
				gsa.deleteAttachment(attachmentId);
			} else {
				gsa.deleteAttachment(ids[1]);
			}
		}
	} else {
		response.error = 'Unrecognized response from attachment copy: ' +  JSON.stringify(values) + '\nFrom table: ' + fromTable +'; From ID: ' + fromId + '; To table: ' + toTable +'; To ID: ' + toId + '; Sys ID: ' + attachmentId;
	}

	return response;
}

Once that was in place, I updated the functions that previously included that code to just call that function and evaluate the results.

processPhase4: function(answer) {
	var response = this.CSU.copyAttachment('sys_app', answer.appSysId, 'x_11556_col_store_member_application_version', answer.versionId, answer.attachmentId);
	if (response.success) {
		answer.attachmentId = response.newId;
	} else {
		answer = this.processError(answer, 'Copy of Update Set XML file failed - ' + response.error);
	}

	return answer;
}

This code works for both Update Set XML files and logo image files.

copyLogoImage: function(sysAppGR, applicationGR) {
	var logoId = '';

	var csu = new CollaborationStoreUtils();
	var response = csu.copyAttachment('ZZ_YYx_11556_col_store_member_application', applicationGR.getUniqueValue(), 'ZZ_YYsys_app', sysAppGR.getUniqueValue(), applicationGR.getValue('logo'));
	if (response.success) {
		logoId = response.newId;
	} else {
		gs.error('Copy of application logo image failed - ' + response.error);
	}

	return logoId;
}

Speaking of logo image files, when I updated everything to include logo images for each instance, I neglected to add an image upload function to the initial set-up process. Since the Host’s data for each Client instance is collected during the set-up process, no Client instances will have logo images on the Host, which means no logo images will ever be shared with other Clients. I need to fix that, and then release yet another bug-fix Update Set that will address as many of these issues as possible. Adding the image upload to the set-up process may get a little involved, so let’s save that for our next installment.

Scripted Value Columns, Part VIII

“Too many men work on parts of things. Doing a job to completion satisfies me.”
Richard Proenneke

Last time, we wanted to wrap up the modifications on the last two wrapper widgets and put out a new Update Set, but we discovered that we missed an important element in our list of things that would need to be modified, the Configurable Data Table Widget Content Selector widget. We need to take a look at that guy and see what needs to be done to accommodate scripted value columns, and then retest the third wrapper widget, which shares the page with this component.

A quick scan of the Server script for aggarray comes up empty, but in the Client script, we come across this:

s.aggregates = '';
if (tableInfo[state].aggarray && Array.isArray(tableInfo[state].aggarray) && tableInfo[state].aggarray.length > 0) {
	s.aggregates = JSON.stringify(tableInfo[state].aggarray);
}

Making a copy of that and doing a little string replacement here and there gives us an equivalent block of code for the new scripted value column configurations.

s.scripteds = '';
if (tableInfo[state].svcarray && Array.isArray(tableInfo[state].svcarray) && tableInfo[state].svcarray.length > 0) {
	s.scripteds = JSON.stringify(tableInfo[state].svcarray);
}

And that seems to be all there is to that. Now we can go back to our last test and run it again to see if that fixed our problem.

Successful test of the third wrapper widget

That’s better! Now we have a column for the Last Comment, and we even have a row with some data in it. Good deal. And just to check on the content selector widget, we can look at the URL that it built to see how the configuration options for the scripted value columns appeared in the URL.

/sp?id=my_things&table=incident&filter=caller_idDYNAMIC90d1921e5f510100a9ad2572f2b477fe^active%3Dtrue&fields=number,opened_by,opened_at,short_description&scripteds=[{"heading":"Last Comment","name":"last_comment","label":"Last Comment","script":"global.ScriptedJournalValueProvider"}]&aggregates=&buttons=&refpage={"sys_user":"user_profile"}&px=requester&sx=open&spa=1&p=1&o=opened_at&d=asc

Well, that’s the whole thing, but we can zoom in on the part in which we have an interest.

scripteds=[{"heading":"Last Comment","name":"last_comment","label":"Last Comment","script":"global.ScriptedJournalValueProvider"}]

So that is the last of the wrapper widgets, and unless we have left something else out, that’s the last of the work to be done to implement this new feature. Now all that is left is to bundle the whole thing up into a new Update Set and post it out on Share as a new version.

Here is the new Update Set, and here is where you can find it on Share. If you happen to use it, find it to be of value, or run into any issues, please let us all know in the comments below.

Collaboration Store, Part LXXI

“Continuous delivery without continuous feedback is very, very dangerous.”
Colin Humphreys

Last time, we started to tackle some of the issues that were reported with the last set of Update Sets released for this project. Today we need to attempt to address a couple more things and then put out another version with all of the latest corrections. One of the issues that was reported was that the application publishing failed because the Host instance was unavailable. While technically not a problem with the software, it seems rather rude to allow the user to go through half of the process only to find out right in the middle that things cannot proceed. A better approach would be to check on the Host first, and then only proceed if the Host is up and running.

We already have a function to contact the Host and obtain its information. We should be able to leverage that existing function in a new client-callable function in our ApplicationPublisher Script Include.

verifyHost: function() {
	var hostAvailable = 'false';
		
	if (gs.getProperty('x_11556_col_store.host_instance') == gs.getProperty('instance_name')) {
		hostAvailable = 'true';
	} else {
		var csu = new CollaborationStoreUtils();
		var resp = csu.getStoreInfo(gs.getProperty('x_11556_col_store.host_instance'));
		if (resp.status == '200' && resp.name > '') {
			hostAvailable = 'true';
		}
	}

	return hostAvailable;
}

First we check to see if this is the Host instance, and if not, then we attempt to contact the Host instance to verify that it is currently online. To invoke this new function, we modify the UI Action that launches the application publishing process and add a new function ahead of the existing function that kicks off the process.

function verifyHost() {
	var ga = new GlideAjax('ApplicationPublisher');
	ga.addParam('sysparm_name', 'verifyHost');
	ga.getXMLAnswer(function (answer) {
		if (answer == 'true') {
			publishToCollaborationStore();
		} else {
			g_form.addErrorMessage('This application cannot be published at this time because the Collaboration Store Host is offline.');
		}
	});
}

With this code in place, if you attempt to publish a new version of the application and the Host is unreachable, the publication process will not start, and you will be notified that the Host is down.

Error message when Host instance is not available for accepting new versions

That’s a little nicer than just throwing an error halfway through the process. This way, if the Host is out of service for any reason, the publishing process will not even begin.

I still do not have a solution for the application logo image that would not copy, but these other issues that have been resolved should be tested out, so I think it is time for a new Update Set for those folks kind enough to try things out and provide feedback. There were no changes to any of the globals, so v0.7 of those artifacts should still be good, but the Scoped Application contains a number of corrections, so here is a replacement for the earlier v0.7 that folks have been testing out.

If you have already been testing, this should just drop in on top of the old; however, if this is your first time, or you are trying to install this on a fresh instance, you will want to follow the installation instructions found here, and just replace the main Update Set with the one above. Thanks once again to all of you who have provided (or are about to provide!) feedback. It is always welcome and very much appreciated. Hopefully, this version will resolve some of those earlier issues and we can move on to discovering new issues that have yet to be detected. If we get any additional feedback, we will take a look at that next time out.

Scripted Value Columns, Part VII

“Unplanned occurrences are reminders to check your tendency to think that you’re the one in control.”
James Martin

Last time, we created another example of how one might utilize the new scripted value column feature, this time with catalog item variables instead of Incident journal entries. There are a number of other things that we could try, but two examples should be enough to get the point across, and I’ll leave it to others to come up with additional examples of their own.

We still have two more wrapper widgets to update, though, and we still have that annoying misalignment between the original columns and the new ones. Here is the way things come out right now:

Misalignment of original columns and new columns

… and here is the way that it should look:

Correct alignment of original columns and new columns

I was able to capture that second image because I found and fixed the problem. I had to replace this HTML:

<sn-avatar ng-if="item[field].value && item[field].type == 'reference' && item[field].table == 'sys_user'" primary="item[field].value" class="avatar-small" show-presence="true" enable-context-menu="false"></sn-avatar>
<a ng-if="$first" href="javascript:void(0)" ng-click="go(item.targetTable, item)" aria-label="${Open record}: {{::item[field].display_value}}">{{::item[field].display_value | limitTo : item[field].limit}}{{::item[field].display_value.length > item[field].limit ? '...' : ''}}</a>
<a ng-if="!$first && item[field].type == 'reference' && item[field].value" href="javascript:void(0)" ng-click="referenceClick(field, item)" aria-label="${Click for more on }{{::item[field].display_value}}">{{::item[field].display_value | limitTo : item[field].limit}}{{::item[field].display_value.length > item[field].limit ? '...' : ''}}</a>
<span ng-if="!$first && item[field].type != 'reference'">{{::item[field].display_value | limitTo : item[field].limit}}{{::item[field].display_value.length > item[field].limit ? '...' : ''}}</span>   

… with this:

<span style="display: inline-flex;">
  <span style="display: inline-flex;" ng-if="item[field].value && item[field].type == 'reference' && item[field].table == 'sys_user'">
    <sn-avatar primary="item[field].value" class="avatar-small" show-presence="true" enable-context-menu="false"></sn-avatar>
    &nbsp;
  </span>
  <a ng-if="$first" href="javascript:void(0)" ng-click="go(item.targetTable, item)" aria-label="${Open record}: {{::item[field].display_value}}">{{::item[field].display_value | limitTo : item[field].limit}}{{::item[field].display_value.length > item[field].limit ? '...' : ''}}</a>
  <a ng-if="!$first && item[field].type == 'reference' && item[field].value" href="javascript:void(0)" ng-click="referenceClick(field, item)" aria-label="${Click for more on }{{::item[field].display_value}}">{{::item[field].display_value | limitTo : item[field].limit}}{{::item[field].display_value.length > item[field].limit ? '...' : ''}}</a>
  <span ng-if="!$first && item[field].type != 'reference'">{{::item[field].display_value | limitTo : item[field].limit}}{{::item[field].display_value.length > item[field].limit ? '...' : ''}}</span>
</span>

Now, without getting into too much detail that no one really cares about, the source of the problem was the sn-avatar tag, which I added a while back so that user columns would have the avatar in front of the name. For some reason, the tag renders out a carriage return and a handful of spaces just before the avatar image. With the ng-if attribute set to false, this collection of white space is still rendered on the page, even when the avatar itself is not. I solved that problem by wrapping the avatar tag with a span and putting the ng-if attribute on the outer span rather than on the sn-avatar tag. That took care of things for columns where there was no avatar, but the user columns, which show the avatar, were still out of alignment with the rest of the columns. Adding style="display: inline-flex;" took care of that problem with the avatar, but then the user name ended up underneath the avatar instead of next to it. To solve that problem, I wrapped the whole thing in another span with the same style attribute. Now everything lines up the way that it should.

Now that that is out of the way, we still have two more wrapper widgets to update. Let’s jump into the SNH Data Table from Instance Definition and do the same kind of searching we did before, looking for some code that might need to be copied and modified. On this particular widget, such a search turns up nothing at all in either the Server script or the Client script, so the only thing that we really need to do is to add another entry to the Option schema for our new scripted value column specification.

{"hint":"A JSON object containing the specifications for scripted value columns",
"name":"scripteds",
"default_value":"",
"section":"Behavior",
"label":"Scripted Value Column Specifications (JSON)",
"type":"String"}

To test this, we can modify our scripted_value_test_2 page to use this widget instead of the SNH Data Table from JSON Configuration widget, and then transfer our configuration options from the Script Include to the widget options.
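
The value of the new Scripted Value Column Specifications option is just the JSON from the svcarray in the configuration script, which for the catalog item example should work out to something like this (one string value, wrapped here for readability):

[{"name":"cpu","label":"CPU Speed","heading":"CPU Speed","script":"global.ScriptedCatalogValueProvider"},
{"name":"memory","label":"Memory","heading":"Memory","script":"global.ScriptedCatalogValueProvider"},
{"name":"drive","label":"Hard Drive","heading":"Hard Drive","script":"global.ScriptedCatalogValueProvider"},
{"name":"os","label":"Operating System","heading":"Operating System","script":"global.ScriptedCatalogValueProvider"}]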

Widget configuration options for the modified wrapper widget

Now all we need to do is to save it and then run out to the Service Portal and take a quick peek.

Testing the modified SNH Data Table from Instance Definition widget

So that all looks good. And much, much better now that the column data all lines up as it should! It’s nice to finally have that fixed. That takes care of wrapper widget #2. Now let’s take a look at that last one, the SNH Data Table from URL Definition widget. The only line that appears to require modification is this one:

copyParameters(data, ['aggregates', 'buttons', 'refpage', 'bulkactions']);

… which we can convert to this to pick up our new configuration option:

copyParameters(data, ['scripteds', 'aggregates', 'buttons', 'refpage', 'bulkactions']);

Now we need to test it, so we will need to find or create a page that uses this widget. Let’s take a look at the ones that are already out there by checking out the Related List down at the bottom of the form.

Pages that use the SNH Data Table from URL Definition widget

The page my_things looks like a good candidate, so we can take a look at the configuration script that it uses and then edit it to add one or more scripted value columns. One of the tables utilized on that page is the Incident table, so let’s go ahead and use our existing value provider script to add a journal entry column to one of those.

name: 'incident',
displayName: 'Incident',
open: {
	filter: 'caller_idDYNAMIC90d1921e5f510100a9ad2572f2b477fe^active=true',
	fields: 'number,opened_by,opened_at,short_description',
	svcarray: [{
		name: 'last_comment',
		label: 'Last Comment',
		heading: 'Last Comment',
		script: 'global.ScriptedJournalValueProvider'
	}],
	aggarray: [],
	btnarray: [],
	refmap: {
		sys_user: 'user_profile'
	},
	actarray: []
}

Now let’s take a look.

First test of the modified SNH Data Table from URL Definition widget

Well, that didn’t work! It’s always something. Even if there were no comments on any of these Incidents, we should still have a column heading for our new scripted value column. I don’t think this problem is in the widget that we just modified, however. This widget shares the page with the Configurable Data Table Widget Content Selector widget, and that is a widget that we have not even touched. That is going to have to be modified to accommodate our new feature as well, as it builds the URL that the SNH Data Table from URL Definition widget turns to for its configuration information. This was not on our list of things to do for this feature, but it definitely needs to be done.

I was hoping to wrap things up with this installment, but now we have a new widget to modify and more testing to do, so I think we will just save all of that, plus the Update Set creation, for our next time out.

Scripted Value Columns, Part VI

“Do not be too timid and squeamish about your actions. All life is an experiment. The more experiments you make the better.”
Ralph Waldo Emerson

Last time, we used the new scripted value columns feature to create columns from the comments and work notes of an Incident. Today we are going to do something similar, but instead of working with journal entries, we will try something with the optional variables that can be associated with Service Catalog items. For our demonstration, we will use the Executive Desktop catalog item, which has a number of defined ordering options.

Executive Desktop catalog item

Let’s see if we can’t create a list of all orders for this item, and include some of the catalog item variables on the list. To begin, let’s make another copy of our example scripted value provider and call it ScriptedCatalogValueProvider.

var ScriptedCatalogValueProvider = Class.create();
ScriptedCatalogValueProvider.prototype = {
	initialize: function() {
	},

	getScriptedValue: function(item, config) {
		return Math.floor(Math.random() * 100) + '';
	},

	type: 'ScriptedCatalogValueProvider'
};

Let’s also make a copy of our last configuration script and change things over from the Incident table to the Requested Item table, and define some scripted value columns for some of the catalog item variables associated with this item.

var ScriptedValueConfig2 = Class.create();
ScriptedValueConfig2.prototype = Object.extendsObject(ContentSelectorConfig, {
	initialize: function() {
	},

	perspective: [{
		name: 'all',
		label: 'all',
		roles: ''
	}],

	state: [{
		name: 'all',
		label: 'All'
	}],

	table: {
		all: [{
			name: 'sc_req_item',
			displayName: 'Requested Item',
			all: {
				filter: 'cat_item=e46305bdc0a8010a00645e608031eb0f',
				fields: 'number,request,requested_for',
				svcarray: [{
					name: 'cpu',
					label: 'CPU Speed',
					heading: 'CPU Speed',
					script: 'global.ScriptedCatalogValueProvider'
				},{
					name: 'memory',
					label: 'Memory',
					heading: 'Memory',
					script: 'global.ScriptedCatalogValueProvider'
				},{
					name: 'drive',
					label: 'Hard Drive',
					heading: 'Hard Drive',
					script: 'global.ScriptedCatalogValueProvider'
				},{
					name: 'os',
					label: 'Operating System',
					heading: 'Operating System',
					script: 'global.ScriptedCatalogValueProvider'
				}],
				aggarray: [],
				btnarray: [],
				refmap: {},
				actarray: []
			}
		}]
	},

	type: 'ScriptedValueConfig2'
});

Finally, let’s make a copy of our last test page and call it scripted_value_test_2, and then edit the widget options to use our new configuration file. Now let’s jump out to the Service Portal and pull up the page and see what we have so far.

First test of our new portal page

So far, so good. Everything seems to work, but of course all we have in our new columns is the random numbers that we threw in there for demonstration purposes earlier. But that just means that all we have left to do now is to figure out what kind of script we need to pull out the actual variable values that are associated with each catalog item order. To begin, let’s map our variable names, which are also our column names, to their catalog item variable (question) sys_ids.

questionMap: {
	cpu: 'e46305fbc0a8010a01f7d51642fd6737',
	memory: 'e463064ac0a8010a01f7d516207cd5ab',
	drive: 'e4630669c0a8010a01f7d51690673603',
	os: 'e4630688c0a8010a01f7d516f68c1504'
}

This will allow us to use some common code for all of the columns by passing in the correct sys_id for the catalog item variable associated with that column.

getScriptedValue: function(item, config) {
	var response = '';

	var column = config.name;
	if (this.questionMap[column]) {
		response = this.getVariableValue(this.questionMap[column], item.sys_id);
	}

	return response;
}

Now we need to build the getVariableValue function, which takes the sys_id of the question and the sys_id of the requested item as arguments. To locate the value selected for this question for this order, we use the sc_item_option_mtom table, which maps requested items to individual question responses.

getVariableValue: function(questionId, itemId) {
	var response = '';

	var mtomGR = new GlideRecord('sc_item_option_mtom');
	mtomGR.addQuery('request_item', itemId);
	mtomGR.addQuery('sc_item_option.item_option_new', questionId);
	mtomGR.query();
	if (mtomGR.next()) {
		response = mtomGR.getDisplayValue('sc_item_option.value');
	}

	return response;
}

Now let’s save that and give the page another look.

Second test of our new portal page

Well, that’s better, but it is still not exactly what we want. The values that appear in the columns are the raw values for the variables, but what we would really like to see is the display value. To get that, we have to go all the way back to the original question choices and find the choice record that matches both the question and the answer. For that, you need the sys_id of the question and the value of the answer.

getDisplayValue: function(questionId, value) {
	var response = '';

	var choiceGR = new GlideRecord('question_choice');
	choiceGR.addQuery('question', questionId);
	choiceGR.addQuery('value', value);
	choiceGR.query();
	if (choiceGR.next()) {
		response = choiceGR.getDisplayValue('text');
	}

	return response;
}

To utilize this new function, we have to tweak our getVariableValue function just a little bit to make the call.

getVariableValue: function(questionId, itemId) {
	var response = '';

	var mtomGR = new GlideRecord('sc_item_option_mtom');
	mtomGR.addQuery('request_item', itemId);
	mtomGR.addQuery('sc_item_option.item_option_new', questionId);
	mtomGR.query();
	if (mtomGR.next()) {
		var value = mtomGR.getDisplayValue('sc_item_option.value');
		if (value) {
			response = this.getDisplayValue(questionId, value);
		}
	}

	return response;
}

Now let’s take one more look at that page now that we have made these modifications.

Third and final test of our new portal page

That’s better! Now we have the same text in our columns that the user saw when placing the order, and that’s what we really want to see here. Or at least, that’s what I wanted to see. You may be interested in something completely different, but that’s the whole point of this approach. You can basically script whatever you want to put in whatever column or columns you would like to define. These are just a couple of examples, but there really is no limit to what you might be able to do with a little imagination and some Javascript.

So here is the final version of our second close-to-real-world example value provider script:

var ScriptedCatalogValueProvider = Class.create();
ScriptedCatalogValueProvider.prototype = {
	initialize: function() {
	},

	questionMap: {
		cpu: 'e46305fbc0a8010a01f7d51642fd6737',
		memory: 'e463064ac0a8010a01f7d516207cd5ab',
		drive: 'e4630669c0a8010a01f7d51690673603',
		os: 'e4630688c0a8010a01f7d516f68c1504'
	},

	getScriptedValue: function(item, config) {
		var response = '';

		var column = config.name;
		if (this.questionMap[column]) {
			response = this.getVariableValue(this.questionMap[column], item.sys_id);
		}

		return response;
	},

	getVariableValue: function(questionId, itemId) {
		var response = '';

		var mtomGR = new GlideRecord('sc_item_option_mtom');
		mtomGR.addQuery('request_item', itemId);
		mtomGR.addQuery('sc_item_option.item_option_new', questionId);
		mtomGR.query();
		if (mtomGR.next()) {
			var value = mtomGR.getDisplayValue('sc_item_option.value');
			if (value) {
				response = this.getDisplayValue(questionId, value);
			}
		}

		return response;
	},

	getDisplayValue: function(questionId, value) {
		var response = '';

		var choiceGR = new GlideRecord('question_choice');
		choiceGR.addQuery('question', questionId);
		choiceGR.addQuery('value', value);
		choiceGR.query();
		if (choiceGR.next()) {
			response = choiceGR.getDisplayValue('text');
		}

		return response;
	},

	type: 'ScriptedCatalogValueProvider'
};

The added columns still do not align with the original columns, which I would like to fix before I build a new Update Set, and we still have a couple more wrapper widgets to address, but I think we are getting close. Maybe we can wrap this whole thing up in our next installment.

Scripted Value Columns, Part V

“Make incremental progress; change comes not by the yard, but by the inch.”
Rick Pitino

Last time, we had enough parts cobbled together to demonstrate that the concept actually works. Of course, all we had to show for it was some random numbers, but that told us that the specified script was being called for each row, which is what we were after. Now that we know that the basic structure is performing as desired, we can revisit the configurable Script Include component and see if we can come up with some actual use cases that might be of value to someone.

One of the questions that triggered this idea was related to comments and work notes on Incidents. Assuming that the main record in the table is an Incident, we can clone our example Script Include to create one dedicated to pulling data out of the latest comment or work note on an Incident. We can call this new Script Include ScriptedJournalValueProvider.

var ScriptedJournalValueProvider = Class.create();
ScriptedJournalValueProvider.prototype = {
	initialize: function() {
	},

	getScriptedValue: function(item, config) {
		return Math.floor(Math.random() * 100) + '';
	},

	type: 'ScriptedJournalValueProvider'
};

We will want to delete the example code in the getScriptedValue function and come up with our own, but other than that, the basic structure remains the same. Assuming that we want our script to be able to handle a number of attributes of an Incident Journal entry, we can use the name of the column to determine which function will fetch us our value.

getScriptedValue: function(item, config) {
	var response = '';

	var column = config.name;
	if (column == 'last_comment') {
		response = this.getLastComment(item, config);
	} else if (column == 'last_comment_by') {
		response = this.getLastCommentBy(item, config);
	} else if (column == 'last_comment_on') {
		response = this.getLastCommentOn(item, config);
	} else if (column == 'last_comment_type') {
		response = this.getLastCommentType(item, config);
	}

	return response;
}

This way, we can point to this same script in multiple columns and the name of the column will determine which value from the last comment or work note gets returned.

Since all of the functions will need the data for the last entry, we should create a shared function that they all can leverage to obtain the record. As with many things on the ServiceNow platform, there are a number of ways to go about this, but for our demonstration purposes, we will read the sys_journal_field table looking for the last entry for the Incident in the current row.

getLastJournalEntry: function(sys_id) {
	var journalGR = new GlideRecord('sys_journal_field');
	journalGR.orderByDesc('sys_created_on');
	journalGR.addQuery('name', 'incident');
	journalGR.addQuery('element_id', sys_id);
	journalGR.setLimit(1);
	journalGR.query();
	journalGR.next();
	return journalGR;
}

Now that we have a common way to obtain the GlideRecord for the latest entry, we can start building our functions that extract the requested data value. Here is the one for the comment text.

getLastComment: function(item, config) {
	var response = '';

	var journalGR = this.getLastJournalEntry(item.sys_id);
	if (journalGR.isValidRecord()) {
		response = journalGR.getDisplayValue('value');
	}

	return response;
}

The others will basically be copies of the above, modified to return different values based on their purpose. The whole thing, all put together, now looks like this.

var ScriptedJournalValueProvider = Class.create();
ScriptedJournalValueProvider.prototype = {
	initialize: function() {
	},

	getScriptedValue: function(item, config) {
		var response = '';

		var column = config.name;
		if (column == 'last_comment') {
			response = this.getLastComment(item, config);
		} else if (column == 'last_comment_by') {
			response = this.getLastCommentBy(item, config);
		} else if (column == 'last_comment_on') {
			response = this.getLastCommentOn(item, config);
		} else if (column == 'last_comment_type') {
			response = this.getLastCommentType(item, config);
		}

		return response;
	},

	getLastComment: function(item, config) {
		var response = '';

		var journalGR = this.getLastJournalEntry(item.sys_id);
		if (journalGR.isValidRecord()) {
			response = journalGR.getDisplayValue('value');
		}

		return response;
	},

	getLastCommentBy: function(item, config) {
		var response = '';

		var journalGR = this.getLastJournalEntry(item.sys_id);
		if (journalGR.isValidRecord()) {
			response = journalGR.getDisplayValue('sys_created_by');
		}

		return response;
	},

	getLastCommentOn: function(item, config) {
		var response = '';

		var journalGR = this.getLastJournalEntry(item.sys_id);
		if (journalGR.isValidRecord()) {
			response = journalGR.getDisplayValue('sys_created_on');
		}

		return response;
	},

	getLastCommentType: function(item, config) {
		var response = '';

		var journalGR = this.getLastJournalEntry(item.sys_id);
		if (journalGR.isValidRecord()) {
			response = journalGR.getDisplayValue('element');
		}

		return response;
	},

	getLastJournalEntry: function(sys_id) {
		var journalGR = new GlideRecord('sys_journal_field');
		journalGR.orderByDesc('sys_created_on');
		journalGR.addQuery('name', 'incident');
		journalGR.addQuery('element_id', sys_id);
		journalGR.setLimit(1);
		journalGR.query();
		journalGR.next();
		return journalGR;
	},

	type: 'ScriptedJournalValueProvider'
};

Now that we have a Script Include to utilize, we need to put together a new page that we can configure to make use of it so that we can test it out. Let’s make a quick copy of the page that we were using for testing last time and call it scripted_value_test. Also, let’s make a quick copy of the test configuration script that we were using earlier and call it ScriptedValueConfig.

var ScriptedValueConfig = Class.create();
ScriptedValueConfig.prototype = Object.extendsObject(ContentSelectorConfig, {
	initialize: function() {
	},

	perspective: [{
		name: 'all',
		label: 'all',
		roles: ''
	}],

	state: [{
		name: 'all',
		label: 'All'
	}],

	table: {
		all: [{
			name: 'incident',
			displayName: 'Incident',
			all: {
				filter: 'caller_idDYNAMIC90d1921e5f510100a9ad2572f2b477fe^active=true',
				fields: 'number,opened_by,opened_at,short_description',
				svcarray: [{
					name: 'last_comment_on',
					label: 'Last Comment',
					heading: 'Last Comment',
					script: 'global.ScriptedJournalValueProvider'
				},{
					name: 'last_comment_by',
					label: 'Last Comment By',
					heading: 'Last Comment By',
					script: 'global.ScriptedJournalValueProvider'
				}],
				aggarray: [],
				btnarray: [],
				refmap: {
					sys_user: 'user_profile'
				},
				actarray: []
			}
		}]
	},

	type: 'ScriptedValueConfig'
});

Now let’s pull up our new page in the Service Portal Designer and point the table widget to our new configuration script.

Configuring the new test page to use the new test configuration script

Once we save that, we can pop over to the Service Portal and pull up our new page to try it out.

First test of our first real world utilization of this feature

Beautiful! Our new scripted value provider Script Include was called by the core SNH Data Table widget and it returned the requested values, which were then displayed on the list with all of the other standard table columns. That wasn’t so hard, now was it?

Of course, we still have a couple more wrapper widgets to modify (and test!), and I would like to produce another example, maybe something to do with catalog item variables, but I think we are close. One thing I see that I never noticed before, though, is that the added columns don’t quite line up with the original columns. Maybe it is a CSS thing, or maybe it is something a little more diabolical, but I want to take a look at that and see what is going on there. All of the data in the columns should be displayed consistently; I don’t like it when things don’t all line up correctly. I need to figure out what is going on there and see what I can do about it.

Anyway, we still have a little more work to do before we can wrap this all up into a new Update Set and post a new version out on Share, but we will keep plugging along in our next installment.