“I dream of men who take the next step instead of worrying about the next thousand steps.” — Theodore Roosevelt
Last time, we did a little bit of work on the list and form for our new table, and now we need to try it out and make sure that everything works as intended. As we mentioned earlier, we will be using our recent Service Account Management demonstration app as the test case for this project, so let’s pull up the form and start entering some data.
The first thing that we will want to do is to select our Service Account table from the list of tables. Once that has been established, all of the table field drop-down choices will be fields from that table. Now we can select the columns to be used for each of the configuration values that are fields from the specified table.
Here, we run into a little bit of a problem. The fields we need for the Short Description, Description, and Recipient columns are all values from the base table. For the Escalation recipient column, however, what we would really like to specify there is the Manager of the Owner, which is a reference field on the sys_user table called manager. We would normally write that as a dot-walked field name using the value owner.manager. Since this is a drop-down, though, we don’t have that option. We’ll have to do a little research to see if there is a way around that limitation.
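As a side note, dot-walking just means following reference fields through related records. Here is a rough plain-JavaScript illustration of the idea, with nested objects standing in for actual GlideRecords:

```javascript
// Illustration only: nested plain objects stand in for records here;
// on the platform, dot-walking follows actual reference fields.
function dotWalk(record, path) {
    // Walk each segment of a path like 'owner.manager', stopping
    // with null if any intermediate value is missing.
    return path.split('.').reduce(function(rec, field) {
        return rec == null ? null : rec[field];
    }, record);
}
```

So dotWalk(account, 'owner.manager') would return the owner’s manager, provided both references are populated along the way.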
Those are the only required fields, so we can save the record at this point, just to establish our new configuration. Once we do that, though, we will want to bring the record back up again, as we need to test out the Test Filter button that we added to the form to make sure that it works. For now, let’s just use active=true as our test filter and see what happens.
Once we save the configuration record with the new filter, we can click on the Test Filter button and see where we land.
Well, there is good news and bad news here. The good news is that we ended up on the Service Account table’s list page, which is where we wanted to be. The bad news is that there are no records being displayed. Fortunately, that is only because I cleared out all of my sample test cases on the instance once that project was complete, and not because of any flaw in the UI Action. Still, if I want to use the Service Account Management app for testing, I am going to need to recreate some of those test cases before we get too far. In any case, you can look at the filter display on the list and see that the filter that we entered on the configuration is accurately represented in the resulting list. So that looks good.
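For reference, the heart of a Test Filter button like this is presumably nothing more than assembling a list URL with the configured filter in the sysparm_query parameter. A minimal sketch (the table name in the usage comment is just an example):

```javascript
// Sketch of what such a UI Action might assemble: the list view URL
// for the configured table, with the configured encoded query applied.
function buildListUrl(table, filter) {
    return table + '_list.do?sysparm_query=' + encodeURIComponent(filter);
}
// e.g. buildListUrl('x_example_account', 'active=true')
```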
There are a lot more fields to enter, but some of them, such as the Resolution Script or the Resolution Subflow, will have to wait until we actually build a Script Include or a Subflow. For now, I think we have done enough to check out all of the form customizations, so now it is time to get back to building tables. Next time, we will jump into creating the table that we will use to store the execution information each time the process runs. That one should be pretty straightforward, so we may even get a chance to tackle another one after that.
Last time, we laid out the basic concept for this little project and today it’s time to get to work. To begin, we need to create a new Scoped Application that we will call Periodic Review. It’s a process that we have done here many times before, so there is no need to waste time going over that again, but here is a screenshot of our new application.
Now that we have our application defined, we should probably set up a few properties so that it can be easily configured for different implementations. A few years back, we implemented a tool just for that purpose, so let’s click on that little Setup Properties button and get things started. Once that has been done, you will see that the button no longer appears, and we can use the System Properties tab to start adding properties via the New button.
For now, let’s just add a few things that we can think of right at the moment, and if we need more we can circle back and add additional properties at a later time.
The purpose of each of these properties will become clear once we get into things, but for now, it’s enough to know that they exist. You will also notice a new action item above the list called Manage Application Properties. That will take you to the admin screen where you can adjust the values.
With that out of the way, we can start building our tables. We will need a few of them, but let’s start out with the basic configuration record for each implementation of the review process. The assumption here is that each implementation of the review process will review a specific artifact, and that artifact is maintained in another table in the ServiceNow instance. For example, if we wanted to implement the review process for our recent Service Account application, the referenced table would be the Service Account table in that application. So one of the important fields on this table will be the table that contains the items to be reviewed. Other fields on the table will need to define the configuration for reviewing the item in question, and they may evolve over time as we work our way through the development, but let’s begin with what we envision so far.
We will want to number these records, as so many things are in the Now Platform, so we will need a field called number, and we will need to set up the numbering, which we have also done before. Next would be the name of the table, and then we will need a few fields that will tell us where to find certain things on that particular table.
Recipient column – The name of the column containing the person to be notified
Escalation recipient column – The name of the column containing the person to be notified if the original recipient does not take action in a timely manner
Short description column – The name of the column that contains a brief description of the item to be reviewed
Description column – The name of the column that describes the details of the item to be reviewed
In addition to the details about the subject table, we will also need a little more information about this particular implementation of the review process.
Short description – The common name of the review process
Description – A more detailed description of the review process
Item label – How we refer to the items being reviewed
Frequency – How often we run the review process for these items
Filter – An encoded query filter for the specified table that the process will use to select the items up for review
Next scheduled date – The date of the next run
Action page header – Optional text to be included at the top of the page where the responsible person will indicate the appropriate action for each item listed
Response page content – Optional text to be included on the page presented when all of the appropriate actions have been submitted
Resolution Script – A reference to a Script Include that will run to process the actions requested
Resolution Subflow – A reference to a Subflow that will run to process the actions requested (if both a script and a flow have been specified, only the script will run)
Fallback recipient – A reference to the User table for a person to receive the notices if the designated recipient field is empty or if that user has no email address
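That last fallback rule is simple enough to sketch out. The user objects here are simplified stand-ins for sys_user records:

```javascript
// Sketch of the fallback rule: use the designated recipient unless the
// field is empty or that user has no email address, in which case use
// the configured fallback recipient instead.
function resolveRecipient(recipient, fallback) {
    if (!recipient || !recipient.email) {
        return fallback;
    }
    return recipient;
}
```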
One other thing that we will need is some text for all of the possible notices that might go out, including the original notice, a reminder notice if no action has been taken after a specified period, and possibly even an escalation notice to go to some higher authority if there is still no action. To accommodate those needs, we will need to add the following fields.
Notice header
Notice subject
Reminder header
Reminder subject
Escalation header
Escalation subject
That should complete our first table for our new application.
Before we jump into the next table, we should probably organize the layout of both the form and the list, and also take care of any policies and actions that we might want on the form. Since that’s probably a little bit of work, let’s jump into that next time out.
“Ideas are of themselves extraordinarily valuable, but an idea is just an idea. Almost any one can think up an idea. The thing that counts is developing it into a practical product.” — Henry Ford
When we wrapped up the Service Account Management project, we intentionally left out a critical part of the complete life-cycle of a service account: the periodic review of the account to ensure that it is still needed. We did that because it was our opinion that it was best to leave that function to a generic third-party product that could handle such a requirement for any number of use cases beyond just the management of service accounts. Virtually anything that is created, deployed, or installed for a temporary purpose should be reviewed on occasion to make sure that it is still needed, and if it is determined that it is no longer needed, some action should be taken to revoke, deactivate, or uninstall the item for a number of reasons, including security and resource utilization. Regardless of the nature of the item, the process should basically be the same.
To have some generic product that would work for just about anything, there would have to be some kind of registration or set-up process to be used for each specific type of item that you wanted to review. And of course, there would have to be some meaningful name for these instances or use cases and they would need to be stored in some appropriately named table. For our purposes we could refer to these implementations of the product as Reviewed Artifacts, and we could create a table of that name that contained all of the information needed to run the review process for that particular implementation.
In practice, there would be some scheduled job that would run every day and refer to this table to see if there was any work to be done that day, and if there was, process each artifact’s workload in turn, sending out notices to the appropriate individuals informing them of the need to take some action to reaffirm the need for the items in question. Another table could keep track of these runs, and yet another could track the individual items associated with each run. Rather than send multiple notices to a single individual who might be responsible for more than one item, though, it would probably be better to consolidate all of the items for a specific individual onto a single notice, and so it might be better to have a table of notices sent out, and then a subordinate table of the items associated with that notice. In that case, the item table would point to the notice table, the notice table would point to the run table, and the run table would then point to the master configuration record for that particular reviewed artifact.
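The consolidation idea boils down to grouping the items up for review by their recipient so that each person receives a single notice. A minimal sketch, with plain objects standing in for the item records:

```javascript
// Group review items by recipient so each recipient gets one notice
// covering all of their items, rather than one notice per item.
function groupByRecipient(items) {
    var notices = {};
    items.forEach(function(item) {
        if (!notices[item.recipient]) {
            notices[item.recipient] = [];
        }
        notices[item.recipient].push(item);
    });
    return notices;
}
```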
Upon receiving the notice of action required, you would want the recipient to then indicate whether or not each item on the notice was still required. For that, the notice could provide a link to a page that would display the list of items and provide a series of check boxes for various resolutions. To maximize flexibility, the possible resolutions could be customized for each reviewed artifact, and those options would be configured as part of the set-up for each new reviewed artifact and stored in yet another related table.
Once the recipient made their selections and submitted the response, the system could then update the item records within the system and also send the responses to some configured Script Include or Flow that would take the appropriate actions on the source records based on those responses.
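That hand-off might be sketched as follows. The runScript and runSubflow callbacks here are hypothetical hooks standing in for whatever platform mechanism actually invokes the configured Script Include or Subflow, and the preference for the script when both are configured matches the rule noted in the configuration fields:

```javascript
// Sketch of the resolution hand-off: prefer the configured script,
// fall back to the configured subflow, and do nothing if neither is
// set. runScript/runSubflow are hypothetical stand-ins.
function dispatchResolution(config, responses, runScript, runSubflow) {
    if (config.resolutionScript) {
        return runScript(config.resolutionScript, responses);
    }
    if (config.resolutionSubflow) {
        return runSubflow(config.resolutionSubflow, responses);
    }
    return null;
}
```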
To set all of this up for a new reviewed artifact, then, you would need to provide the source table containing the artifacts to be reviewed, the fields on the table that contain various bits of information such as the recipient of the notice and the description of the item, the frequency of the review, some artifact-specific verbiage for the notices, the options to be provided on the response entry page, and some artifact-specific process to handle the responses. Once we get into things, we may find that we will need other data points as well, but this should get us started.
It seems like a lot, but we will just take things on one piece at a time and see how it goes. Next time out we will get to work and create a Scoped Application and start throwing together some tables.
“Everything ends; you just have to figure out a way to push to the finish line.” — Jesse Itzler
Last time, we wrapped up the work on the example Service Account dashboard, although we did leave off a few potential enhancements that could improve its value. There is always more that could be done, such as the addition of an Admin Perspective showing all of the accounts and requests or an Expiring State showing all of the accounts that are coming up for review. Since this is just an example, we don’t need to invest the time in building all of those ideas out; some things should be left as an exercise for those who would like to pull this down and play around with it.
What we should do now, though, is take a quick step back and see what we have so far and what might be left to do before we can call this good enough to push out. When we first set out to do this, we identified the following items that would need to be developed:
One or more Service Catalog items to create, alter, and terminate accounts
A generic workflow for the catalog item(s)
A type-specific workflow for each type of account in the type table
Some kind of periodic workflow to ensure that the account is still needed.
We have basically created everything on our list except for that last item, but we have also indicated that the process to check back every so often and see if the account was still needed is something that could be handled by a stand-alone generic product that could perform that function for all kinds of things that would benefit from a periodic review. If we assume that we will turn that process over to a third party, then we would seem to have just about everything that we need.
There is one other thing that would be helpful, though, and we neglected to include it on our original list. It would be nice to have some kind of menu item to launch all of these processes that we have built, so let’s put that together real quick and get that out of the way. I am thinking of something like this:
Service Accounts
New Service Account
My Service Accounts
Service Accounts
Service Account Types
The first item would initiate a request for the Service Account Catalog Item, the second would bring up the dashboard, and the last two would just bring up the list view of our two tables. Those last two would also be limited to admins only and the rest would be open to everyone. Here is the high-level menu entry.
… and here are the four submenu options for this high-level menu item:
Which produces a menu that looks like this:
So that’s about it for this little example project. Again, this is not intended to be a fully functional product that you would simply install and start using. This is just an example with enough working parts to get things started for anyone who might want to try to create something along these lines. Obviously, you would have your own list of types, your own implementation workflows for each type, your own approval structure for each type, and your own language in all of the notices, so it’s not as if someone could build all of that out in a way that would work for everyone. But for anyone who would like a set of parts to play with to get things started, here is an Update Set that contains everything that we have put together during this exercise.
“Beginning in itself has no value; it is an end which makes beginning meaningful; we must end what we begun.” — Amit Kalantri
Last time, we added the Requested Item table to our Service Account dashboard so that we could see the pending requests, but we left off with a field name error and the desire to add a few item variables to the table using some Scripted Value Columns. Today, we will fix up that little error, and add some columns to both tables, hopefully wrapping things up, at least for this version of the dashboard.
In our field list for the new table, we had included the field name opened, when in actuality, the correct field name for the opened date/time is opened_at. That’s an easy fix, and now our field list looks like this:
number,opened_at,request.requested_for,stage
While we are in the configuration updating the field lists, let’s also add the new link to the original request to the field list for the Service Account table, which will now look like this:
Also, since that new column will be a link to the sc_req_item table, let’s map that table to the ticket page by adding a new entry to the reference map.
That should take care of the errors and oversights. Now let’s take a look at adding some item variables to the pending request view. We put some catalog item variables on an example table not too long ago, so let’s just follow that same approach and maybe steal a little code from that guy so that we don’t end up reinventing an existing wheel. Here is the script that we built for that exercise.
var ScriptedCatalogValueProvider = Class.create();
ScriptedCatalogValueProvider.prototype = {
    initialize: function() {
    },

    questionMap: {
        cpu: 'e46305fbc0a8010a01f7d51642fd6737',
        memory: 'e463064ac0a8010a01f7d516207cd5ab',
        drive: 'e4630669c0a8010a01f7d51690673603',
        os: 'e4630688c0a8010a01f7d516f68c1504'
    },

    getScriptedValue: function(item, config) {
        var response = '';
        var column = config.name;
        if (this.questionMap[column]) {
            response = this.getVariableValue(this.questionMap[column], item.sys_id);
        }
        return response;
    },

    getVariableValue: function(questionId, itemId) {
        var response = '';
        var mtomGR = new GlideRecord('sc_item_option_mtom');
        mtomGR.addQuery('request_item', itemId);
        mtomGR.addQuery('sc_item_option.item_option_new', questionId);
        mtomGR.query();
        if (mtomGR.next()) {
            var value = mtomGR.getDisplayValue('sc_item_option.value');
            if (value) {
                response = this.getDisplayValue(questionId, value);
            }
        }
        return response;
    },

    getDisplayValue: function(questionId, value) {
        var response = '';
        var choiceGR = new GlideRecord('question_choice');
        choiceGR.addQuery('question', questionId);
        choiceGR.addQuery('value', value);
        choiceGR.query();
        if (choiceGR.next()) {
            response = choiceGR.getDisplayValue('text');
        }
        return response;
    },

    type: 'ScriptedCatalogValueProvider'
};
We can make a copy of this script and call ours ServiceAccountDashboardValueProvider. Most of this appears to be salvageable, but we will want to build our own questionMap using the columns that we will want to use for our use case. To find the sys_ids for the variables that we will want to use, we can pull up the Catalog Item to get to the list of variables, and then pull up each variable and use the context menu to snag the sys_id for each one.
Once we gather up all of the sys_ids, we will have a new map that looks like this:
That should be enough to make things work; however, in our case the types of variables involved will return the display value directly, so we do not need to go through that secondary process to look up the display value from the value. We can simply delete that unneeded function and return the value directly in this instance. That will make our new script look like this:
var ServiceAccountDashboardValueProvider = Class.create();
ServiceAccountDashboardValueProvider.prototype = {
    initialize: function() {
    },

    questionMap: {
        account_id: '59fe77a4971311100362bfb6f053afcc',
        type: 'f98b24a4971711100362bfb6f053afa0',
        group: '3d4fbba4971311100362bfb6f053afe3'
    },

    getScriptedValue: function(item, config) {
        var response = '';
        var column = config.name;
        if (this.questionMap[column]) {
            response = this.getVariableValue(this.questionMap[column], item.sys_id);
        }
        return response;
    },

    getVariableValue: function(questionId, itemId) {
        var response = '';
        var mtomGR = new GlideRecord('sc_item_option_mtom');
        mtomGR.addQuery('request_item', itemId);
        mtomGR.addQuery('sc_item_option.item_option_new', questionId);
        mtomGR.query();
        if (mtomGR.next()) {
            response = mtomGR.getDisplayValue('sc_item_option.value');
        }
        return response;
    },

    type: 'ServiceAccountDashboardValueProvider'
};
Now all we need to do is to pull up the dashboard under the new configuration and see how it all looks. First, let’s take a look at the new column that we added for the original request.
There is only data there for the most recent test, but that’s just because that field did not exist on the table until recently. Now let’s click on the Pending state and see how our item variables came out.
Very nice! OK, I think that about does it for this version of the sample dashboard. There is still some work that we could do on the Fulfiller perspective, and it might be nice to add an Admin perspective that showed everything, but since this is just an example of what might be done, I will leave that as an exercise for those who might want to play around with things a bit. Next time, let’s take a look at what we now have up to this point, and at what might be left to do before we can wrap this one up and call it done.
“There are no big problems, there are just a lot of little problems.” — Henry Ford
Last time, we wrapped up the initial table configuration for our Service Account dashboard and tested everything out to make sure that it all worked as intended. We also identified the fact that we need to add a second table to the configuration so that we can see the pending requests that have not yet created a record in the Service Account table. Before we do that, though, I decided that it would be useful to add a link to the original request on the Service Account record so that you could easily pull up the request from the account record.
To populate the field during the creation of the Service Account record, I pulled the Service Account Request Fulfillment flow up in the App Engine Studio and added an entry to drag in the data pill from the original request in the trigger.
With that out of the way, we can return our attention to adding the new table to the dashboard configuration. To do that, we go back to the Content Selector Configuration Editor that we recently updated to correct a few issues related to Scoped Applications. Before we do that, though, let’s pull up the list of Requested Items and build ourselves a filter that we can use to show all of the open items for Service Accounts requested by the current operator.
We are looking for active items requesting the Service Account catalog item requested by the currently logged on user. The filter can be found in the URL under the sysparm_query parameter.
Of course, we have to do a few change alls to get rid of all of the double encoding present in this version, but once we do that we will have a workable filter for the pending state on our newly added table.
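Undoing that double encoding amounts to two rounds of URL-decoding. A quick sketch, assuming the filter data itself contains no literal percent signs:

```javascript
// Decode a doubly URL-encoded sysparm_query value pulled from a list
// URL. Two passes: '%255E' -> '%5E' -> '^'. Characters that were only
// singly encoded survive the second pass unchanged.
function decodeFilter(raw) {
    return decodeURIComponent(decodeURIComponent(raw));
}
```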
Now let’s jump into the editor and add our new table.
For now, let’s assume that we don’t want anything to appear in the Active and Retired states, and we can use the same technique that we used on the original table when we didn’t want anything to appear for that table in the Pending state. We’ll set the field list to simply number, and set the filter to number=0.
For the Pending state, we add a few more relevant fields and use the filter we snagged from the list URL earlier.
We can do a little more with this, but let’s save what we have for now and take it out for a spin, just to make sure that everything is still in working order. Saving the changes should take us to the generated script, which now looks like this.
Now all we need to do is to pull up the dashboard with the modified configuration, click on the Pending state, and take a quick peek.
Well, that’s not too bad. Looks like we screwed up on the field name for the open date, but other than that, things look pretty good. I want to add a few more columns from the catalog item variables anyway, which we can do by configuring some Scripted Value Columns, so let’s fix our little error and deal with those new fields in our next installment.
“You’ve got to think about big things while you’re doing small things, so that all the small things go in the right direction.” — Alvin Toffler
Recently I was playing around with the Content Selector Configuration Editor to create a dashboard for my Service Account Management app, which is a Scoped Application, and realized that the last fix that I put in to make things work with a Scoped Application did not quite go far enough. Looking things over it is quite clear that the original design had only considered support for global scripts, and my first attempt to rectify that oversight did not resolve all of the issues for configuration scripts that were not in the global scope. Today it is time to finish that up and correct all of the other shortcomings in that tool when working outside of the global scope.
For starters, the pick list for available scripts in the current version includes all of the configuration scripts for all scopes. What we really need is to limit that selection list to just those scripts in the current scope. Otherwise, you could potentially be editing a script in one scope while you are working in another scope, which will not end well if it works at all. To limit the list to just the scripts in the current scope, we need to add something like this to the filter:
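The filter itself appears as a screenshot in the original post, but judging from the default-query in the tag below, the added clause would be along these lines:

```
sys_scope=javascript:gs.getCurrentApplicationId()
```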
It would also be good to add a little more information to the help text for that field, so the entire snh-form-field tag now looks like this:
<snh-form-field
    snh-label="Content Selector Configuration"
    snh-model="c.data.script"
    snh-name="script"
    snh-type="reference"
    snh-help="Select the Content Selector Configuration from the current Scope that you would like to edit."
    snh-change="scriptSelected();"
    placeholder="Choose a Content Selector Configuration"
    table="'sys_script_include'"
    default-query="'active=true^sys_scope=javascript:gs.getCurrentApplicationId()^scriptLIKEObject.extendsObject(ContentSelectorConfig^ORscriptLIKEObject.extendsObject(global.ContentSelectorConfig'"
    display-field="'name'"
    search-fields="'name'"
    value-field="'api_name'"/>
That solves one problem, but there are others. When building the new script from the user’s input in the save() function of the widget’s server script, this conditional only reduces the API Name to the root name for global scripts:
if (data.scriptInclude.startsWith('global.')) {
    data.scriptInclude = data.scriptInclude.split('.')[1];
}
This needs to be done for scripts in any scope, so the conditional test should simply go away, leaving the statement to run unconditionally:
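The simplified version appears as a screenshot in the original post, but based on the conditional above, it presumably keeps just the split, applied to any API name. As a standalone sketch:

```javascript
// Hypothetical sketch: reduce an API name like 'scope.Name' to its
// root name for any scope, not just 'global.'.
function toRootName(apiName) {
    return apiName.indexOf('.') > -1 ? apiName.split('.')[1] : apiName;
}
```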
Further down in that same function, this line again assumes that you are working in the global scope:
scriptGR.api_name = 'global.' + name;
The API Name is set for you whenever you save a new script, so this line can just be removed entirely and things will work just fine.
With all of these changes, the new save() function now looks like this:
All in all, not a huge number of changes, but just enough to make things work. I bundled all of the relevant parts into another Update Set that includes these various changes, which you can find here. This component is also a part of the larger SNH Data Table Widget collection, so eventually I will need to publish a new version of that collection out on Share as well.
“On your darkest days do not try to see the end of the tunnel by looking far ahead. Focus only on where you are right now. Then carefully take one step at a time, by placing just one foot in front of the other. Before you know it, you will turn that corner.” — Anthon St. Maarten
Last time, we threw together the beginnings of a configuration script for our Service Account dashboard using the Content Selector Configuration Editor. Now that we have a viable script, we need to create a Service Portal Page that will utilize that configuration. To begin, we will pull up the list of Portal Pages and click on the New button to create a new page.
We will call our new page Service Account Dashboard and give it an ID of service_account_dashboard. Once we submit the form we can pull it back up and use the link down at the bottom of the form to bring it up in Service Portal Designer. Onto the blank canvas we will drag a 12-wide container, and beneath that one, we will drag in a 3/9 container. Into the upper 12-wide container, we will drag in the Dynamic Service Portal Breadcrumbs widget, and into the 3 portion of the 3/9 container, we will drag in the Content Selector widget. In the 9 portion of the 3/9 container, we will pull in the SNH Data Table from URL Definition widget. Now that we have placed all of the widgets, we will need to edit them, starting with the Content Selector.
Here is where we enter the full name of the configuration script that we created last time. Since this is a Scoped application, we need to include the scope with the name so that it can be successfully located. That’s all there is to configuring that widget, as most of the configuration information is contained in the referenced script. Configuring the Data Table widget is a little more involved.
Here we give it a title of Service Accounts and select an appropriate Glyph image. We check the Use Instance Title checkbox to get our title to show up, and we leave all of the rest of them unchecked. Once we save that and save the page, we should be ready to try it out, which we can do easily enough with the View page in new tab button up in the upper right-hand corner.
So far, so good. The default selection is active Service Accounts from the requester’s perspective, and you can see all of the account records from our failed and successful test submissions. I went ahead and retired one of them so that we could test the Retired state. Let’s click on the Retired button and see how that comes out.
That looks good as well. Now let’s try the Pending state, which should come up empty for the Service Account table, as pending requests have not gotten far enough along in the process to have created the record in that table yet.
Well, that’s not right! But you knew things were going too well at this point and it was about time for something to go horribly wrong. This is just a problem with our Filter, though, and should be easily remedied. We used the filter 1=0, which obviously did not work, so let’s try using an actual field from the table and do something like this in our config file:
filter: 'number=0',
Before we add that to all of the pending configurations, let’s pull up the dashboard again and see how that looks.
That’s better. Of course, to actually see the pending Service Accounts, we will need to add another table to our configuration. We can go back into the Content Selector Configuration Editor to do that, and then go back to the dashboard and check it out. That sounds like a good exercise for our next installment.
Last time, we were about to throw together a little dashboard of Service Account information when we ran into a little problem with the Content Selector Configuration Editor. Actually, it turned out to be a problem with the snh-form-field tag, but now that we have taken the time to fix that, we should be able to get back to where we were and continue on. So let’s get back into the configurator tool and try one more time to create a new configuration script.
Well, that’s much better! Now we can see all of the fields again in the modal pop-up as well as both of the buttons, so things are back to normal with the newer version. After creating the Requester perspective, we go through the process again to create the Fulfiller perspective.
Now, we could have used slightly different names, such as Owner and Provider, but again, this is just a sample of what could be; your mileage may vary. One thing that we did do on the Fulfiller perspective, though, was to add the itil role so that only actual fulfillers would have access to that portion of the dashboard.
Next, we need to add some states, and for our purpose, the states of Active, Retired, and Pending should suffice.
With that out of the way, now we can start completing the Tables section. Clicking on the Add a new Table button in the Requester tab will bring up the modal Table Selector pop-up.
Once the Table has been added, we can fill in all of the rest of the configuration data.
For the Active state, we will use the following fields:
For the Retired tab, we will just change the above filter from active=true to active=false. Everything else can remain the same. For accounts in the pending state, there will be no record on the Service Account table just yet, so we can just set the filter to 1=0, which should always find no records. To see the pending accounts, we will need to add another table. We can deal with that later, though, so for now let’s just focus on the Service Account table and then see how it all comes out.
Basically, we go through pretty much the same process for the Fulfiller tab, and once we save all of our input, we end up with the following configuration script.
Now all we need to do is to create a Portal Page that will use the configuration script and we can take it out for a spin. That sounds like a good project for our next installment.
“We learn from failure, not from success!” — Bram Stoker
A while back I was working on my Collaboration Store project when I discovered a problem with the SNH Form Fields when running on my Tokyo instance. At the time, I was not able to diagnose the source of the problem, but I did manage to come up with a work-around, which I implemented on the page that I was developing at the time. What I did not do was to go back and refactor all of the other widgets that utilize the snh-form-field tag to implement the work-around on those as well, nor did I invest any time in actually hunting down the source of the actual problem with the tag, correcting it, and producing a new version.
Recently, I was working on my little Service Account Management app, and was rudely reminded of this unfortunate oversight. Initially, I thought that there was something wrong with my modal pop-up box, but after further review I realized this was the same snh-form-field issue that I had run into earlier on the other project. Clearly, it was long since time to address it.
To implement the work-around, I brought up a list of all of the Service Portal widgets that contained the text ‘snh-form-field’ in the Body HTML template property. Then one by one, I pulled them up in the editor, searched for the tag, and then wrapped a SPAN around each one, mitigating the problem. For example, here is the original HTML for the Aggregate Column Editor widget:
It was not difficult work, but it was rather tedious. Eventually, I got through the entire list. Then I put together a new Update Set for the SNH Data Table Widgets and posted the new version (2.4) out on Share. Unfortunately, it wasn’t until I had already posted it out there that I realized that I had left out a critical widget in the build, so I had to build the Update Set a second time. It did not look like there was any way to replace the Update Set on Share for the 2.4 version, so I called the corrected Update Set 2.4.1. But that is not a legal version name on that site, so on Share, that version is known as 2.41. Anyway, it’s out there now, so if you are running, or planning to run, on Tokyo or Utah, you should definitely go out to Share and pull down the latest Update Set. But stay away from version 2.4, because that was just an error, and shouldn’t even be out there.
Oh, and if you run into any issues with the 2.4.1 version, please provide some details in the discussion section on Share, or in the comments below. Thanks!