“Never put off until tomorrow what you can do the day after tomorrow.” — Mark Twain
Last time, we started reworking our primary fulfillment Flow to resolve the problem that we encountered during the testing of our second example Subflow. In the midst of that process, we found that we needed to rework the inputs on our custom Action that calls the type-specific Subflow, so let’s fire up our old friend the Flow Designer and take care of that now. The first thing that we will want to do is to delete the existing Requested Item record input.
Once that is gone, we will want to replace it with a new input for the Catalog Task.
Next, we need to jump down to the script step and modify its inputs in the same way and map the new catalog task input to the matching action input.
And finally, we will want to modify the script to pass in the new catalog_task variable instead of the original requested_item variable.
(function execute(inputs, outputs) {
    try {
        var result = sn_fd.FlowAPI.getRunner()
            .subflow(inputs.subflow)
            .inForeground()
            .withInputs({catalog_task: inputs.catalog_task})
            .run();
        var returned = result.getOutputs();
        for (var name in returned) {
            outputs[name] = returned[name];
        }
    } catch (e) {
        outputs.success = false;
        outputs.failure_reason = 'Subflow execution failed with error: ' + e.getMessage();
    }
})(inputs, outputs);
Now we can Save and Publish our modified Action and get back to our work on the primary Flow, but before we can test anything, we will need to modify all of our Subflows. They are not expecting to receive a Catalog Task right now, and since we are now sending one in, the logic in both of our example Subflows will need to be altered accordingly. But let’s finish up the primary Flow before we worry too much about that.
Now that our Create Service Account custom Action takes a Catalog Task as an input, we can drag the data pill from the Look Up Record step into the appropriate input and click on the Done button.
At this point, the rest of the primary Flow should continue to work as originally conceived. Future steps will not execute until both parallel branches have been completed, so automated Subflows should end when the second branch is completed and manual Subflows should end when the first branch is completed. Either way, the rest of the primary Flow should execute as before once both branches have finished their work.
The next thing that we will need to do before we can resume testing is to modify one or both of the example Subflows. Both of those will require a bit of redesign to accommodate our new approach, so that sounds like a great subject for our next installment.
“Don’t waste your time being upset about something you can’t change. Begin again and do it better this time.” — Joyce Meyer
Last time, we finished up our second fulfillment Subflow and took it out for a spin. Unfortunately, our first test of the new process died before it ever really got off the ground. It turns out that you cannot launch a Subflow from a script when that Subflow contains a manual task that may not be completed for some time. Our Subflow ended with an error message indicating that the Subflow was in a wait state, which was technically true, as the Subflow was waiting for the manual task to be completed.
This is no small problem. If we cannot convince the script to wait for the manual task to be completed, then we are going to have to rethink the entire approach to the primary Flow. We don’t really need a task to be created when the process is automated, but we could still open up the task in the primary Flow, and then pass the task to the Subflow instead of the Requested Item. This would mean that any automated Subflows would have the responsibility of closing the task, but that would work. The problem, of course, is that if you check the Wait box on the task, no further steps in the process will execute until the task is closed, and the task isn’t set to be closed until you execute those future steps. That’s not going to work!
Fortunately, the Flow Designer allows you to run separate steps in parallel. We can create the task in one branch and then call the Subflow in a separate parallel branch. This should work, although it is a complete redesign of everything that we have put together so far in the fulfillment realm. Oh, well … as one of my previous supervisors used to tell me all the time, if it was easy, then anyone could do it!
So here is what we need to do: instead of calling the appropriate Subflow and passing in the Requested Item, we will need to create two parallel paths, one responsible for creating the task and waiting for its completion and the other responsible for calling the appropriate Subflow. Since you cannot drag data pills from one parallel branch to another, our second branch will need to wait for a bit to ensure that the task has been created, and then read the task from the database so that it can have a task data pill to pass in to the selected Subflow. This changes quite a number of artifacts that we have already built, so let’s dive in and start hacking things up. To begin, let’s pull up our primary Flow in the App Engine Studio and insert a Do the following in parallel Flow Logic step right after we fetch the Service Account Type record.
Under the first parallel path we will create our task using the Create Catalog Task action.
We will assign the task to the fulfillment group specified in the type record so that the task gets routed to the appropriate fulfillment group. For automated fulfillment processes, the automation will need to close the task with the appropriate closure code before returning to the primary Flow, which should keep the task from showing up in that group’s active work queue.
Our other branch will need access to that task record, but since you cannot drag data pills from one parallel branch to another, we will need to fetch the record from the database. Before we do that, though, we will want to make sure that the other branch has created the record, so we will want to wait a bit before we go searching for it. We can do that with another Flow Logic option, Wait for a specific duration of time.
Ten seconds may be more time than we need, or it may not be enough in some circumstances, but it is a good place to start and we can adjust it in the future if needed. After waiting for the record to be created, the next thing that we will want to do is to fetch the record so that we can pass it in to our appropriate Subflow. To do that, we will use a Look up record action.
Now that we have our task record in hand, we can drag our existing Create Service Account action in under this second parallel branch as the next step. However, the current version of the action is expecting a Requested Item record as input, and under this new approach, we are going to want to pass in the Catalog Task instead. We will need to modify that action to alter the input, which will also affect all of our existing Subflows as well. That sounds like a good subject for our next installment.
“The only difference between a problem and a solution is that people understand the solution.” — Charles Kettering
Last time, we started building out the second of our example fulfillment Subflows when we determined that we needed to build a custom Action to clear out the password value from the new Catalog Item Variable that we created so that the fulfilling technician could provide the new account’s password to the Flow. This is a relatively simple Action, so let’s jump right into the Flow Designer and get to it.
To begin, let’s click on the New button and select Action from the drop-down list.
We will give our new Action the name Clear Password From Task, and give it a single input, the task record that contains the password variable.
Since all we want to do here is to clear the value from the password variable, there is no need for any outputs, so we will just add a single Script step with a bit of code to blank out the variable on the task record.
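Something along these lines should do the trick (a minimal sketch, assuming the Action input is named task_record and the variable is the account_password variable that we created for the Catalog Item):

(function execute(inputs, outputs) {
    // Blank out the password variable on the task record, as it is no longer needed
    inputs.task_record.variables.account_password = '';
    inputs.task_record.update();
})(inputs, outputs);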
For that to work, we need to map the task record from the Action inputs to the script step’s inputs.
And that is all that there is to that. At this point, all we need to do is to Save and Publish our new Action and then we can jump back into our Subflow and use it in our next step.
Now all we have to do is to assign values to the Subflow outputs, which should complete the successful branch of our Subflow. After that, we will need an Else condition to handle the possibility that the task was not completed successfully, but first, let’s wrap up the successful branch.
Here we run into a bit of a snag, as you cannot use the password value pill to populate the password output because the password output variable is a 2-way encrypted value and the password value pill is a simple string. To solve that problem, we resort to scripting.
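One way to do it is with a small script step along these lines (a sketch, assuming the string pill is mapped to an input named password) that simply hands the value to the encrypted output and lets the platform deal with the conversion:

(function execute(inputs, outputs) {
    // Assign the plain String value to the 2-way encrypted output and let the
    // platform handle the conversion (input/output names are assumptions)
    outputs.account_password = inputs.password;
})(inputs, outputs);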
This may or may not work — we will have to try it. If it doesn’t properly convert, then we will have to come up with something else. But let’s give this a shot and see what happens.
So now all that is left is to assign the Subflow outputs in the event of a failure. We do that under an Else condition, and grab the reason from the task’s closure notes.
Once we Save and Publish our new Subflow, we can jump back into the Service Catalog and place another order to try it out. This time, we will select Active Directory from the list of types so that we can drive the process through our new Subflow and see what happens.
After clicking on the Request button, we are brought to our new Request record.
Clicking on the item will bring us to the Requested Item record.
At this point, we can see that something went terribly wrong somewhere along the line. The item is sitting at the Closed Incomplete state, and we never even got the chance to work the manual task that should have been created by our Subflow. That’s obviously not an optimal outcome for our test, but it should make for a fun debugging exercise for our next installment!
“There is joy in work. There is no happiness except in the realization that we have accomplished something.” — Henry Ford
Last time, we fixed a few problems with our ServiceNow account password, which wrapped up the work on that particular Subflow. That was also enough to prove out our primary Flow for fulfilling the Catalog Item we built to request a new Service Account. Now it is time to build another Subflow for our other example account type, Active Directory. This time, instead of an automated fulfillment process (which, of course, is possible for A/D with some integration), we are going to assume that some technician has to create the account, so we will be issuing a Catalog Task to the folks who would be doing that. This is just to demonstrate the manual fulfillment process as opposed to the automated approach that we used last time.
Before we do that, though, let’s take a quick side trip to add Stages to our primary Flow. Flows associated with Catalog Items should always identify the fulfillment stages to provide insight into the status of the fulfillment process. We can add Stages to the Flow in the App Engine Studio by pulling up the Flow, clicking on the More Actions menu (‘…’) in the upper right-hand corner, and selecting Flow stages from the drop-down menu. Select Requested Item from the Add stages from a template selection and then click on the Add stages button. This will pull in all of the Stages defined for a Requested Item. Now all we have to do is to click on the Add a Stage button in the space above each step that begins a new Stage and choose the appropriate stage from the available list.
I selected Fulfillment for the initial Stage, Delivery for the step that creates the record in our account table, and Completed when the password email is sent out. Also, not shown on that screen shot, I selected Request Cancelled when the Service Account could not be created. That should be enough to get us started, but we may find the need to fine tune that a little bit once we see how it all comes out in testing.
There is one other thing that we need to do before we jump into building out the manual Subflow for Active Directory accounts. Since we are assuming that a technician will be creating the account and setting the initial password, we will need some way for the technician to provide the password back to the primary Flow so that it can be communicated to the requestor via the notification email. One way to do that would be to add one more variable to our Catalog Item and hide it everywhere except on the task record. Once the task is closed, we can then lift the value off of the task, pass it back to the primary Flow and then clear the value on the task record, as it will no longer be needed. With that out of the way, we should now be able to build out our new Subflow.
To begin, let’s pull up our old friend the Flow Designer and create a new Subflow called Active Directory Service Account Creation. We will use the same inputs and outputs that we created for our ServiceNow Subflow, as that is what our calling Action requires.
The first thing that we will want to do is to create the task, and there is an action already set up to do just that, so we select that and populate all of the appropriate fields.
We also want to check the Wait box, as we do not want the rest of the Flow to run until the task has been completed and we can see how it turns out. And we want to snag the variables that we need from the request so that they will be available on the task record.
Once the task is no longer active, we will want to check to make sure that it was completed successfully. We can do that with an If condition that looks to make sure that the State of the ticket is Closed Complete. If it is, then we will want to grab the password entered by the technician so that we can pass it back to the calling Flow. To do that, we use the Get Catalog Variables action.
Once we have a data pill with the password value that we can use for our Subflow outputs, we do not need to have the value in the database. We will want to clear the value from the new password variable that we just added. Unfortunately, while there is an existing Get Catalog Variables action shipped with the platform, there is no corresponding Set Catalog Variables action that we can use to remove the value of the variable. For that, we are going to have to create yet another custom Action.
Even though an Action that clears the value of a variable is relatively simple, creating a custom Action is a little bit of an involved process. This seems like a good place to stop for now, then, and we can jump right into creating that new Action right at the top of our next installment.
“Mistakes are part of the dues one pays for a full life.” — Sophia Loren
Last time, we did a little testing on our new Catalog Item fulfillment Flow and discovered that the password was not set up correctly on the requested ServiceNow Service Account that was created during the process. Before we get too worked up about that problem, though, we should at least celebrate the fact that the primary Flow worked as intended, calling the type-specific Subflow to create the account, and then notifying the requester and closing out the Requested Item. The Subflow that we set up for ServiceNow accounts also worked for the most part, creating the account and sending the details back to the main Flow. Of course, none of that really matters if you cannot successfully sign on using the password sent to the requester. So we do have to deal with that today, but other than that, things are actually looking pretty good.
So, what to do about the password issue? Well, like many things on the Now Platform, there are a variety of ways to go here. Using script, we can set the clear text password by using the setDisplayValue method instead of the setValue method. To do that, though, we would have to create the account first, and then add some kind of additional step where we could dip into scripting, most likely some kind of custom action. If we have to add another step, though, there is already an Update Password action that we can use, but rather than a String input, it requires that you pass in a 2-way encrypted value. Although that adds its own unique complexities to the process, it is actually a better way to go, as it keeps the clear text password value out of the execution log where malicious actors might stumble upon it. So let’s go that route, as it not only addresses this issue, but improves the process overall.
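For reference, had we gone the scripting route instead, the custom Action step might have looked something like this sketch (the input names here are assumptions):

(function execute(inputs, outputs) {
    // Sketch of the scripted alternative: set the clear text password on the
    // new user record with setDisplayValue(), which lets the platform encode
    // the value, where setValue() would store the raw string unmodified.
    var userGR = new GlideRecord('sys_user');
    if (userGR.get(inputs.account_sys_id)) { // assumed input name
        userGR.setDisplayValue('user_password', inputs.account_password); // assumed input name
        userGR.update();
    }
})(inputs, outputs);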
The first thing that we need to do then is to change the variable type for every occasion of the account_password variable from String to Password (2 Way Encrypted). This affects the primary Flow, all type-specific Subflows, as well as the Action that we created to generate the password. Initially, I thought that the password field would have to be encrypted before delivering it to the Action outputs, but it turns out that this is all handled internally once you change the field type, and all you have to do is to return the clear text password just as we were doing originally.
Now that we are generating and passing around an encrypted password, we can remove the password field from the original account creation step in the Subflow and then add a new step right after the account has been created to set the password.
That takes care of the password issue, but there is still one more thing that we may need to do. Since we switched from passing around the clear text password to using the encrypted password, we might have to decrypt it before we send it out to the requester. For now, let’s just drag in the account_password pill from the account creation step and see what happens. If that doesn’t work out for us, we will have to do a little more research and see what we have to do to get the encrypted password back into clear text for the email step.
That completes all of the modifications so now all that is left is to place another order and see if all of the stuff that worked last time is still working and if we are now sending the requester a password that they can actually use. We’ll go through the same steps that we went through last time, so no need to repeat all of that here, but let’s take a look and see how things turned out when we repeated all of those ordering steps.
Well, there is good news and bad news here. The good news is that the switch to Password (2 Way Encrypted) has indeed corrected the issue with the password. The password that would have been sent to the requester is, in fact, the password that you can use to log on to ServiceNow with the new account. The bad news is that the special characters in the password caused a problem with the HTML in the password email. The simple solution to that would be to change the email body format from HTML to plain text, but I don’t see any way to do that with the options available on the Send Email action.
I took a look at all of the data pill transform options, but there doesn’t seem to be a built-in function to sanitize a string for HTML, so I ended up wrapping the password data pill with a <pre> tag and that seems to have resolved the issue.
There are still a few things that we could do to improve things a bit, but everything seems to be working at this point. We still have to build out another Subflow for our second account type example, though, so let’s jump right into that next time out.
“Never discourage anyone who continually makes progress, no matter how slow.” — Plato
Last time, we built out a fulfillment Subflow for one of our two example Service Account types, so now we can build the primary Flow that will call that Subflow and do all of the other work required to fulfill the request. Although you can configure a Flow to call a Subflow using the Flow Designer, you have to specify the Subflow during the development process. In our case, the Subflow that we will want to use will be dependent on the type of Service Account requested, so we will not know which Subflow will need to be called until execution time. Ideally, we would want to look at the type requested, read the record for that type to get the Subflow, and then execute the Subflow specified in the type record. Since there doesn’t seem to be a way to do that out of the box, we will need to build out a custom Action to make the Subflow call via script. I created a simple Action called Create Service Account that takes the name of the Subflow and the Requested Item record as input and returns the same outputs as our fulfillment Subflows. The script for that Action looks like this:
(function execute(inputs, outputs) {
    try {
        var result = sn_fd.FlowAPI.getRunner()
            .subflow(inputs.subflow)
            .inForeground()
            .withInputs({requested_item: inputs.requested_item})
            .run();
        var returned = result.getOutputs();
        for (var name in returned) {
            outputs[name] = returned[name];
        }
    } catch (e) {
        outputs.success = false;
        outputs.failure_reason = 'Subflow execution failed with error: ' + e.getMessage();
    }
})(inputs, outputs);
All it does is launch the Subflow with the Requested Item record passed and returns whatever is returned by the called Subflow. This will essentially perform the same function as the Call Subflow action, but with the added benefit of allowing us to pass in the name of the Subflow to be called rather than have it hard-coded in the Flow.
Now that we have the ability to call a configured Subflow, we can jump back into the App Engine Studio and build out the primary Flow. On the dashboard for our application, we can scroll down to the Logic and automation section and then click on the Add button right after the section header.
Once you click on the Add button, a selection list appears, with Flow being the first option.
On this screen, we simply select Flow from the available options, which takes us to the next screen.
On this screen we enter Service Account Request Fulfillment in both the Name and the Description fields and then click on the Continue button.
The next screen just comes up long enough for the basic Flow to be created, after which we are brought to the successful completion screen.
At this point, the Flow now exists, but it has no steps, so we will want to click on that Edit this flow button to start building out the logic of the Flow. We will select Service Catalog as the trigger, and as we did with our sample Subflow, the first thing that we will want to do is to gather up the variable values from the Requested Item.
This time, we will need the type, the responsible_group, and the account_id from the request. The next thing that we will want to do is to read the Service Account Type record for the requested type, so we will select Look Up Record for our action.
We are looking for the record where the Name field matches the type selected on the Catalog Request. Once we have the variable values from the request and the matching type record, we can call our type-specific Subflow using the Action that we created for that purpose.
Now we need to check to make sure that the account was created, so we add an If condition based on the success flag returned by the Subflow.
If the account was created successfully, we will want to create a record for the account in our Service Account table, but before we can do that we need to fetch the record for the responsible group from the sys_user_group table. We have the name of the group from the catalog item variables, but we need the sys_id of the record to populate the Service Account record. We can do this with another Look Up Record action, searching the table for a record with the same name.
With the user group record now in hand, we have enough information to create the new record in our Service Account table.
Once we have created a record for the new Service Account, we will want to inform the user that the account has been created and is available for use. I took a short-cut here and just stubbed out a simple email, but ideally you would want to use a mail template and include some boiler-plate verbiage about company policies on the use of Service Accounts, security concerns, and related information on the owner’s responsibilities. All of that would ultimately be up to anyone attempting to implement such a system, and has little relevance to the workings of the process, so I will leave that to others and just include something simple as an example.
The password for the account should be sent out in a separate email, but before we do that, let’s go ahead and close the Requested Item now that the account has been created and the requester notified. To do that, we will select Update Record for our action and then drag in the pill for the Requested Item record, which will then populate the Table value.
Now all that is left to do is to send the password for the account to the requester. I took another short-cut here in that the only contents of the body of the email is the password itself with no other information, but again, this is only a sample. An encrypted email from a template would obviously be preferable here, but this at least provides a placeholder for performing this task in a much better way.
That completes the process for a successful account creation, but if the account could not be created for any reason, we still need to close out the Requested Item with that information. To do that, we will add an Else condition to our If and then insert one more step under the Else.
And that’s all there is to that. Now that we have our completed Flow, we can go back into our Catalog Item and specify this Flow for fulfillment. Once that has been done, we can finally request the item to generate a Requested Item record that we can use to test all of this out. That will probably be a little bit of an adventure, so let’s save all of that for our next time out.
“One step at a time is all it takes to get you there.” — Emily Dickinson
Last time, we built out the majority of the Catalog Item that we are creating for the purpose of requesting a new Service Account. All that is left for us to do now is to create the process that will fulfill the request once it has been made. Because our intent is to create a single Catalog Item for any type of account, and each type of account will have its own unique account creation process, we will want to build a generic workflow of some kind that will look at the type requested and then launch the appropriate fulfillment process based on that type. Before we jump in and start creating things, though, we will need to come up with a strategy that will facilitate combining a generic outer process with a type-specific inner process.
Assuming that each type will have its own Subflow unique to the type, we will want to store a link to that Subflow on the type record that we created earlier. This way, the generic process can look at the type requested, read the record for that type, and then grab the link to the Subflow so that it can be launched. To make that work, we will need to devise some kind of standard for the Subflow in terms of inputs and outputs so that the generic process can consistently communicate with any given fulfillment Subflow. For input, the Requested Item record should suffice, but the outputs will depend on how much work will be delegated to the Subflow and how much will be handled by the primary process.
In addition to the creation of the account, our process should also perform a number of other tasks such as notifying the user of the account’s creation, sending them the assigned password, creating the record in the accounts table, and closing the request. Since everything except the account creation is universal, our Subflow should be limited to just the steps necessary to create the account, and all other activities should be handled by the primary request. For that to work, the Subflow will need to return sufficient data to the primary process so that it can successfully complete all of the other work. We also have to allow for the possibility that the account could not be created for whatever reason, and that information will need to be communicated back as well. This may change over time as we get into the weeds, but for now, I think that the following should be able to handle everything needed.
Success flag
Failure Reason
Account ID
Account Password
Account Owner Instructions
Since we will need a working Subflow to test out the primary process, it might be wise to start with that first, and then circle back to the primary process once we have a working Subflow to call. Before we do that, though, we need to figure out a way to generate a password. If you search through all of the Script Includes bundled with the product for anything that has the word password in the name, you can find a lot of password generation scripts already out there. Unfortunately, these are all in the global scope, and their access is limited to the global scope, so we can’t call any of these from our Scoped Application. If you search the Internet for password generation scripts, you can find even more, but most of these are much more sophisticated than what I was looking for. Ultimately, I borrowed a few things from here and there and came up with a simple Action that had no inputs, one output, and a single script execution step with the following logic.
(function execute(inputs, outputs) {
    var upper = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ';
    var lower = 'abcdefghijklmnopqrstuvwxyz';
    var number = '0123456789';
    var special = '!@#$%&*()_+<>[].~';
    var source = upper + lower + number + special;
    outputs.password = '';
    while (outputs.password.length < 32) {
        // Bound the index at source.length; the original source.length + 1
        // could produce an out-of-range index and a wasted iteration
        outputs.password += source.charAt(Math.floor(Math.random() * source.length));
    }
})(inputs, outputs);
This does not guarantee that you will have at least one of each category of character, but with 32 positions, the chances are still pretty good that you will. I ran it a few times to see, and I didn’t think that the results looked all that bad.
dPoxtBDUqaCtFaH2knI]XI0Ts*#1rKyT
xnDRzngKOMy~py8xK[xV]2S2CTkh]RWv
knUHn*0WJ4H_Ww90gq[2TqJmKKb0Xu9v
OcWe4TKQN#0[RM1wj$p$(#_Sp.~CVTiM
>wEE!CbrFGwYccWb#H>+6pyEpZwjcJ%A
Each one seems to have a little bit of everything, so I think it is good enough for what we are trying to do here. So now that we have that in place, we can start working on our Subflow. Unfortunately, you can build a Flow in the App Engine Studio, but not a Subflow (or at least if you can, I cannot figure out how to do that). So, to build out our Subflow, we will have to jump into the old Flow Designer. To create a new Subflow, we select New -> Subflow up in the upper right-hand corner and give it a name of ServiceNow Service Account Creation. The first thing that we will do is create all of the inputs and outputs listed above.
Once we have our inputs and outputs defined, we can jump down in the Actions section and start adding steps to the flow. The first thing that we will want to do is to grab the requested account ID from the Requested Item, so we will select Get Catalog Variables from the list of available actions and drag in the Requested Item pill from the inputs.
For the template, we select our new Service Account Catalog Item, and once we do that, all of the variables defined for that item appear in the Available list, from which we can select account_id. This will give us an account_id pill which we can use in future steps.
The next thing that we will want to do is to find out if the requested account ID is already in use. We can do that by reading the sys_user table, so for our next step we select Look Up Record from the list of actions and select the User table from the list of tables.
Our only condition for this is that the User ID is the requested account ID, which we again drag over from the inputs. Once we attempt to fetch the record, the next thing that we will want to do is to check and see if a record was returned, so for our next step we will select If from the Flow Logic options.
If a record was returned with the requested account ID, then we want to stop the process and return a failure message back to the calling Flow. To do that, we select Assign Subflow Outputs for our next step and fill in the values for the output fields.
Hopefully, the account will not already exist, so we will want to put all of our other steps under an Else condition. The first step under our Else condition will be to generate the password using the Action that we created earlier.
Once we generate the password we have enough data to create the user record, so for our next step we will select Create Record from the list of actions and populate the record with data pills from the input and preceding processes.
Once the record has been created, we need to report back to the calling Flow, so once again we will select Assign Subflow Outputs and populate the appropriate fields.
Now all we need to do is to Save and Publish our Subflow and we will have one example to use for testing. We will still need to create one more for our other example use case, but we won’t need to do that just yet. Let’s see if we can make all of this work with our first example before we run off and build another.
At this point, it would be good to test this out to make sure that it all works, but to do that we will need a valid Requested Item record to send in as input, and since we have not completed our Catalog Item, we can’t really do that just yet. Let’s build our primary Flow first, and then update the Catalog Item to point to that Flow, and then we can publish the item and do some testing. To build the Flow, we can go back into the App Engine Studio, so let’s dive into that next time out.
“You don’t have to be good to start … you just have to start to be good!” — Joe Sabah
Last time out we really did not accomplish anything of any consequence, but today let’s see if we can actually make some progress on something of real value. We need to create a process that will run every so often and check to make sure that all of the instances in the community have all of the latest content. We can use the Flow Designer to create and schedule a flow that will run daily on the Host instance and compare the artifacts present on each Client instance with the artifacts present on the Host, and then send over any missing items that never made it over originally for whatever reason. Doing it this way will eliminate the need to track and record errors when they happen, since we will just compare the Client inventory with that of the Host and correct any discrepancies. We don’t really need to know what failed or why.
We could make all of the REST API calls to the Client instances using the Integration Hub, but since that is a collection of optional products, not every instance will have all of those components installed and we would not want to make that a prerequisite for the product. To make this work on any ServiceNow configuration, we will want to roll our own calls via Javascript, and we will just use the Flow Designer to schedule a simple flow that loops through all of the Client instance records and then calls a Script Action that will do all of the heavy lifting. Before we build the Action though, we will want to build a Script Include function that can be called in the Action. Since this will be a significant script, we should probably create a new Script Include specific to this purpose rather than adding a number of new functions to any of our existing Script Includes. We can call our new Script Include InstanceSyncUtils and create a simple function called syncInstance with a single argument of the sys_id of the record for the instance to be synced. For now, we can just stub out the function with a simple logging statement and then circle back later to fill in the details. Right now, we just want to have something to call in our Flow Designer Action.
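A simple stub along these lines will do for the moment (a sketch; the log message is just a placeholder):

var InstanceSyncUtils = Class.create();
InstanceSyncUtils.prototype = {
    initialize: function() {
    },

    // Placeholder implementation; the real synchronization logic comes later
    syncInstance: function(instanceId) {
        gs.info('InstanceSyncUtils.syncInstance: Syncing instance ' + instanceId);
    },

    type: 'InstanceSyncUtils'
};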
With that out of the way, we can now fire up the Flow Designer and navigate to New -> Action. We will call our new Action Sync Instance and configure a single Input called Instance ID.
For the Script step, we will just pass the Input to our new Script Include function:
(function execute(inputs, outputs) {
    var isu = new InstanceSyncUtils();
    isu.syncInstance(inputs.instance_id);
})(inputs, outputs);
There is currently nothing returned by the function, so there is no need to define any Outputs. At this point, all we need to do is to Save and Publish the Action before turning our attention to the actual Flow itself.
To build a Flow, go back to the Home Page of the Flow Designer and select New -> Flow. On the resulting pop-up Flow Properties panel, enter all of the details for the Flow and set the Run As value to System User.
Since we want this Flow to run on its own every day, we can set the Trigger to Daily, and the Time to 12:30, which should run it in the middle of the lunch hour in the Host time zone, when most instances should be available.
Since we only want this Flow to run on the Host instance, the first thing that we need to do is to create a Flow Variable that indicates whether or not this instance is the Host. To set the value of the variable, we add a Set Flow Variables step that uses the following script to determine if this instance is the Host based on System Properties.
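Something like this one-line script should do it (the name of the System Property that identifies the Host instance is an assumption here, patterned after this application's scope):

return gs.getProperty('instance_name') == gs.getProperty('x_11556_col_store.host_instance');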
The next step then will be a conditional that checks the value of the new Flow Variable.
Failing this test will terminate the Flow, but if this is the Host instance, then the next thing that we need to do is gather up all of the Client instance records and then loop through them and call our new Action for each one in turn.
Inside the following For Each Item loop, we can then invoke our new Action, passing in the sys_id of the current record.
That completes the work on the Flow, and all we need to do now is to Save and Activate the Flow. At this point, the Flow will kick off every day at 12:30 local time, but it won’t really do much except put a few entries in the System Log. The real work will be done in the Script Include, and since that’s a major effort in and of itself, we’ll leave that for our next exciting episode.
“There are two ways to write error-free programs; only the third works.” — Alan J. Perlis
Now that we have all of the Javascript functions needed to push a new application version out to all of the Client instances, we need to create an Action that will call the root function and then create a Subflow that will utilize the new Action. Since we already created a similar Action for the process that shares new Client instances with existing Client instances, the easiest thing to do will be to bring that guy up in the Flow Designer and make a copy. To do that, pull up the Action that you want to copy and select Copy from the ellipsis drop-down menu.
Once we have our copy, we need to make the necessary changes to adapt it to our purpose. The publishNewVersion function that we just created takes three arguments (newVersion, targetInstance, and attachmentId), so we will need to modify the Action Inputs to bring in those values. The new Action Inputs now look like this:
Similarly, we will need to modify the Input Variables of the Script Step to pass along these values:
The last thing that we need to do is modify the script itself. The existing script in the cloned Action was pretty simple, as it was just a call to one of our Script Include functions:
(function execute(inputs, outputs) {
    var csu = new CollaborationStoreUtils();
    csu.publishNewInstance(inputs.new_instance, inputs.target_instance);
})(inputs, outputs);
We also want to call a function, and it is in the very same Script Include, so our modification is just a matter of changing the name of the function along with the arguments passed.
(function execute(inputs, outputs) {
    var csu = new CollaborationStoreUtils();
    csu.publishNewVersion(inputs.new_version, inputs.target_instance, inputs.attachment_id);
})(inputs, outputs);
That’s it for the changes, so now all we have to do is Save and Publish the new Action and we are ready to utilize it in our new Subflow. Creating the Subflow will be another Copy operation, this time from our existing New_Collaboration_Store_Instance Subflow. We’ll call our copy Distribute_New_Application_Version, and once again, we will need to make some modifications to adapt it to our new purpose. The original Subflow had a single input, the New Instance. Instead of a new instance, we will be distributing a new application version, so we can replace the New Instance input with a New Version input that will contain the sys_id of the new version record. We will also need the sys_id of the Attachment record, so we will add an Attachment ID input as well.
The target instances for this process will be the same as those for the original Subflow: all instances in the community except for the Host instance and the instance from which the data was provided (which could also be the Host in this case, since Host instances can publish applications just like any other Client instance). To filter out the data providing instance, we will need to add a new step to use the passed version record sys_id to fetch the GlideRecord that will lead us to the instance that produced the application.
This gives us the information that we need to complete the filter on the query of the instance table so that we can gather up all of the instances except for the Host and the instance that produced the application.
Now all we need to do is to replace the original custom Action inside of the loop with the new Action that we just created.
This completes the changes needed to the cloned Subflow, so again, all we need to do is to Save and Publish and we are all ready to go. Now all we need to do is to come up with a way to launch this Subflow whenever a new application version is sent over to the Host instance. We’ll take a look at that next time out, and hopefully release a new Update Set as well so that those of you who have been kind enough to participate in the testing can give this all a trial run.
“Someone’s sitting in the shade today because someone planted a tree a long time ago.” — Warren Buffett
While we wait for additional test results to come in from the corrected Update Set for our latest iteration, it would be a good time to take a look at what we will need to do to complete the last remaining step in the application publishing process. What we have completed so far pushes all of the new application version artifacts to the Host instance. Now we need to build out the process for the Host instance to send out all of those artifacts to all of the other Client instances. Fortunately, we already have in hand most of the code to do that; we just need to clone a few things and make a number of modifications to fit the new purpose.
We have already built a Subflow to push new Client instance information out to all of the existing Client instances during the new instance registration process. Pushing out a new version of an application to those same instances is essentially the same process, so we should be able to clone that guy, and the associated Action, to create our new process. We have also created functions to push over the application record, the version record, and the attached Update Set XML file, and we should be able to copy over those guys as well to complete the process. In fact, we should probably start with the JavaScript functions, as we will need to call the root function in the Action, which will need to be created before it can be referenced in the Subflow.
The functions as written are hard-coded to send everything over to the Host instance. We will want to do basically the same thing, but this time we will be sending things out to various Client instances. Ideally, I should just refactor the code to have the destination instance and credentials passed in so that one function would work in both circumstances, but at this point it was less complicated to just make copies of each function and hack them up for their new purpose. I don’t really like having two copies of essentially the same thing, though, so one day I am going to have to go back and rework the entire mess.
But for now, this is what I came up with when I copied what was once called processPhase5 to create the new publishNewVersion function:
publishNewVersion: function(newVersion, targetInstance, attachmentId) {
    var targetGR = new GlideRecord('x_11556_col_store_member_organization');
    if (targetGR.get('instance', targetInstance)) {
        var token = targetGR.getDisplayValue('token');
        var versionGR = new GlideRecord('x_11556_col_store_member_application_version');
        if (versionGR.get(newVersion)) {
            var canContinue = true;
            var targetAppId = '';
            var mbrAppGR = versionGR.member_application.getRefRecord();
            var request = new sn_ws.RESTMessageV2();
            request.setHttpMethod('get');
            request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
            request.setRequestHeader("Accept", "application/json");
            request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application?sysparm_fields=sys_id&sysparm_query=provider.instance%3D' + mbrAppGR.getDisplayValue('provider.instance') + '%5Ename%3D' + encodeURIComponent(mbrAppGR.getDisplayValue('name')));
            var response = request.execute();
            if (response.haveError()) {
                gs.error('CollaborationStoreUtils.publishNewVersion: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
                canContinue = false;
            } else if (response.getStatusCode() == '200') {
                var jsonString = response.getBody();
                var jsonObject = {};
                try {
                    jsonObject = JSON.parse(jsonString);
                } catch (e) {
                    gs.error('CollaborationStoreUtils.publishNewVersion: Unparsable JSON string returned from Target instance ' + targetInstance + ': ' + jsonString);
                    canContinue = false;
                }
                if (canContinue) {
                    var payload = {};
                    payload.name = mbrAppGR.getDisplayValue('name');
                    payload.description = mbrAppGR.getDisplayValue('description');
                    payload.current_version = mbrAppGR.getDisplayValue('current_version');
                    payload.active = 'true';
                    request = new sn_ws.RESTMessageV2();
                    request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
                    request.setRequestHeader("Accept", "application/json");
                    if (jsonObject.result && jsonObject.result.length > 0) {
                        targetAppId = jsonObject.result[0].sys_id;
                        request.setHttpMethod('put');
                        request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application/' + targetAppId);
                    } else {
                        request.setHttpMethod('post');
                        request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application');
                        payload.provider = mbrAppGR.getDisplayValue('provider.instance');
                    }
                    request.setRequestBody(JSON.stringify(payload, null, '\t'));
                    response = request.execute();
                    if (response.haveError()) {
                        gs.error('CollaborationStoreUtils.publishNewVersion: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
                        canContinue = false;
                    } else if (response.getStatusCode() != 200 && response.getStatusCode() != 201) {
                        gs.error('CollaborationStoreUtils.publishNewVersion: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
                        canContinue = false;
                    } else {
                        jsonString = response.getBody();
                        jsonObject = {};
                        try {
                            jsonObject = JSON.parse(jsonString);
                        } catch (e) {
                            gs.error('CollaborationStoreUtils.publishNewVersion: Unparsable JSON string returned from Target instance ' + targetInstance + ': ' + jsonString);
                            canContinue = false;
                        }
                        if (canContinue) {
                            targetAppId = jsonObject.result.sys_id;
                        }
                    }
                }
            } else {
                gs.error('CollaborationStoreUtils.publishNewVersion: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
            }
            if (canContinue) {
                this.publishVersionRecord(targetInstance, token, versionGR, targetAppId, attachmentId);
            }
        } else {
            gs.error('CollaborationStoreUtils.publishNewVersion: Version record not found: ' + newVersion);
        }
    } else {
        gs.error('CollaborationStoreUtils.publishNewVersion: Target instance record not found: ' + targetInstance);
    }
},
Unlike the original application publication process, we are not bouncing back and forth between the client side and the server side, so with the successful completion of one artifact, we can simply move on to pushing over the next artifact. To create the next step in the process, the publishVersionRecord function, I started out with a copy of the original processPhase6 function. The end result looks something like the sketch below (the exact payload fields are assumptions patterned after the rest of this script): it pushes the version record over to the target instance and then, on success, hands off to the attachment step.
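publishVersionRecord: function(targetInstance, token, versionGR, targetAppId, attachmentId) {
    // Build the payload for the version record on the target instance
    // (field names here are assumptions based on the version table used above)
    var payload = {};
    payload.member_application = targetAppId;
    payload.version = versionGR.getDisplayValue('version');
    var request = new sn_ws.RESTMessageV2();
    request.setHttpMethod('post');
    request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
    request.setRequestHeader('Content-Type', 'application/json');
    request.setRequestHeader('Accept', 'application/json');
    request.setEndpoint('https://' + targetInstance + '.service-now.com/api/now/table/x_11556_col_store_member_application_version');
    request.setRequestBody(JSON.stringify(payload, null, '\t'));
    var response = request.execute();
    if (response.haveError()) {
        gs.error('CollaborationStoreUtils.publishVersionRecord: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
    } else if (response.getStatusCode() != 201) {
        gs.error('CollaborationStoreUtils.publishVersionRecord: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
    } else {
        var jsonObject = {};
        try {
            jsonObject = JSON.parse(response.getBody());
            // With the version record in place, send over the Update Set XML attachment
            this.publishVersionAttachment(targetInstance, token, jsonObject.result.sys_id, attachmentId);
        } catch (e) {
            gs.error('CollaborationStoreUtils.publishVersionRecord: Unparsable JSON string returned from Target instance ' + targetInstance + ': ' + response.getBody());
        }
    }
},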
The last step in the process, then, is to send over the Update Set XML file attachment. For that one, I started out with the processPhase7 function and ended up with this:
publishVersionAttachment: function(targetInstance, token, targetVerId, attachmentId) {
    var gsa = new GlideSysAttachment();
    var sysAttGR = new GlideRecord('sys_attachment');
    if (sysAttGR.get(attachmentId)) {
        var url = 'https://';
        url += targetInstance;
        url += '.service-now.com/api/now/attachment/file?table_name=x_11556_col_store_member_application_version&table_sys_id=';
        url += targetVerId;
        url += '&file_name=';
        url += sysAttGR.getDisplayValue('file_name');
        var request = new sn_ws.RESTMessageV2();
        request.setBasicAuth(this.WORKER_ROOT + targetInstance, token);
        request.setRequestHeader('Content-Type', sysAttGR.getDisplayValue('content_type'));
        request.setRequestHeader('Accept', 'application/json');
        request.setHttpMethod('post');
        request.setEndpoint(url);
        request.setRequestBody(gsa.getContent(sysAttGR));
        var response = request.execute();
        if (response.haveError()) {
            gs.error('CollaborationStoreUtils.publishVersionAttachment: Error returned from Target instance ' + targetInstance + ': ' + response.getErrorCode() + ' - ' + response.getErrorMessage());
        } else if (response.getStatusCode() != 201) {
            gs.error('CollaborationStoreUtils.publishVersionAttachment: Invalid HTTP Response Code returned from Target instance ' + targetInstance + ': ' + response.getStatusCode());
        }
    } else {
        gs.error('CollaborationStoreUtils.publishVersionAttachment: Invalid attachment record sys_id: ' + attachmentId);
    }
},
So that takes care of the functions. Now we have to create an Action in the Flow Designer that will call the function, and then produce a new Subflow that will leverage that action. That sounds like a good subject for our next installment.
Special Note to Testers
If you want to test the set-up process, you can do that with a single instance. You just have to select the Host option during the set-up process. If you want to test the Client set-up process, you will need at least two instances, one to serve as the Host, which you will need to reference when you set up the Client (you cannot be a Client of your own Host). But if you really want to test out the full registration process, including the Subflow that informs all existing instances of the new instance, you will need at least three participating instances: 1) the Host instance, 2) the new Client instance, and 3) an existing Client instance that will be notified of the new member of the community.
The same holds true for the application publishing process. You can test a portion of the publishing process by publishing an app on a single Host instance, but if you want to test out all of the parts and pieces, you will want to publish the app from a Client instance and make sure that it makes its way all the way back to the Host. Once we complete the application distribution process, full testing will require at least three instances to verify that the Host distributes the new version of the application to other Clients.