Poor Man’s Workflow

There are a lot of great workflow systems out there.  System Center Orchestrator is a personal favorite, although probably because I’m a “Softie” at heart.  Recently a former co-worker of mine posted asking if anyone had any sample scripts to manage the workflow of multiple scripts.

At my current company we don’t have SCORCH set up, and of course standing up any infrastructure comes at a cost.  We recently kicked off an effort to automate our new hire process and were asked to see what we could do with minimal effort and infrastructure cost.

I wrote a “Poor Man’s Workflow” solution that integrates a feed from our HR system (Workday) with a workflow script that kicks off various sub-scripts to provision the network account, SIP address, mailbox, and records in our ticketing system (BMC Remedy).

This script solution has been scrubbed into template format for you!  It consists of a main workflow script that is configurable as to which sub-scripts to run.  It consumes data from a folder or a SharePoint list and provides email notifications.

The solution can be downloaded here: Workflow_v1.0.zip.

Solution Overview

Workflow.ps1 

-CachePassword – Optional parameter used to cache credentials in a secured data file at %APPDATA%\WFD.xml.  This is consumed later by the workflow sub-scripts as needed, for example when the service account the scheduled task runs under doesn’t have the permissions needed to connect to a SQL DB or web service.
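
If you’re wondering what that file holds: a minimal sketch of the idea, assuming the cache is simply a PSCredential exported with Export-Clixml (which encrypts the password with DPAPI, so only the same user on the same machine can read it back):

 # Sketch only - cache a credential to %APPDATA%\WFD.xml
 $cachePath = Join-Path $env:APPDATA 'WFD.xml'
 Get-Credential -Message 'Account for the workflow sub-scripts' | Export-Clixml -Path $cachePath

 # A sub-script can later re-hydrate it with:
 $cred = Import-Clixml -Path $cachePath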

-WorkflowFile [TestFile] – Run the workflow script against a single data file. Ignores the timer switch and does not process SharePoint or Retry data.

-Timer [Minutes] – Number of minutes the script should run before timing out.  For example, you may want the script to time out after 55 minutes and set a scheduled task to execute it every hour.  This ensures that if the script dies unexpectedly, it picks up again at the top of the hour. If no timer is specified, the script exits after processing all data files.
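
Typical invocations, based on the parameters above (the file path is just an example):

 .\Workflow.ps1 -CachePassword                          # one-time: cache credentials to %APPDATA%\WFD.xml
 .\Workflow.ps1 -Timer 55                               # scheduled hourly; times out after 55 minutes
 .\Workflow.ps1 -WorkflowFile .\Data\Retry\0001.xml     # process a single data file only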

The main workflow script looks for workflow events in a SharePoint list, consuming the data by converting all fields in the list to XML and storing them in \Data\Pending\EventID.xml. Any workflow data files automatically downloaded to \Data\Pending or saved to \Data\Retry are passed to each sub-script specified in the configuration file.

Each sub-script is run synchronously, and each sub-script saves its results to the workflow data file under Workflow/Status/Task. The main workflow script evaluates the results, sets the status on the SharePoint site (to prevent rerunning the same event data), and sends out the appropriate email based on the workflow status.

NOTE: If someone can give me a use case for async, I will consider implementing this.
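
To make the flow concrete, here is a heavily simplified sketch of the loop, not the actual Process-Data code from the download. It assumes the sub-scripts are listed under settings/scripts in Workflow.xml and that a failed task is marked in its result text; the real node paths and status handling may differ:

 # Simplified sketch of the synchronous sub-script loop (not the shipped Process-Data)
 [xml]$config = Get-Content .\Workflow.xml
 foreach ($dataFile in Get-ChildItem .\Data\Pending, .\Data\Retry -Filter *.xml) {
     [xml]$workflowData = Get-Content $dataFile.FullName
     foreach ($script in $config.SelectNodes('//settings/scripts/*')) {
         # only run sub-scripts whose 'types' attribute covers this event's action type
         if ($script.types -notmatch $workflowData.workflow.data.type) { continue }
         & ".\$($script.script)" -WorkflowFile $dataFile.FullName -TaskId $script.id   # synchronous call
     }
     # each sub-script wrote its outcome under workflow/status/task - evaluate it
     [xml]$workflowData = Get-Content $dataFile.FullName
     $failed = $workflowData.SelectNodes('workflow/status/task') | Where-Object { $_.result -match 'fail' }
     # ...update the SharePoint status and send the success/failure email accordingly...
 }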

Workflow.xml

Configuration file for the main workflow.

SharePoint – To configure SP as the event source for your data, set the following nodes under settings/sharepoint (a short reading sketch follows the list):

  • enabled – true
  • uri – The URL to the WSDL for the lists interface on the specific site/sub-site
  • list – The friendly name of your list (e.g. WorkflowFeed)
  • view – The GUID for the view on the list that contains the appropriate data.
  • pending – The SharePoint CAML query to get items that are pending.
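
As a rough illustration of how the main script can pick these up (the settings/sharepoint node names are the ones listed above; the surrounding element layout is an assumption):

 # Sketch: read the SharePoint settings from Workflow.xml
 [xml]$config = Get-Content .\Workflow.xml
 $sp = $config.SelectSingleNode('//settings/sharepoint')
 if ($sp.enabled -eq 'true') {
     # the lists.asmx WSDL from the uri node can be consumed with New-WebServiceProxy
     $lists = New-WebServiceProxy -Uri $sp.uri -UseDefaultCredential
     # $sp.list, $sp.view and the CAML query in $sp.pending then drive the lookup of pending items
 }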

Connections – If you are connecting to a SQL database or other web services, you can put your connection information under settings/connections.  The Sub-Example1.ps1 script has sample code to decrypt stored passwords for the account specified here.  Example:

  • con1/db – Name of the database
  • con1/server – Server or Server + Instance name
  • con1/account – Name of the SQL account to use for the connection
  • svcaccount – Name of a Windows service account to use.
  • data/query – SQL statements to run and cache in the \Data folder.

See Sub-Example1.ps1 for how this is used.
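
The rough shape of that usage, as a sketch only (the real decryption code lives in Sub-Example1.ps1; the con1 node names are the ones listed above and the credential comes from the -CachePassword cache file):

 # Sketch: build a SQL connection from settings/connections/con1 using the cached credential
 [xml]$config = Get-Content .\Workflow.xml
 $con  = $config.SelectSingleNode('//settings/connections/con1')
 $cred = Import-Clixml -Path (Join-Path $env:APPDATA 'WFD.xml')   # cached by Workflow.ps1 -CachePassword
 $plainPassword = $cred.GetNetworkCredential().Password           # decrypt for the connection string
 $connectionString = "Server=$($con.server);Database=$($con.db);User ID=$($con.account);Password=$plainPassword"
 $connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
 $connection.Open()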

Scripts – Sub-scripts to execute from the main workflow for each workflow event.

  • id – Unique identifier for the script.  It is used in the component name for logging, and to mark task results for a specific script.
  • name – Friendly name of the script used in logging and reporting.
  • script – Name of the PowerShell sub-script
  • types – Specifies which workflow event types (actiontype) the script supports.

Email – Email notifications can be used for successes or failures. This section contains the configuration and the email templates.

  • enabled – ‘true’ to enable email notifications, otherwise ‘false’
  • server – SMTP server name
  • port – SMTP port
  • header – Style used in email templates

Each template (email/settings/message) has a type=”template_name” attribute, which is used by the Send-Mail function:

  • from: email address of sender
  • to: email address of recipient (semi-colon to separate multiple)
  • bcc: email address of recipient on blind copy
  • subject: subject line of the email
  • body: main body of email

NOTE: Variables in the templates are marked as %VariableName%, are case sensitive, and will be replaced with values from the WORKFLOW/DATA or WORKFLOW/STATUS/TASK/RESULT fields.
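
Under the hood this maps fairly directly onto the built-in Send-MailMessage cmdlet. A hedged sketch, where $msg is assumed to hold the selected message node and $smtpServer/$smtpPort the server and port values above (the shipped Send-Mail also applies the header style and the %VariableName% replacement before sending):

 # Sketch: send one configured template; $msg = the chosen <message type="..."> node
 $mail = @{
     SmtpServer = $smtpServer
     Port       = $smtpPort
     From       = $msg.from
     To         = ($msg.to -split ';')
     Subject    = $msg.subject
     Body       = $msg.body
 }
 if ($msg.bcc) { $mail.Bcc = $msg.bcc }
 Send-MailMessage @mail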

SUB-Example1.ps1

-WorkflowFile [File] – Workflow event data file to be processed. When using SharePoint this is downloaded to \Data\Pending\EventID.xml; it can also be any data file placed in \Data\Retry.

-TaskId [ID] – Task identifier. Used in the variable names for the results and as the component name for central logging.

The template includes the following sections (a skeleton sketch follows the list):

  • [Header Code] – Defines logging and switches, and loads the workflow data.
  • [Main] – Main code goes here. It is recommended that any workflow control is performed at this level. Any called functions should only return data/results; setting the workflow status and returning should stay under Main().
  • [Footer Code] – Writes any workflow data back to disk.
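
Put together, a stripped-down skeleton of a sub-script might look something like this. Parameter names match the documentation above; the Set-WorkflowStatus parameters shown are illustrative rather than the actual signature, and the shipped template includes the full logging plumbing:

 param(
     [string]$WorkflowFile,   # workflow event data file to process
     [string]$TaskId          # task identifier used for results and logging
 )

 # [Header Code] - load the workflow event data
 [xml]$WorkflowData = Get-Content $WorkflowFile

 # [Main] - real work goes here; keep workflow control at this level
 try {
     # ...provisioning logic...
     Set-WorkflowStatus -Task $TaskId -Result 'Success'    # illustrative parameter names
 }
 catch {
     Set-WorkflowStatus -Task $TaskId -Result "Failed: $_"
 }

 # [Footer Code] - write the workflow data back to disk
 $WorkflowData.Save((Resolve-Path $WorkflowFile).Path)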

This script gives an example of SQL data caching.  It executes the query from settings/connections/data/query and caches the results to \Data\QueryName.csv. It is designed to only sync the data every 12 hours.  This is handy for optimizing reads of SQL data when the script is running in a loop every minute and the source data doesn’t change frequently (e.g. for BMC Remedy we cached the location data, request types, etc.).
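
The caching pattern itself is easy to sketch (12-hour window and file location as described above; $connection and $query are assumed to come from the connections settings):

 # Sketch: only hit SQL if \Data\QueryName.csv is missing or older than 12 hours
 $cacheFile = '.\Data\QueryName.csv'
 $refresh = $true
 if (Test-Path $cacheFile) {
     $ageHours = ((Get-Date) - (Get-Item $cacheFile).LastWriteTime).TotalHours
     if ($ageHours -lt 12) { $refresh = $false }
 }
 if ($refresh) {
     $cmd = $connection.CreateCommand()      # $connection built from settings/connections (see earlier sketch)
     $cmd.CommandText = $query               # SQL text from data/query
     $table = New-Object System.Data.DataTable
     $table.Load($cmd.ExecuteReader())
     $table | Export-Csv $cacheFile -NoTypeInformation
 }
 $cachedData = Import-Csv $cacheFile         # every other run just reads the local CSV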

SUB-Example2.ps1

Mostly the same as Example1. This script gives an example of reading the results from the previous workflow script:

 $TaskOneValue = $WorkflowData.SelectSingleNode("workflow/status/task[@name='Task1']").result

Handy when one script is dependent on the results of the previous script (e.g. the first script creates the AD account, and the second script creates the Exchange mailbox).

0001.xml

Sample workflow data file.  For testing, copy this to the Data\Retry folder. (A quick read-back sketch follows the field list below.)

  • workflow [id] – unique ID for the event.
  • workflow/data/type – The action type for the event. Used to match against the “types” attribute of the scripts.
  • action – sample data
  • actiondate – sample data
  • email – sample data (e.g. could be used in email template)
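
For a quick smoke test, the documented fields can be read straight off the sample file (property paths follow the field list above; adjust to whatever the actual file contains):

 # Sketch: inspect the sample event data file
 [xml]$sample = Get-Content .\Data\Retry\0001.xml
 $sample.workflow.id           # unique event ID
 $sample.workflow.data.type    # action type, matched against each script's 'types' attribute
 $sample.workflow.data.email   # sample field, e.g. usable in an email template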

Helper Functions

These PowerShell functions are used by the sub-scripts.

  • Update-String: Takes a string and replaces any matching %VariableName% tokens with a value.  Data is taken from the child nodes of workflow/data in the workflow data file (a rough sketch follows this list).
  • Set-Data: The main script and sample scripts load the workflow/data into a global variable $DataSet.  This function is used to change the workflow/data in the data file. Necessary if you want the values to be saved to disk, or used in the email templates. E.g. for our onboarding automation we looked up the ManagerEmail from AD and set this on the workflow to be used in the email notification template.
  • Set-WorkflowStatus: Used to set the workflow/status/task nodes passed from the sub-script to the main workflow script.  The data is consumed and used for error reporting.
  • TraceLog: Logs a message in cmtrace format to the central log.  Specify type of 2 = Warning, or 3 = Error for highlighting in cmtrace tool.
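
To give a flavor of Update-String, here is a guess at the mechanics rather than the shipped function body: walk the child nodes of workflow/data and swap each %NodeName% token for the node’s value.

 # Illustrative sketch only - the real Update-String's parameters may differ
 function Update-String {
     param([string]$Text, [xml]$WorkflowData)
     foreach ($node in $WorkflowData.SelectNodes('workflow/data/*')) {
         $Text = $Text.Replace("%$($node.Name)%", $node.InnerText)   # .NET Replace is case sensitive
     }
     return $Text
 }

 # e.g. a %ManagerEmail% token in a template becomes the ManagerEmail value set via Set-Data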

Primary Functions

These PowerShell functions are used by the Workflow script in addition to the helper functions:

  • Update-SharePointItem: When leveraging SharePoint as the event data feed, this function is used to set the status to reflect the outcome of the automation.  The fields are hard-coded to “AutomationStatus” and “RequestID”.  Customize away!
  • Send-Mail: Sends an SMTP email based on the specified template.  The workflow data and any specified error are used to replace variable strings in the template.  The only limit to the templates is your imagination. 🙂
  • Process-Data: Executes the sub-scripts and the workflow for the specified workflow data file.

Folder Structure

The following structure is created automatically the first time you run Workflow.ps1.

  • Logs – Contains the Workflow.log.
  • Data – SQL cache data is saved here.
    • Pending – SharePoint event data is downloaded here.
    • Retry – Manually copy data here (or set up a job to drop files here).
    • Flags – Used to mark the status if SharePoint is not accessible/writable.
    • Completed – Successful event data files are moved here.
    • Failed – Failed event data files are moved here.

 
