
OS Deployment – Helper Script

This is a quick post to share my OS Deployment script:

OSD.ps1

  • Mode
    • Config – Applies configuration set specified in sub-mode.
    • Bios – Applies BIOS configurations by model/type.
    • Report – Sends report notification.  Sub-modes are Error/Final.
  • SubMode – Sub-mode specific to the selected mode.
  • Test – Runs in test mode, primarily for testing notification templates, etc.
  • User – Specifies that the script is being run by a user, not the system. Uses OSD_User.log.
  • LogPath – Specifies the path to log files, primarily used with SubMode to generate the OSD report based on collected data.
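
For example, task sequence steps (or a manual run) might call the script like the sketch below; the sub-mode names and log path are illustrative, since the actual values come from your OSD.xml and environment.

# Apply a configuration set (sub-mode name is illustrative).
powershell.exe -ExecutionPolicy Bypass -File OSD.ps1 -Mode Config -SubMode PreImage

# Apply BIOS configuration for the detected model/type.
powershell.exe -ExecutionPolicy Bypass -File OSD.ps1 -Mode Bios

# Send the final report from the collected log data (path is illustrative).
powershell.exe -ExecutionPolicy Bypass -File OSD.ps1 -Mode Report -SubMode Final -LogPath C:\Windows\Temp\OSD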

OSD.xml

Configuration file with three main nodes:

  • Configs – Configurations applied by config mode.
  • Systems – Used by BIOS mode.
  • Report – Used by Report mode.

More to come later with screenshots and sample task sequences.

OSD_v1.0.zip

 

Patch Communications

A significant part of the monthly grind for patches, also known as Software Updates, at an enterprise company is managing the release schedule, sending out consistent communications, and capturing compliance data for the executive leadership team (ELT).

To avoid busy work, I have written automation that sends consistent communications to users, leadership, and administrators for operational insight, enabling the ConfigMgr administrator to keep a pulse on the progression of the rollout each month.

The PatchSendComms.ps1 PowerShell script I am sharing with you can be used for any communication, not just patching. It is designed to pull information from a SQL database, enabling you to target a list of contacts based on the returned query results and to include tables within the email communication.

It does, however, offer functionality tailored specifically to patching by enabling communications to be scheduled a specific number of days offset from Patch Tuesday (in combination with a daily scheduled task that runs the script in schedule mode).

What are some things that you may want to send an automated communication about?

  • Release Schedule – Your field technicians or IT team may want to receive a summary of the patch release schedule for the month. It is also helpful to send the schedule the day before the official release schedule email. This can be critical if you are relying on maintenance windows set on the collections.
  • Deployment Review – Send a summary of your scheduled deployments for the month. Ensure the deployment details are correct.
  • Compliance – Send the current compliance summary to field technicians to help drive compliance adoption. Send an official compliance scorecard at a specific interval (30 day, 60 day, etc.). Attach the list of workstations and/or servers out of compliance.
  • User Communications – Send a pilot and/or production notification to your users, providing them with key information about monthly patching.

The script solution can be downloaded from PatchSendComms_v1.0.zip.

Solution Overview

PatchSendComms.ps1 

Generates a communication either by specifying a specific message ID or by running in schedule mode. Communication templates and the schedule are loaded from \PatchConfig.xml.

-Communication (Required unless in Schedule mode) – Specifies which message template ID to use from PatchConfig.xml.

-DumpTemplate (Optional) – Generates HTML files to preview templates. For each message ID, an HTML file is generated in the \Templates sub-folder.

-DumpExpand (Optional) – Used in combination with DumpTemplate, specifies to expand PowerShell variables when generating the preview HTML templates.

-Resend (Optional) – By default, the same message template will not be sent more than once per day. In the case of a contact list, the message is always processed, but individual contact emails are not sent more than once a day. This switch overrides that behavior and always resends the message. This is useful for testing, or to reissue a corrected communication.

-ListSchedule (Optional) – Provides output of all scheduled communications.

-Schedule (Optional) – In schedule mode, communications are sent based on their number of days from Patch Tuesday, instead of sending an individual communication. If there are no matching scheduled communications, no communications are sent.

-PatchTuesday (Optional) – Overrides the Patch Tuesday date. Used for testing changes.

-To (Optional) – Overrides the recipient for the communication (does not affect contact list message templates). Used for testing changes.
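
Putting the parameters together, a few illustrative command lines (the message IDs match the canned examples later in this post; the -PatchTuesday date format and the test address are assumptions on my part):

# Send a single communication by message template ID.
.\PatchSendComms.ps1 -Communication WorkstationPilot

# Preview every template as HTML in the \Templates sub-folder, expanding PowerShell variables.
.\PatchSendComms.ps1 -DumpTemplate -DumpExpand

# Test a template change: force a resend, redirect it to a test mailbox, and pretend Patch Tuesday was a different date.
.\PatchSendComms.ps1 -Communication OverallCompliance -Resend -To "patchtest@contoso.com" -PatchTuesday "2018-12-11"

# Daily scheduled task: send whatever is due based on days from Patch Tuesday.
.\PatchSendComms.ps1 -Schedule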

PatchConfig.xml

Configuration file for the communication script.

Connection

connection – Specifies the ConfigMgr, SMTP, and database environments.

  • id (required) – Identifier for the connection (the sccm and smtp connections are required)
  • server (required) – Server name used for the connection
  • sql (optional) – When using a database connection, must specify the SQL server name
  • database (optional) – When using a database connection, must specify the SQL database name
  • sitecode – Required for sccm
  • port – Required for smtp
  • sendusing – Required for smtp

For example, to add a connection to your CMDB SQL database you would use:

<connection id="cmdb" server="sqlservername" database="cmdb"/>

Later referencing the connection to add table data, you would specify the identifier in the connection attribute:

<table id="myassets" connection="cmdb"><![CDATA[

Select Name, Owner From MyAssets

]]></table>

Notice

notifications/notice – Specifies a communication that should be sent a set number of days after Patch Tuesday. Supports going back two months from the current date/time at which the script is run.

  • name (required) – Identifier that maps to the message id.
  • days (required) – Number of days from Patch Tuesday to trigger communication.
  • hour (optional) – Specific hour which communication should be triggered (24-hour time).

For example, you may want most communications to go out at 10 a.m. and your overall compliance report at the end of the day at 5 p.m.  Set a scheduled task to trigger the script with the following parameters every day at 10 a.m. and 5 p.m.

PatchSendComms.ps1 -Schedule

For communications you want to send at 5 p.m. you would specify the hours in 24-hour format:

<notice name="OverallCompliance" days="60" hour="17"/>

Header

header – HTML header used in email communications. This is leveraged to inject HTML styles into the message. Must be encapsulated in <![CDATA[ ]]> to prevent the XML parser from barfing. Yes, “barfing” is my technical description.

Formatting in Outlook can be very different from how it is presented in a web browser. I will use -DumpTemplate to generate the HTML templates, then edit them to get the look and feel the way I like it, before incorporating those changes back into PatchConfig.xml and generating an email to confirm the formatting works.
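
For reference, a minimal sketch of what the header node might contain; the styles themselves are illustrative, so tune them against the -DumpTemplate output as described above.

<header><![CDATA[
<style>
  body   { font-family: Calibri, Arial, sans-serif; font-size: 11pt; }
  table  { border-collapse: collapse; }
  th, td { border: 1px solid #cccccc; padding: 4px 8px; }
</style>
]]></header>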

Message

message – Defines a message template for a communication. A template is referenced either directly with -Communication or from the schedule via a notice entry.

  • type (required) – Identifier for the message template.
  • notify (optional) – Applies to message templates with contact list only.
    • individual – If multiple users are specified in a contact list, sends a separate email to each contact.
    • group (default) – If multiple users are specified in a contact list, sends a single email to all contacts at once.

When using a contact list, you must specify %ContactList% for group, and %ContactEmail% for individual as a placeholder in one of the recipient fields (to/cc/bcc).

from (required) – Sender address used for the email.

to (required) – Recipients for message.

cc (optional) – Recipients copied on message.

bcc (optional) – Recipients blind copied on message.

subject (optional/parsed) – Subject of message. Supports PowerShell and Environment variable parsing.

image (optional) – Image to attach to email. Specify the file name.

Images can be referenced within the email communication by specifying the content identifier (cid) within the body.

Example adding a logo image to a message:

<image id="logo" type="image/png">logo.png</image>

Reference the added image within the body:

<img src="cid:logo" alt="www.contoso.com"/>

attachment (optional) – File generated from a SQL query and attached to the message. The node text should be wrapped in <![CDATA[ ]]> tags to ensure it safely parses (see the example after the attribute list).

  • id (required) – Identifier for attachment.
  • filename (required) – File name for attachment.
  • connection (required) – Identifier for connection used to perform the SQL data query.
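
For example, an attachment that delivers a CSV of out-of-compliance workstations might look like the sketch below; the file name, query, and output format are illustrative, and the connection id must match a connection defined earlier.

<attachment id="noncompliant" filename="NonCompliantWorkstations.csv" connection="cmdb"><![CDATA[
Select Name, LastScanDate From NonCompliantWorkstations
]]></attachment>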

table (optional) – Generates a data set with tables that can be used in body of message. The node text should be wrapped in <![CDATA[ ]]> tags to ensure it safely parses.

  • id (required) – Identifier for table. Placeholder %id_table% can be used in message body, or %id_table#% for data sets.
  • connection (required) – Identifier for connection used to perform the SQL data query.
  • type (optional) – A dataset type can be specified if a query returns multiple tables; otherwise only the first table in the query is returned.

To reference a table in the message body, specify the placeholder value. For example, for a table with the id of summary:

Table 1:

<blockquote>%summary_table1%</blockquote>

Table 2:

<blockquote>%summary_table2%</blockquote>

body (required/parsed) – HTML body used in message. Supports PowerShell and Environment variable parsing. It should be encapsulated in the <![CDATA[ ]]> tags to ensure it safely parses.  It must contain the following, at a minimum, if you want the header style to apply to the message:

<![CDATA[
<html>
<head>$Header</head>
<body>
     HTML body goes here.
</body>
</html>]]>
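
Putting the pieces together, a complete message template might look like the sketch below. The element layout is my best guess at the structure described above; the canned examples in the download show the real thing, and all names, addresses, and queries here are placeholders.

<message type="WorkstationCompliance">
  <from>patching@contoso.com</from>
  <to>fieldtechs@contoso.com</to>
  <subject>Workstation Patch Compliance - $((Get-Date).ToString('MMMM yyyy'))</subject>
  <image id="logo" type="image/png">logo.png</image>
  <table id="summary" connection="cmdb"><![CDATA[
    Select Collection, Compliant, NonCompliant From PatchCompliance
  ]]></table>
  <body><![CDATA[
  <html>
  <head>$Header</head>
  <body>
    <img src="cid:logo" alt="www.contoso.com"/>
    <p>Current workstation compliance by collection:</p>
    %summary_table1%
  </body>
  </html>]]></body>
</message>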

Example Communications

The solution comes with several canned examples for notifications.

ScheduleCheck – Summary of the schedule, sent the day before Patch Tuesday. Especially helpful if maintenance windows are used.

DeploymentReview – Review of scheduled patch deployments, sent after the cut-off time for the administrator to set up all deployments. Enables peer review/sign-off and a sanity check for the administrator.

WorkstationPilot – Communication to users in the patch pilot group.

WorkstationProduction – Communication to all users.

December2018_WorkstationProduction.png

WorkstationCompliance – Compliance status for workstations, for field techs or personnel responsible for working with users to get all devices patched.

PatchSchedule – Sample communication to be sent to the service desk or field technicians informing them of the patch schedule for the month.

December2018_PatchSchedule.png

OverallCompliance – Overall patch compliance for workstations and servers with attached list of out of compliance systems.

December2018_39day.png

Poor Man’s Workflow

There are a lot of great workflow systems out there.  System Center Orchestrator is a personal favorite, although probably because I’m a “Softie” at heart.  Recently a former co-worker of mine posted asking if anyone had any sample scripts to manage the workflow of multiple scripts.

At my current company we don’t have SCORCH set up, and of course standing up any infrastructure comes at a cost. We recently kicked off an effort to automate our new hire process and were asked to see what we could do with minimal effort and infrastructure cost.

I wrote a “Poor Man’s Workflow” solution that integrates a feed from our HR system (Workday) with a workflow script that kicks off various sub-scripts to provision the network account, SIP, mailbox, and records in our ticketing system (BMC Remedy).

This script solution has been scrubbed into template format for you! It consists of a main workflow script that is configurable as to which sub-scripts to run. It consumes data from a folder or a SharePoint list and provides email notifications.

The solution can be downloaded here: Workflow_v1.0.zip.

Solution Overview

Workflow.ps1 

-CachePassword – Optional parameter used to cache credentials in a secured data file at %APPDATA%\WFD.xml. This is consumed later by the workflow sub-scripts as needed, for example if the service account the scheduled task runs under doesn’t have the permissions needed to connect to a SQL database or web service.

-WorkflowFile [TestFile] – Run the workflow script and process a single data file. Ignores the timer switch and does not process the SharePoint or RETRY data.

-Timer [Minutes] – Number of minutes the script should run before timing out. For example, you may want to set the script to time out after 55 minutes and set a scheduled task to execute the script every hour. This ensures that if the script dies unexpectedly, it will pick up at the top of the hour. If no timer is specified, the script exits after processing all data files.
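
For example (paths and values are illustrative):

# One-time setup: cache credentials for the sub-scripts to consume.
.\Workflow.ps1 -CachePassword

# Test a single workflow data file; SharePoint and \Data\Retry are ignored.
.\Workflow.ps1 -WorkflowFile .\Data\Retry\0001.xml

# Hourly scheduled task: loop for 55 minutes, then exit and let the next run take over.
.\Workflow.ps1 -Timer 55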

The main workflow script looks for workflow events in a SharePoint list, consuming the data by converting all fields in the list to XML and storing it in \Data\Pending\EventID.xml. Any workflow data files automatically downloaded to \Data\Pending or saved to \Data\Retry are passed to each sub-script specified in the configuration file.

Each sub-script runs synchronously and saves its results to the workflow data file under Workflow/Status/Task. The main workflow script evaluates the results, sets the status on the SharePoint site (to prevent rerunning the same event data), and sends the appropriate email based on the workflow status.

NOTE: If someone can give me a use case for async, I will consider implementing this.

Workflow.xml

Configuration file for the main workflow.

SharePoint – To configure SharePoint as the event source for your data, set the following nodes under settings/sharepoint (a sketch follows the list):

  • enabled – true
  • uri – The URL to the WSDL for the lists interface on the specific site/sub-site
  • list – The friendly name of your list (e.g. WorkflowFeed)
  • view – The GUID for the view on the list that contains the appropriate data.
  • pending – The SharePoint CAML query to get items that are pending.
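
A sketch of what that might look like; the element layout is my assumption based on the node names above, and the URL, list name, view GUID, and CAML query (including the status field name) are placeholders.

<sharepoint>
  <enabled>true</enabled>
  <uri>https://sharepoint.contoso.com/sites/it/_vti_bin/Lists.asmx</uri>
  <list>WorkflowFeed</list>
  <view>{00000000-0000-0000-0000-000000000000}</view>
  <pending><![CDATA[<Query><Where><Eq><FieldRef Name="AutomationStatus"/><Value Type="Text">Pending</Value></Eq></Where></Query>]]></pending>
</sharepoint>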

Connections – If you are connecting to a SQL database or other web services, you can put your connection information under settings/connections. The Sub-Example1.ps1 script has sample code to decrypt stored passwords for the account specified here. Example:

  • con1/db – Name of the database
  • con1/server – Server or Server + Instance name
  • con1/account – Name of the SQL account to use for the connection
  • svcaccount – Name of a Windows service account to use.
  • data/query – SQL statements to run and cache in the \Data folder.

See Sub-Example1.ps1 for how this is used.

Scripts – Sub-scripts to execute from the main workflow for each workflow event data file (a sketch follows the list).

  • id – Unique identifier for the script. It is used in the component name for logging, and to mark task results for a specific script.
  • name – Friendly name of the script used in logging and reporting.
  • script – Name of the PowerShell sub-script.
  • types – Specifies which workflow event types (actiontype) the script supports.
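
A sketch of what the script entries might look like; I am assuming one element per sub-script with the attributes above, and the id, name, and types values are placeholders.

<scripts>
  <script id="acct" name="Provision Account" script="SUB-Example1.ps1" types="NewHire"/>
  <script id="mail" name="Create Mailbox" script="SUB-Example2.ps1" types="NewHire"/>
</scripts>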

Email – Email notifications can be sent for successes or failures. Contains the configuration and email templates.

  • enabled – true (to enable email notifications, otherwise ‘false’)
  • server – SMTP server name
  • port – SMTP port
  • header – Style used in email templates

Each template (email/settings/message) has type=”template_name” used by the Send-Mail function:

  • from: email address of sender
  • to: email address of recipient (semi-colon to separate multiple)
  • bcc: email address of recipient on blind copy
  • subject: subject line of the email
  • body: main body of email

NOTE: Variables in the templates are marked %VariableName%, are case sensitive, and will be replaced with values from the WORKFLOW/DATA or WORKFLOW/STATUS/TASK/RESULT fields.

SUB-Example1.ps1

-WorkflowFile [File] – Workflow event data to be processed. When using SharePoint this is downloaded to \Data\Pending\EventID.xml, or any data in \Data\Retry.

-TaskId [ID] – Task identifier. Used in the variable names for the results and as the component for central logging.

The template includes the following sections:

  • [Header Code] – Defines logging and switches, and loads the workflow data.
  • [Main] – Main code should go here. It is recommended that any workflow control is performed at this level; any called functions should only return data/results, and setting the workflow status should stay under Main().
  • [Footer Code] – Writes any workflow data back to disk.

This script gives an example of SQL data caching. It executes the query from settings/connection/data/query and caches the results to \Data\QueryName.csv, re-syncing the data only every 12 hours. This is handy for optimizing reads of SQL data when the script is running in a loop every minute and the source data doesn’t change frequently (e.g. for BMC Remedy we cached down the location data, request types, etc.).
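
A minimal sketch of that caching pattern, assuming the SqlServer module's Invoke-Sqlcmd stands in for however the real script queries SQL; the connection values, columns, and file name are placeholders for what would come from settings/connection.

# Placeholder connection details (the real values come from Workflow.xml).
$SqlServer = 'SQLSERVER01'
$Database  = 'cmdb'
$Query     = 'SELECT Name, Site FROM Locations'
$CacheFile = Join-Path $PSScriptRoot 'Data\Locations.csv'

# Refresh the cache only if it is missing or older than 12 hours.
$stale = $true
if (Test-Path $CacheFile) {
    $age   = (Get-Date) - (Get-Item $CacheFile).LastWriteTime
    $stale = $age.TotalHours -gt 12
}

if ($stale) {
    Invoke-Sqlcmd -ServerInstance $SqlServer -Database $Database -Query $Query |
        Select-Object -Property Name, Site |
        Export-Csv -Path $CacheFile -NoTypeInformation
}

# Always work from the cached copy.
$Data = Import-Csv -Path $CacheFile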

SUB-Example2.ps1

Mostly the same as Example1. This script gives an example of reading the results from the previous workflow script:

 $TaskOneValue = $WorkflowData.SelectSingleNode("workflow/status/task[@name='Task1']").result

Handy when one script is dependent on the results of the previous script (e.g. the first script creates the AD account, and the second script creates the Exchange mailbox).

0001.xml

Sample workflow data file. For testing, copy it to the Data\Retry folder (a sketch of its shape follows the field list).

  • workflow [id] – Unique ID for the event.
  • workflow/data/type – The action type for the event. Used to match against the “types” attribute on scripts.
  • action – sample data
  • actiondate – sample data
  • email – sample data (e.g. could be used in email template)
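
The shipped 0001.xml is the authoritative sample; roughly, the shape described above looks like this (all values are placeholders):

<workflow id="0001">
  <data>
    <type>NewHire</type>
    <action>Provision Account</action>
    <actiondate>2019-01-15</actiondate>
    <email>new.user@contoso.com</email>
  </data>
</workflow>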

Helper Functions

These PowerShell functions are used by the sub-scripts.

  • Update-String: Function takes a string and replaces any matching %VariableNames% with a value.  Data is taken from the child nodes of workflow/data in the workflow data file.
  • Set-Data: The main script and sample scripts load the workflow/data into a global variable $DataSet.  This function is used to change the workflow/data in the data file. Necessary if you want the values to be saved to disk, or used in the email templates. E.g. for our onboarding automation we looked up the ManagerEmail from AD and set this on the workflow to be used in the email notification template.
  • Set-WorkflowStatus: Used to set the workflow/status/task nodes passed from the sub-script to the main workflow script.  The data is consumed and used for error reporting.
  • TraceLog: Logs a message in cmtrace format to the central log.  Specify type of 2 = Warning, or 3 = Error for highlighting in cmtrace tool.

Primary Functions

These PowerShell functions are used by the Workflow script in addition to the helper functions:

  • Update-SharePointItem: When leveraging SharePoint as the event data feed, this function is used to set the status to reflect the outcome of the automation. The fields are hard-coded to “AutomationStatus” and “RequestID”. Customize away!
  • Send-Mail: Sends an SMTP email based on the specified template. The workflow data and any specified error are used to replace variable strings in the template. The only limit to the template is your imagination. 🙂
  • Process-Data: Executes the sub-scripts and the workflow for the specified workflow data file.

Folder Structure

The following structure is created automatically the first time you run Workflow.ps1.

  • Logs – Contains the Workflow.log.
  • Data – SQL cache data is saved here.
    • Pending – SharePoint event data is downloaded here.
    • Retry – Manually copy data here (or set up a job to drop data here).
    • Flags – Used to mark status if SharePoint is not accessible/writable.
    • Completed – Successful event data files are moved here.
    • Failed – Failed event data files are moved here.