
KillAndExec.vbs – Ensuring application installer success with VBScript

Today’s scripting challenge…

We are attempting to use SCCM 2012 as a patch management solution for our centrally supported third party applications.  Great new features in SCCM 2012 allow us to write detection rules for applications to determine if superseded versions are present on the client system, and to trigger an immediate upgrade.  Cool Beans.  Problem is, a lot of application installers that ran reliably in our MDT “LiteTouch” environment (which is used to deploy new operating systems with no previously installed software) will not run silently or successfully on systems where previous application versions were already installed, and may currently be running.

This is an old problem for client system management… how can you update in-use files?  In most cases I have seen, the admin will schedule the updates to run when no one is logged in.  Unfortunately, that is an edge case for us: most of our systems are powered off when no one is logged in.  Another approach is to force a logoff for application updates.  While this would work, it seems like a “heavy” solution… why force the user to log off to update one application that may or may not be running?  Why force all applications closed on the off chance that one application will need to be terminated?

Our solution?  Kill only the processes that need to be terminated to ensure application installation success.  See the VBScript solution below (I flirted with writing this one in PowerShell, but the code signing requirements still intimidate me, and I may have the odd-duck XP client that still does not have PowerShell).  I have tested the script on Firefox, Thunderbird, VLC, Notepad++, WinSCP, Filezilla, and KeePass.  Rock On!

UPDATE: Since initial publication, I have added some logic to handle execution from “wscript”. If the script is executed from wscript.exe, console output will be suppressed. Additionally, the log file now is named “killAndExec-(exeFileName).log”. (This prevents SCCM from overwriting the log file the next time a program installer runs that also uses this script).
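
For reference, here is the sort of SCCM program command line we use to drive the script. The process list and installer name below are hypothetical; any semicolon-delimited list of image names will work, and the executable just needs to sit in the same directory as the script:

cscript.exe //NoLogo KillAndExec.vbs /kill:"firefox.exe;plugin-container.exe" /exe:"FirefoxSetup.exe" /args:"-ms"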

'KillAndExec.vbs script, J. Greg Mackinnon, 2012-09-13
' Kills processes named in the "kill" argument (semicolon-delimited)
' Runs the executable named in the "exe" argument
' Appends the executable arguments specified in the "args" argument (semicolon-delimited)
'Requires: "kill" and "exe" arguments.  The executable named in the "exe" arg must be in the same directory as this script.
'Provides:
' RC=101 - Error terminating the requested processes
' RC=100 - Invalid input parameters
' Other return codes - Pass-through of return code from WShell.Run using the provided input parameters

Option Explicit

const quote = """"

'Declare Variables:
Dim aExeArgs, aKills
Dim bBadArg, bNoArgs, bNoExeArg, bNoExec, bNoKill, bNoKillArg 
Dim cScrArgs
Dim iReturn
Dim oShell, oFS, oLog
Dim sBadArg, sCmd, sExe, sExeArg, sKill, sLog, sScrArg, sTemp

'Set initial values:
bBadArg = false
bNoArgs = false
bNoExeArg = false
bNoExec = false
bNoKill = false
bNoKillArg = false
iReturn = 0

'Instantiate Global Objects:
Set oShell = CreateObject("WScript.Shell")
Set oFS  = CreateObject("Scripting.FileSystemObject")

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Define Functions
'
Sub subHelp
	echoAndLog "KillAndExec.vbs Script"
	echoAndLog "by J. Greg Mackinnon, University of Vermont"
	echoAndLog ""
	echoAndLog "Kills named processes and runs the provided executable."
	echoAndLog "Logs output to 'KillAndExec.vbs' in the %temp% directory."
	echoAndLog ""
	echoAndLog "Required arguments and syntax:"
	echoAndLog "/kill:""[process1];[process2]..."""
	echoAndLog "     Specify the image name of one or more processes to terminate."
	echoAndLog "/exe:""[ExecutableFile.exe]"""
	echoAndLog "     Specify the name of the executable to run."
	echoAndLog ""
	echoAndLog "Optional arguments:"
	echoAndLog "/args""[arg1];[arg2];[arg3]..."""
	echoAndLog "     Specify one or more arguments to pass to the executable."
	echoAndLog "/noKill"
	echoAndLog "     Switch to suppress default process termination.  Used for testing."
	echoAndLog "/noExec"
	echoAndLog "     Switch to suppress default program execution.  USed for testing."
End Sub

function echoAndLog(sText)
'EchoAndLog Function:
' Writes string data provided by "sText" to the console and to Log file
' Requires: 
'     sText - a string containing text to write
'     oLog - a pre-existing Scripting.FileSystemObject.OpenTextFile object
	'If we are in cscript, then echo output to the command line:
	If LCase( Right( WScript.FullName, 12 ) ) = "\cscript.exe" Then
		wscript.echo sText
	end if
	'Write output to log either way:
	oLog.writeLine sText
end function

function fKillProcs(aKills)
' Requires:
'     aKills - an array of strings, with each entry being the name of a running process.   
	Dim cProcs
	Dim sProc, sQuery
	Dim oWMISvc, oProc

	Set oWMISvc = GetObject("winmgmts:{impersonationLevel=impersonate, (Debug)}\\.\root\cimv2")
	sQuery = "Select Name from Win32_Process Where " 'Root query, will be expanded.	
	'Complete the query string using process names in "aKill"
	for each sProc in aKills
		sQuery = sQuery & "Name = '" & sProc & "' OR "
	next
	'Remove the trailing " OR" from the query string
	sQuery = Left(sQuery,Len(sQuery)-3)

	'Create a collection of processes named in the constructed WQL query
	Set cProcs = oWMISvc.ExecQuery(sQuery, "WQL", 48)
	echoAndLog vbCrLf & "----------------------------------"
	echoAndLog "Checking for processes to terminate..."
	'Set this to look for errors that aren't fatal when killing processes.
	On Error Resume Next
	'Cycle through found problematic processes and kill them.
	For Each oProc in cProcs
	   echoAndLog "Found process " & oProc.Name & "."
	   oProc.Terminate()
	   Select Case Err.Number
		   Case 0
			   echoAndLog "Killed process " & oProc.Name & "."
			   Err.Clear
		   Case -2147217406
			   echoAndLog "Process " & oProc.Name & " already closed."
			   Err.Clear
		   Case Else
			   echoAndLog "Could not kill process " & oProc.Name & "! Aborting Script!"
			   echoAndLog "Error Number: " & Err.Number
			   echoAndLog "Error Description: " & Err.Description
			   echoAndLog "Finished process termination function with error."
			   echoAndLog "----------------------------------"
			   echoAndLog vbCrLf & "Kill and Exec script finished."
			   echoAndLog "**********************************" & vbCrLf
			   WScript.Quit(101)
	   End Select
	Next
	'Resume normal error handling.
	On Error Goto 0
	echoAndLog "Finished process termination function."
	echoAndLog "----------------------------------"
end function

function fGetHlpMsg(sReturn)
' Gets known help message content for the return code provided in "sReturn".
' Requires:
'     Existing WScript.Shell object named "oShell"
	Dim sCmd, sLine, sOut
	Dim oExec
	sCmd = "net.exe helpmsg " & sReturn
	echoAndLog "Help Text for Return Code:"
	set oExec = oShell.Exec(sCmd)
	Do While oExec.StdOut.AtEndOfStream <> True
		sLine = oExec.StdOut.ReadLine
		sOut = sOut & sLine
	Loop
	echoAndLog sOut 'Log the retrieved help text.
	fGetHlpMsg = sOut
end function
'
' End Define Functions
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Parse Arguments
If WScript.Arguments.Named.Count > 0 Then
	Set cScrArgs = WScript.Arguments.Named
	For Each sScrArg in cScrArgs
		Select Case LCase(sScrArg)
			Case "nokill"
				bNoKill = true
			Case "noexec"
				bNoExec = true
			Case "kill"
				aKills = Split(cScrArgs.Item(sScrArg), ";", -1, 1)
			Case "exe"
				sExe = cScrArgs.Item(sScrArg)
			Case "args"
				aExeArgs = Split(cScrArgs.Item(sScrArg), ";", -1 ,1)
			Case Else
				bBadArg = True
				sBadArg = sScrArg
		End Select
	Next
	If (IsNull(sExe) or IsEmpty(sExe)) Then
		bNoExeArg = True
	ElseIf (IsNull(aKills) or IsEmpty(aKills)) Then
		bNoKillArg = True
	End If
ElseIf WScript.Arguments.Named.Count = 0 Then 'Detect if required args are not defined.
	bNoArgs = True
End If 
' End Argument Parsing
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Initialize Logging
sTemp = oShell.ExpandEnvironmentStrings("%TEMP%")
sLog = "killAndExec-" & sExe & ".log"
Set oLog = oFS.OpenTextFile(sTemp & "\" & sLog, 2, True)
' End Initialize Logging
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
' Process Arguments
if bBadArg then
	echoAndLog vbCrLf & "Unknown switch or argument: " & sBadArg & "."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	WScript.Quit(100)
elseif bNoArgs then
	echoAndLog vbCrLf & "Required arguments were not specified."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	WScript.Quit(100)
elseif bNoExeArg then
	echoAndLog "Required argument 'exe' was not provided."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	wscript.quit(100)
elseif bNoKillArg then
	echoAndLog "Required argument 'kill' was not provided."
	echoAndLog "**********************************" & vbCrLf
	subHelp
	wscript.quit(100)
end if
' Log processes to kill:
for each sKill in aKills
	echoAndLog "Process to kill: " & sKill
next
' Log executable arguments:
echoAndLog "Executable to run: " & sExe
if not (IsNull(aExeArgs) or IsEmpty(aExeArgs)) then
	for each sExeArg in aExeArgs
		echoAndLog "Executable argument: " & sExeArg
	next
else 
	echoAndLog "Executable has no provided arguments."	
end if
' End Process Arguments
'''''''''''''''''''''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''''''''''''''''''''
'Begin Main
'
'Build full command string:
if inStr(sExe," ") then 'Spaces in the exe file
	sExe = quote & sExe & quote 'Add quotations around the executable.
end if
if not (IsNull(aExeArgs) or IsEmpty(aExeArgs)) then
	sCmd = sExe & " " 
	for each sExeArg in aExeArgs
		if inStr(sExeArg," ") then
			sExeArg = quote & sExeArg & quote 'Add quotations around the argument.
		end if
		sCmd = sCmd & sExeArg & " "
	next
else
	sCmd = sExe
end if
echoAndLog "Command to execute:"
echoAndLog sCmd

'Kill requested processes:
if bNoKill = false then
	fKillProcs aKills
else
	echoAndLog "/noKill switch has been set.  Processes will not be terminated."
end if
'Run the requested command:
echoAndLog vbCrLf & "----------------------------------"
if bNoExec = false then
	echoAndLog "Running the command..."
	on error resume next 'Disable exit on error to allow capture of oShell.Run execution problems.
	iReturn = oShell.Run(sCmd,10,True)
	if err.number <> 0 then 'Gather error data if oShell.Run failed.
	    echoAndLog "Error: " & Err.Number
		echoAndLog "Error (Hex): " & Hex(Err.Number)
		echoAndLog "Source: " &  Err.Source
		echoAndLog "Description: " &  Err.Description
		iReturn = Err.Number
		Err.Clear
		wscript.quit(iReturn)
	end if
	on error goto 0
	echoAndLog "Return code from the command: " & iReturn
	if iReturn <> 0 then 'If the command returned a non-zero code, then get help for the code:
		fGetHlpMsg iReturn
	end if 
else
	echoAndLog "/noExec switch has been set.  Executable will not run."
end if
echoAndLog "----------------------------------"

oLog.Close
wscript.quit(iReturn)
'
' End Main
'''''''''''''''''''''''''''''''''''''''''''''''''''

WiFi Profiles for Windows 8

So Windows 8 is here, to little fanfare at the University.  While I am always happy to have an updated version of Windows to work with, I see that I have yet to blog anything about it.  Perhaps that is because, unlike with the release of Windows 7, there was so little that was relatively “wrong” with the previous release.  I find myself with not much “to do” to get the enterprise ready for Windows 8.  Other reasons for the lack of hype… Windows 7 applications seem, for the most part, to “just work” on Windows 8, thus necessitating very little in the way of application compatibility planning.

Still, we have run into a few hiccups.  I spent most of the last two days updating the UVM WiFi Configuration Tool scripts and experimenting with Group Policy settings to make WPA2-protected wireless work consistently (previously discussed here, way back in ought-eight).  In the end, there was very little that I did to the WiFi policies that was Windows 8 specific.  The WiFi profile that we are using maintains backward compatibility with both Windows 7 and Windows Vista.

Here are the details:

  • The 802.1x settings in our WiFi profile were updated to use “user authentication” instead of “user or computer authentication”.  Under XP, this option was called “user reauthentication”.  “ReAuthentication” meant that the computer would attempt to log on as the computer account, but if the connection was lost, it would re-authenticate as the logged-on user.  Under XP, it was not possible to prevent computer authentication attempts.  However, under Win7/Win8, user authentication is just that… only user authentication is attempted; computer authentication is excluded.  We have verified this by looking at the RADIUS server logs.  Switching to “user authentication” will cut down on log errors on the RADIUS servers, and will result in fewer errors on client systems as well.
  • We have added a new trust anchor for our RADIUS server certificate in the WiFi profile.  This was necessitated by mergers and acquisitions in the CA business.  “Equifax” provided our original WPA2/PEAP certificate.  When we went to renew our certificate, we found that Equifax had been acquired by GeoTrust, and that new certificates would be issued from a GeoTrust intermediate CA.  However, this intermediate CA would be cross-signed using the Equifax root CA, so the Equifax trust anchor would still work.  The problem is that if a system has both the GeoTrust and Equifax certs present in the local trusted roots certificate store, it will validate the “radius.uvm.edu” certificate up to the GeoTrust anchor, and will ignore the cross-signing with Equifax.  This results in WiFi connection errors.  When I add the GeoTrust cert as an additional trust anchor, the problem goes away.
  • The VBScript I use to install the WiFi profile is packaged inside a 7-Zip self extractor.  The use of this self-extractor triggers the Windows “Program Compatibility Assistant”, which in turn raises a “This program might not have installed correctly” error after the tool runs.  This problem is corrected by embedding a “manifest” file into the tool.  Typically, this is done using the “mt.exe” tool included in the Windows SDK.  Unfortunately, MT.exe corrupts self-extracting 7-Zip archives (this also is a known problem with WinRAR, and perhaps other similar tools).  Fortunately I was able to work around the problem using “Resource Tuner” from Heaventools.  I needed to add “trustInfo” and “compatibility” sections to the manifest.  My blog engine is really bad about posting XML content in a page, so I will forego posting our exact manifest here; a minimal sketch is included just before the script below, and you can find more sample manifests pretty easily through Google.
  • When we run the packaged configuration tool, we get a warning that the application package is unsigned and may not be trustworthy.  I used “signtool.exe” from the Windows SDK to add a signature to the executable, so now it is considered somewhat more trustworthy.  Good instructions on the use of signtool.exe can be found here:
    http://www.tech-pro.net/code-signing-for-developers.html
    I am using a code signing cert that we obtained from the InCommon.org certificate service, hosted by Comodo.  It works.
  • Finally, I updated the profile installer VBScript to make reconfiguration a bit easier (subroutines were converted to functions so that variables set at the start of the script can be passed down to the function.  We then can set things like the trust anchor name, WiFi network name, and log file name at the start of the script where they are more easily edited.  Also, I removed support for Windows XP… no more Service Pack detection, Hotfix installation, or third-party profile installation utilities are needed by the script.  I was able to hack the script down to about a quarter of its original size as a result.  The new script is included below, for those who like that sort of thing…
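
Before the script, here is the minimal sketch of the manifest mentioned above.  Treat it as a starting point rather than our exact production manifest (the execution level you need may differ; the supportedOS GUIDs are the published compatibility IDs for Windows Vista, 7, and 8):

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <!-- Assumed level: the tool writes to the machine cert store, so it runs elevated -->
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
  <compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">
    <application>
      <!-- Windows Vista -->
      <supportedOS Id="{e2011457-1546-43c5-a5fe-008deee3d3f0}"/>
      <!-- Windows 7 -->
      <supportedOS Id="{35138b9a-5d96-4fbd-8e2d-a2440225f93a}"/>
      <!-- Windows 8 -->
      <supportedOS Id="{4a2f28e3-53b9-4441-ba9c-d69d4a4a6e38}"/>
    </application>
  </compatibility>
</assembly>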

 


Option Explicit
'On Error Resume Next
'Install UVM WPA2-Enterprise wireless profile
' Version 1.3 by J. Greg Mackinnon, University of Vermont
' Supported platforms:  Windows Vista, 7, and 8
' Requires external tools:  "CertMgr.exe" (from the Windows Platform SDK)
' Requires external files:  Root CA certificate file, 
'                           WiFi XML configuration files for Vista+ Windows OS.
'                            (obtained by running "netsh wlan export profile UVM .\")
' NOTE: modify variables in the "Define variables" section to suit your environment.

'History:
' Version 1.0 - Supported UVM WiFi using WPA2, Equifax certs, Windows XP SP2+ and Vista OS
' Version 1.1 - Updated to support Windows 7
' Version 1.2 - Updated to support Windows 8.  Removed support for XP 
'             - Removed third-party "ZWlanCfg" utility and OS Hotfix installation functions (were only needed for XP support)
' Version 1.3 - Converted existing subroutines to functions to allow for easier switching of CAs and WiFi networks.
'             - Moved Global Variables to the top of the script for easier modification.
'             - Updated CA cert and WPA Profile supporting files to use "GeoTrust" instead of "Equifax".

' Create constants
Const cLogFile = "install_UVM_WiFi.log"

' Declare variables
Dim oShell, oUserEnv, oFSO, oFile, oRegExp
Dim iSPVer
Dim sTempEnv, strComputer, sOSTest, sOS, sCertName, sCertFile, sNetName, sProfileFile
Dim bReRun

' Define variables
bReRun = False
strComputer = "."
sOSTest = "Vista|Windows 7|Windows 8" 'Regular Expression for OS compatibility testing
sCertName = "GeoTrust Global CA"      'Friendly name of the trust anchor certificate
sCertFile = "GeoTrustGlobalCA.cer"    'Name of the trust anchor file
sNetName = "UVM"                      'Name of the WiFi Access Point
sProfileFile = ".\Wi-Fi-UVM.xml"      'Name of the Vista+ wlan profile file.

' Instantiate global objects
Set oShell = WScript.CreateObject("WScript.Shell")
Set oFSO = CreateObject("Scripting.FileSystemObject")
sTempEnv = oShell.ExpandEnvironmentStrings("%TEMP%") & "\"
Set oFile = oFSO.CreateTextFile(sTempEnv & cLogFile,True)
Set oRegExp = New RegExp
oRegExp.IgnoreCase = True
oRegExp.Global = True
oRegExp.Pattern = sOSTest

'''''''''''''''''''''''''''''''''
' Define Functions
'
Function fDetectOS(sOS, iSPVer)
'Detect OS Function - detects OS Caption string and Service Pack integer from WMI WIN32_OperatingSystem.
'Expects two variables passed; returns the full OS Caption string and SP Major Version integer
	'Declare variables
	Dim colItems
	Dim objWMIService, objItem
	'Instantiate local objects/collections
	Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\CIMV2") 
	Set colItems = objWMIService.ExecQuery("Select * from Win32_OperatingSystem")

	For Each objItem In colItems
	  sOS = objItem.Caption
	  oFile.WriteLine "Detected Operating System: " & sOS
	  iSPVer = CInt(objItem.ServicePackMajorVersion)
	  oFile.WriteLine "Detected Service Pack Version: " & iSPVer
	  oFile.WriteLine "Service Pack Minor Version: " & objItem.ServicePackMinorVersion
	Next
	
	'Clean local objects/variables
	Set objItem = Nothing
	Set colItems = Nothing
	Set objWMIService = Nothing
End Function

Function fInstCert(sCertName,sCertFile)
' Installs cert with sCertName root CA cert into machine "root" store.
' Requires:  certmgr.exe from the Windows Platform SDK (available with VS .NET or VS 2008 installations), 
'	sCertName variable - contains the friendly name of the root CA
'	sCertFile variable - contains the name of the root CA certificate file
' Requires:  Root CA cert file
' Notes:  We use the "root" argument to certmgr.exe to install into the "Trusted Root Certificate Authorities".  
'		We also could use "ca" to install Intermediate Certificate Authorities.
'		In a previous version of this script we used "oShell.Run", but this returned unexpected results on the
'		Windows 7 platform... using .Exec now.
	
	Dim bCertPresent, bInstSuccess
	Dim oExec
	Dim sOut

	bCertPresent = false
	bInstSuccess = false
	
	set oExec = oShell.Exec("certmgr.exe -c -s -r localMachine root")

	Do Until oExec.StdOut.AtEndOfStream
		sOut = oExec.StdOut.ReadLine()
		if InStr(sOut, sCertName) Then
			'oFile.WriteLine sOut
			'WScript.Echo sOut
			bCertPresent = true
		End If
	Loop

	if bCertPresent = false then
		oFile.WriteLine "Root Certificate for """ & sCertName & """ needs to be installed.  Attempting install..."
		set oExec = oShell.Exec("certmgr.exe -add -c " & sCertFile & " -s -r localMachine root")
		Do Until oExec.StdOut.AtEndOfStream
			sOut = oExec.StdOut.ReadLine()
			if InStr(sOut, "Succeeded") Then
				'oFile.WriteLine sOut
				bInstSuccess = true
			End If
		Loop
		if bInstSuccess = true then
			oFile.WriteLine "Certificate installed successfully"
		else 
			oFile.WriteLine "Certificate failed to install... You will need to install the " _
				& "certificate manually.  See the instructions at https://www.uvm.edu/ets/wireless " _
				& ", then run this script again to compelte installation of the UVM wireless profile."
			WScript.Quit -2
		end if
	else
		oFile.WriteLine "Root Certificate for """ & sCertName & """ is already installed."
	End If
End Function

Function fImportProfile(sProfileFile,sNetName)
'Imports Vista+ Wireless Profile using NETSH command.  
'Requires: a Vista+ wifi profile file exported using NETSH, 
'	sProfileFile - string containing name of the wlan XML profile file to be imported
'	sNetName - string containing the wlan profile name (WiFi Network Name)

	'On Error Resume Next
	Const cUserScope = "all"
	
	Dim iStrMatch
	Dim oExec, oStdOut
	Dim sStdOutLine
	
	oFile.WriteLine "Executing command: netsh wlan add profile filename=""" & sProfileFile & """ user=" & cUserScope & ""
	Set oExec = oShell.Exec("netsh wlan add profile filename=""" & sProfileFile & """ user=" & cUserScope & "")
	Set oStdOut = oExec.stdOut
	While Not oStdOut.AtEndOfStream
		sStdOutLine = oStdOut.ReadLine
		oFile.WriteLine(sStdOutLine)
		iStrMatch = CInt(InStr(sStdOutLine, "Profile " & sNetName & " is added on interface"))
		If iStrMatch > 0 Then
			WScript.Echo "The " & sNetName & " wireless profile was added successfully to your system"
		ElseIf iStrMatch = 0 Then
			WScript.Echo "The wireless profile failed to import.  Please see the manual profile " _
			& "configuration instructions available at http://www.uvm.edu/ets/wireless.  A " _
			& "log file named " & cLogFile & " which contains the full error message can be " _
			& "found in the " & sTempEnv & " directory."
			WScript.Quit -3
		End If
	Wend
	
	Set oStdOut = Nothing
	Set oExec = Nothing
End Function
'
' End Functions
'''''''''''''''''''''''''''''''''

'''''''''''''''''''''''''''''''''
' Begin Main
'

fDetectOS sOS, iSPVer

If oRegExp.Test(sOS) = True Then
	fInstCert sCertName, sCertFile
	fImportProfile sProfileFile, sNetName
Else
	oFile.WriteLine "Your operating system is not supported for use with this script."
	WScript.Quit -4
End If

oFile.close

' Environment cleanup 
Set oFile = Nothing
Set oFSO = Nothing
Set oUserEnv = Nothing
Set oShell = Nothing
Set oRegExp = Nothing

'
' End Main
''''''''''''''''''''''''''''''''''

Thunderbird 13 – The cloud arrives

Mozilla Thunderbird 13 arrived this week.  Guess what?  Our customized build process broke again.  Now, when you start TB for the first time, you get greeted with the option to create a new email account with one of Thunderbird’s “partners” (in other words, email providers who paid for the honor of being put in the “welcome to Thunderbird” start dialog).

With the assistance of the awesome Ben Coddington (who does not keep a blog, but should, so that you can bask in his awesomeness), I was able to track down the place where the new account dialog is called, and kill it by switching a preference in the “all-thunderbird.js” file.

The preference is a Boolean named “mail.provider.enabled”, set in the all-thunderbird.js file, as documented here:
http://hg.mozilla.org/releases/comm-beta/rev/879e8d044e36
and referenced here:
https://bugzilla.mozilla.org/show_bug.cgi?id=718792#c3
and here:
https://wiki.mozilla.org/index.php?title=Thunderbird/Support/TB13UserChanges

I updated our Thunderbird build script to set this preference to “false”:

Echo modifying default "All Thunderbird" preferences...
..\..\..\bin\sed.exe --binary "s/pref("mail.provider.enabled", true);/pref("mail.provider.enabled", false);/" <.\defaults\pref\all-thunderbird.js >.\defaults\pref\all-thunderbird_new.js
if errorlevel 1 goto err
MOVE /Y .\defaults\pref\all-thunderbird_new.js .\defaults\pref\all-thunderbird.js

The whole ugly build script is provided below:

REM Thunderbird customized build script for UVM.
REM Updated June 2012 for Thunderbird 13 support.
REM REQUIRES:
REM - 7z.exe, 7zr.exe and sed.exe in parallel "..\bin" directory
REM - Unmodified Thunderbird installer in .\source directory
REM - all required config files in .\config directory
REM (including 7z control file, ISP Hook RDF file, and modified prefs.js)
REM - local JDK install with "jar.exe". Path to jar.exe will need to be updated in the jdk environment variable
REM OUTPUT: Fully modified Thunderbird installer in .\Installer directory.
REM @echo on

set jdk="c:\Program Files (x86)\Java\jdk1.6.0_27\bin"

Echo Cleaning up old builds...
del .\Installer\*.exe
rmdir /s /q .\build
set /P tbver=Enter Thunderbird version number to build (i.e. "6.0.2"):

Echo Extracting setup files from OEM Installer...
mkdir .\build\temp
..\bin\7zr x .\source\*.exe -o.\build

Echo Extracting omni.ja contents...
mkdir .\build\temp
cd .\build\temp
%jdk%\jar.exe xf ..\core\omni.ja
if errorlevel 1 goto err

Echo modifying messenger functions...
..\..\..\bin\sed.exe --binary "s/NewMailAccount(msgWindow, okCallback);/MsgAccountWizard(okCallback);/" <.\chrome\messenger\content\messenger\msgMail3PaneWindow.js >.\chrome\messenger\content\messenger\msgMail3PaneWindow_new.js
if errorlevel 1 goto err
MOVE /Y .\chrome\messenger\content\messenger\msgMail3PaneWindow_new.js .\chrome\messenger\content\messenger\msgMail3PaneWindow.js

Echo modifying default "All Thunderbird" preferences...
..\..\..\bin\sed.exe --binary "s/pref("mail.provider.enabled", true);/pref("mail.provider.enabled", false);/" <.\defaults\pref\all-thunderbird.js >.\defaults\pref\all-thunderbird_new.js
if errorlevel 1 goto err
MOVE /Y .\defaults\pref\all-thunderbird_new.js .\defaults\pref\all-thunderbird.js

Echo modifying default mailnews preferences...
..\..\..\bin\sed.exe --binary "s/try_ssl", 0)/try_ssl", 2)/" <.\defaults\pref\mailnews.js >.\defaults\pref\mailnews_new.js
if errorlevel 1 goto err
MOVE /Y .\defaults\pref\mailnews_new.js .\defaults\pref\mailnews.js

Echo moving UVM modified prefs.js into place (note that this file is not actually used by Thunderbird!)
copy /Y ..\..\config\prefs.js .\defaults\profile\prefs.js

Echo Repacking omni.ja...
del /f /q ..\core\omni.ja
%jdk%\jar.exe cf ..\core\omni.ja *

Echo Copying UVM Custom ISP file to source...
cd ..\..
mkdir .\build\core\isp\en-US
copy /Y .\config\UVMMail.rdf .\build\core\isp\en-US\UVMMail.rdf
if errorlevel 1 goto err
Echo Copying UVM default prefs.js to core directory (tbird no longer has a prefs.js by default, but it will be used if present)...
mkdir .\build\core\defaults\profile
copy /Y .\config\prefs.js .\build\core\defaults\profile\prefs.js
if errorlevel 1 goto err

Echo Deleting temporary files that should not be present in the installer...
rmdir /s /q .\build\temp

Echo Repackaging Thunderbird installer...
..\bin\7zr a .\Installer\UVM_Thunderbird_setup_%tbver%.7z .\build\*
copy /b ..\bin\7zS.sfx + .\config\config.txt + .\Installer\UVM_Thunderbird_setup_%tbver%.7z .\Installer\UVM_Thunderbird_setup_%tbver%.exe

Echo Cleaning up installation source...
del /s /f /q .\build\*.*
rmdir /s /q .\build\core
rmdir /s /q .\build
del /f /q .\Installer\UVM_Thunderbird_setup_%tbver%.7z
goto end

:err
echo There was an error running a command.

:end

Driver installation with SCCM Software Distribution

Here we are, working with SCCM again.  Making difficult things possible, and simple things difficult.  Today we wish to distribute a SmartCard driver to all of our managed servers, so that we can require Smart Card for certain classes of logins.  The newer “CNG” Smart Card minidrivers are all simple “.inf” driver packages that you can right-click install.  This ought to be easy, thought the sys admin.  Wrong!

Installation of “.inf” drivers is not a well-documented command-line procedure (unlike the rather more complicated “.msi” package, which at least is easy to script).
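
For an INF with a “DefaultInstall” section, the general form that worked for us is shown below (the path here is hypothetical; the real script later in this post adds x86/amd64 platform selection):

rundll32.exe advpack.dll,LaunchINFSectionEx "C:\drivers\aseMD.inf",DefaultInstall,,4,N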

My thanks go out to the bloggers and forum users whose posts helped me work through this one.

The script that I cobbled together to install the Athena “ASECard” minidriver is displayed below.  Note that this should work for pretty much any minidriver, as long as it has a “DefaultInstall” section in the inf file.  I just unpack the amd64 and x86 driver cab files into their respective directories, put the batch script one directory above these, and make an SCCM software package of the whole thing.  The installation command line is simply the batch file name.

@echo off
REM Installs the drivers specified in the "DefaultInstall" section
REM of the aseMD.inf that is appropriate for the current (x86 or amd64) platform.
REM Install is silent (4 flag), with no reboot (N flag).
REM The INF is specified to be in the x86 or amd64 subdirectory
REM of the script directory (%~dp0).

echo Detecting platform...
IF EXIST "%programfiles(x86)%" (GOTO :amd64) ELSE (GOTO :i386)

:i386
echo Installing 32-bit driver...
cd x86
%windir%\system32\rundll32.exe advpack.dll,LaunchINFSectionEx "%~dp0x86\aseMD.inf",DefaultInstall,,4,N
goto :EOF

:amd64
REM The command will run in 64-bit mode (%windir%\sysnative),
REM when called from a 32-bit CMD.exe (as will be the case with SCCM).
echo Installing 64-bit driver...
cd amd64
%windir%\sysnative\rundll32.exe advpack.dll,LaunchINFSectionEx "%~dp0amd64\aseMD.inf",DefaultInstall,,4,N
goto :EOF
REM End of file

Windows Backup Performance Testing with PowerShell

While developing our new Windows file services infrastructure, we wanted to test our pre-production platform to see if there are any file server-side bottlenecks that will cause unacceptable delays in backup processing. Here at UVM we still are using EMC Networker for Enterprise backup (no comments on our satisfaction with EMC will be provided at this time). EMC provides a tool “uasm.exe” that is used at the core of the “save.exe” and “recover.exe” commands on the backup client. If we use “uasm.exe” to back up all of the file server data to “null”, it is possible that we will be able to detect disk, HBA, and other local I/O bottlenecks before they bite us in production.

Since Networker will break up our file server into multiple “save sets”, and run a user-definable number of save set backup processes in parallel, it also is important for us to determine the number of parallel backup processes required to complete backup in a timely fashion. Thus, we want to run several parallel “uasm.exe” processes in our tests.

PowerShell, with the assistance of “cmd.exe” and some finesse, can get this job done. Hurdles I ran into while scripting this test follow:

  1. During development, PowerShell consumed huge amounts of CPU while redirecting uasm.exe output to the PowerShell $null object. Interestingly, previous tests using uasm.exe with cmd.exe did not show this problem. To fix this, each uasm job is spawned from a one-line cmd.exe “bat” script, which is included below.
  2. Remember that PowerShell uses the null object “$null”, but that cmd.exe uses the handle “nul” (with one “L”). If you redirect to “null”, you will soon fill up your disk with a file named “null”.
  3. When I wanted to examine running jobs, it was difficult to determine which directory a job was working on. This was because I initially created a scriptblock object and passed parameters to it when starting a job. For example:
    [scriptblock] $sb = {
    	param ([string]$sPath)
    	[string[]] $argList = '/c','c:\local\scripts\uasm_cmd.bat',$sPath
    	& cmd.exe $argList
    }
    $jobs += start-job -Name $myJob -ScriptBlock $sb -ArgumentList $dir1
    
    

    However, when inspecting the job object’s “command” property, we see “$sPath” in the output. We want the variable expanded. How to do this? Create the scriptblock object in-line when starting the job:

    [string] $cmd = '& cmd.exe "/c","c:localscriptsuasm_cmd.bat",' + $dir
    $jobs += Start-Job -Name $jobName `
    	-ScriptBlock ([scriptblock]::create($cmd))
    

    This makes for more compact code, too.

  4. To check on jobs that have completed, I create an array named “$djs” (Done Jobs), populated by piping the $jobs array and filtering for “completed” jobs. I inspect $djs to see if jobs are present. In my first pass, I used the check:
    if ($djs.count -gt 0)

    Meaning, continue if there is anything in the array $djs. However, this check did not work well: piping the $jobs object through the filter put a null item in $djs on creation, meaning that even if no jobs had completed, $djs would still have a count of one! I fixed this by changing the test:

    if ($djs[0] -ne $null)

    Meaning, if the first entry in $djs is not a null object, then proceed.

The full script follows:

#uasm_jobQueue.ps1, 2011-09-30, author: J. Greg Mackinnon
#Tests performance of disk when accessed by Networker backup commands.
#   Creates a queue of directories to test ($q), then uses external command 
#   "uasm.exe" to backup these directories to null.
#Change the "$wp" variable to set the number of uasm 'worker processes' to be 
#   used during the test.
#Note: PowerShell $null object causes very high CPU utilization when used for
#   this purpose.  Instead, we call "uasm_cmd.bat" which uses the CMD.exe 'nul'
#   re-director.  'nul' does not have the same problems as $null.

set-psdebug -strict

[int] $wp = 4

# Initialize the log file:
[string] $logfile = "s:uasm_test.log"
remove-item $logfile -Force
[datetime] $startTime = Get-Date
[string] "Start Time: " + $startTime | Out-File $logfile -Append

##Create work queue array:
# Add shared directories:
[String[]] $q = gci S:\shared | ? {$_.Attributes.tostring() -match "Directory"}`
	| sort-object -Property Name | % {$_.FullName}
# Add remaining targets to queue:
$q += 'H:','I:','J:','K:','L:','M:','S:\sis','S:\software','s:\r25'
	
[int] $dc = 0			#Count of completed (done) jobs.
[int] $qc = $q.Count	#Initial count of jobs in the queue
[int] $qi = 0			#Queue Index - current location in queue
[int] $jc = 0			#Job count - number of running jobs
$jobs = @()				#Jobs array - intended to contain running PS jobs.
	
while ($dc -lt $qc) { # Loop until completed jobs reach the total jobs in queue
	# Start new jobs while fewer than $wp worker processes are running, 
	#  and the queue index has not yet reached the end of the queue.
	while (($jobs.count -lt $wp) -and ($qc -gt $qi)) { 
		[string] $jobName = 'qJob_' + $qi + '_';
		[string] $dir = '"' + $q[$qi] + '"'
		[string] $cmd = '& cmd.exe "/c","c:localscriptsuasm_cmd.bat",' + $dir
		#Start the job defined in $cmd string.  Use this rather than a pre-
		#  defined scriptblock object because this allows us to see the expanded
		#  job command string when debugging. (i.e. $jobs[0].command)
		$jobs += Start-Job -Name $jobName `
			-ScriptBlock ([scriptblock]::create($cmd))
		$qi++ #Increment the queue index.
	}
	$djs = @(); #Completed jobs array
	$djs += $jobs | ? {$_.State -eq "Completed"} ;
	# $djs array will always have a count of at least 1.  However, if the 
	#    first entry is not empty (null), then there must be completed jobs to
	#    be retrieved.
	if ($djs[0] -ne $null) { 
		$dc += $djs.count;
		$djs | Receive-Job | Out-File $logfile -Append; #Log completed jobs
		$djs | Remove-Job -Force;
		Remove-Variable djs;
		$jobs = @($jobs | ? {$_.State -eq "Running"}); #rebuild jobs array.
	}
	Start-Sleep -Seconds 3
}


# Complete logging:
[datetime] $endTime = Get-Date
[string] "End Time: " + $endTime | Out-File $logfile -Append 
$elapsedTime = $endTime - $startTime
[string] $outstr =  "Elapsed Time: " + [math]::floor($elapsedTime.TotalHours)`
	+ " hours, " + $elapsedTime.minutes + " minutes, " + $elapsedTime.seconds`
	+ " seconds."
$outstr | out-file -Append $logfile

The “uasm_cmd.bat” file called in the above code block contains the following one line:

"c:program fileslegatonsrbinuasm.exe" -s %1 > nul

Migrating from NetApp to Windows File Servers with PowerShell – part 2

Previously we saw how PowerShell and RoboCopy can be used to sync multi-terabyte file shares from NetApp to Windows. What I did not tell you was that this script choked and died horribly on a single share in our infrastructure. You may have seen it commented out in the previous script? “#,’R25′”?

CollegeNet Resource25… my old enemy. These clowns worked around a bug in their product (an inability to read an open text column in an Oracle DB table) by copying every text row in the database to its own file on a file server, and to make matters worse they copy all of the files to the same directory. Why is this bad? Ever try to get a directory listing on a directory with 480,000 1k files? It’s bad news. Worse, it kills robocopy. Fortunately, we have a workaround.

The archive utility “7-zip” is able to wrap up the nasty directory into a single small file, which we then can unpack on the new file server. Not familiar with 7-Zip? For shame! Get it now, it’s free:
http://www.7-zip.org/

7-zip ignores most file attributes, which seems to speed up the copy process a bit. Using robocopy, our sync operation would either run for hours on this single directory, or just hang up forever. With 7-zip, we get the job done in 30 minutes. Still slow, but better than never.

Troublesome files are found in the R25 “text_comments” directory, a subdirectory of “text”. We have prod, pre-prod, and test environments, and so need to do a few separate 7-zip archives. Note that a little compression goes a long way here. When using “tar” archives, my archive was several GB in size. With the lowest level of compression, we squeeze down to only about 14 MB. How is this possible? Well, a lot of our text comment files were empty, but uncompressed they still take up one block of storage. Over 480,000 blocks, this really adds up.

Code snippet follows.

#Sync R25 problem dirs

Set-PSDebug -Strict

# Initialize the log file:
[string] $logfile = "s:r25Sync.log"
remove-item $logfile -Force
[datetime] $startTime = Get-Date
[string] "Start Time: " + $startTime | Out-File $logfile -Append

function zipit {
	param ([string]$source)
	[string] $cmd = "c:localbin7za.exe"
	[string] $arg1 = "a" #add (to archive) mode
	[string] $arg2 = join-path -Path $Env:TEMP -ChildPath $($($source | `
		Split-Path -Leaf) + ".7z") # filespec for archive
	[string] $arg3 = $source #spec for source directory
	[string] $arg4 = "-mx=1" #compression level... minimal for performance
	#[string] $arg4 = "-mtm=on" #timestamp preservation - commented out for perf.
	#[string] $arg5 = "-mtc=on"
	#[string] $arg6 = "-mta=on"
	#invoke command, route output to null for performance.
	& $cmd $arg1,$arg2,$arg3,$arg4 > $null 
}

function unzipit {
	param ([string]$dest)
	[string] $cmd = "c:localbin7za.exe"
	[string] $arg1 = "x" #extract archive mode
	[string] $arg2 = join-path -Path $Env:TEMP -ChildPath $($($dest | `
		Split-Path -Leaf) + ".7z")
	[string] $arg3 = "-aoa" #overwrite existing files
	#destination directory specification:
	[string] $arg4 = '-o"' + $(split-path -Parent $dest) + '"' 
	#invoke command, route to null for performance:
	& $cmd $arg1,$arg2,$arg3,$arg4 > $null 
	Remove-Item $arg2 -Force # delete archive
}

[String[]] $zips = "V3.3","V3.3.1","PRODWinXpText"
[string] $sourceD = "\\files\r25"
[string] $destD = "s:\r25"

foreach ($zip in $zips) {
	Get-Date | Out-File $logfile -Append 
	[string] "Compressing directory: " + $zip | Out-File $logfile -Append 
	zipIt $(join-path -Path $sourceD -ChildPath $zip)
	Get-Date | Out-File $logfile -Append 
	[string] "Uncompressing to:" + $destD | Out-File $logfile -Append
	unzipit $(Join-Path -Path $destD -ChildPath $zip)
}

Get-Date | Out-File $logfile -Append 
[string] "Syncing remaining files using Robocopy..." | Out-File $logfile -Append
$xd1 = "\\files\r25\V3.3" 
$xd2 = "\\files\r25\V3.3.1" 
$xd3 = "\\files\r25\PRODWinXPtext"
$xd4 = "\\files\r25\~snapshot"
$roboArgs = @("/e","/copy:datso","/purge","/nfl","/ndl","/np","/r:0","/mt:4",`
	"/b",$sourceD,$destD,"/xd",$xd1,$xd2,$xd3,$xd4)

& robocopy.exe $roboArgs

Get-Date | Out-File $logfile -Append 
[string] "Done with Robocopy..." | Out-File $logfile -Append

# Complete logging:
[datetime] $endTime = Get-Date
[string] "End Time: " + $endTime | Out-File $logfile -Append 
$elapsedTime = $endTime - $startTime
[string] $outstr =  "Elapsed Time: " + [math]::floor($elapsedTime.TotalHours)`
	+ " hours, " + $elapsedTime.minutes + " minutes, " + $elapsedTime.seconds`
	+ " seconds."
$outstr | out-file -Append $logfile

Migrating from NetApp to Windows File Servers with PowerShell – part 1

We are retiring our NetApp filer this year. It was nice knowing you, NetApp. Thank you for the no-hassle performance, agile volume management, and excellent customer support. We will not miss your insane pricing, and subtle incompatibilities with modern Windows clients.

In this multi-part series, I will be sharing PowerShell code developed to assist with our migration. In part one, we will look at bulk copy operations with RoboCopy. In part 2, we will look at a situation where RoboCopy fails to get the job done. In future parts, we will look at automated share and quota management and migration.

Migrating large amounts of data off a NetApp is not particularly straightforward. The only real option we have is to copy data off of the filer CIFS shares to their Windows counterparts. Fortunately, with the multi-threading power utility “robocopy” we can move data between shares pretty quickly. Unfortunately, robocopy only multi-threads file copy operations, not directory search operations. So, while initial data transfers with robocopy take place really quickly, subsequent sync operations are slower than expected. MS also released a utility called “RichCopy” which supports multi-threaded directory searching, but this utility is not supported by MS, and has some significant bugs (i.e. it crashes all the time). What to do?

PowerShell to the rescue! Using PowerShell jobs, we can spawn off a separate robocopy job for each subdirectory of a source share, and run an arbitrary number of parallel directory copies. With some experimentation, I determined that I could run ten simultaneous robocopy operations without overwhelming CPU or disk channels on the filer. Under this arrangement, our file sync window has been reduced from almost 48 hours to a mere 2.5 hours.
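
The core of it is just a throttled job queue. Here is a stripped-down sketch of the pattern (the share and target paths are hypothetical; the full production script at the end of this post adds logging, a pre-built queue array, and error handling):

# Spawn one robocopy job per source subdirectory, at most 10 at a time:
$dirs = Get-ChildItem '\\files\shared' | ? {$_.PSIsContainer}
$jobs = @()
foreach ($d in $dirs) {
	# Throttle: wait while 10 jobs are already running.
	while (@($jobs | ? {$_.State -eq 'Running'}).Count -ge 10) { Start-Sleep -Seconds 3 }
	[string] $cmd = '& robocopy.exe "' + $d.FullName + '" "S:\shared\' + $d.Name + '" /e /b /mt:4'
	$jobs += Start-Job -Name $d.Name -ScriptBlock ([scriptblock]::create($cmd))
}
$jobs | Wait-Job | Receive-Job   # collect robocopy output once all jobs finish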

Some tricky bits in the development of this script were:

  • PowerShell jobs and job queuing are critical to completing this script in a timely fashion. Syntax for “start-job” is tricky. See my post on backup performance testing for more comments on working with jobs.
  • Robocopy fails to copy a number of source files. This is mitigated through the use of the “/b” switch (backup mode).
  • The PowerShell cmdlet “Receive-Job” fails to capture output from a variety of job commands unless you assign the job to an object. To reliably capture the output of commands within our jobs, I needed to assign the jobs to our $jobs array.
  • I needed to do some post processing on the log file. In doing so, I needed to find UNC paths for our source filer “\\files”. It is important to remember that, when using regular expressions, “\” is the escape character. So, to match for “\”, we need to enter “\\”. To match for “\\” we need to enter “\\\\”, as in:
     get-content $logfile | select-string -Pattern "\\\\files" | ...
  • Initially I allowed the script to process only one top level directory at a time (i.e. start with \\files\software, and only proceed to \\files\shared when “software” completes). The problem with this was, I was preventing the script from running an optimal job count. Furthermore, a single hung job could bring the whole script to a halt. To combat this, I start the script by building a master queue array “$q”, which holds all of the directories for which I am going to start a job. The result of using a master queue is a considerable improvement in sustained throughput.
  • When building an array with a loop (i.e. while…) you may have trouble with the first item added to the array if you do not initialize the array before starting to loop. In my case, I needed to initialize “[array]$jobs = @()” before using the array to hold job objects in the “while” loop. Failing to do so caused “$jobs” to become a single job object when the number of jobs was equal to one. Bad news, if you are expecting to use array properties such as $jobs.count, or to call in index of the object (i.e. $jobs[0]).
  • ISE programs like the native PowerShell ISE, or Quest PowerGUI make script development much easier. However, production environments are not the same as the debug environment, so keep these tips in mind:
    1. Log your script actions! Use lots of out-file calls. If you are feeling slick, you can enclose these in “if ($debug)” clauses, and set the $debug variable as a script parameter (which I did not do here).
    2. When running in production, watch the log file in real-time using “get-content -wait”. I know it is not as cool as the Gnu command “tail”, but it is close.
  • Scoping… careful of the “global” scope. Initially I modified the $jobs and $dc variables in the global scope from within the “collectJobs” function. This worked fine in my ISE and at the PowerShell prompt. However, when running as a scheduled task, these calls failed miserably. I changed the calls to use the “script” scope, and the script now runs as a scheduled task successfully.

Below is the script I developed for this job… it contains paths specific to our infrastructure, but easily could be modified. Change the “while ($jobs.count -lt 10)” loop to set the number of simultaneous robocopy processes to be used by the script…

# FilerSync_jobQueue.ps1
# JGM, 2011-09-29
# Copies all content of the paths specified in the $srcShares array to 
# corresponding paths on the local server.
# Keeps data on all copy jobs in an array "$q".
# We will use up to 10 simultaneous robocopy operations.

set-psdebug -strict

# Initialize the log file:
[string] $logfile = "s:files_to_local.log"
remove-item $logfile -Force
[datetime] $startTime = Get-Date
[string] "Start Time: " + $startTime | Out-File $logfile -Append

# Initialize the Source file server root directories:
[String[]] $srcShares1 = "adfs$","JMP$","tsFlexConfig","software","mca","sis","shared"`
	#,"R25"
	#R25 removed from this sync process as the "text_comments" directory kills
	#robocopy.  We will sync this structure separately.
[String[]] $srcShares2 = "uvol_t1_1$\q-home","uvol_t1_2$\q-home","uvol_t1_3$\q-home",`
	"uvol_t1_4$\q-home","uvol_t1_5$\q-home","uvol_t2_1$\q-home",`
	"vol1$\qtree-home"
	
[String[]] $q = @() #queue array

function collectJobs { 
#Detects jobs with status of Completed or Stopped.
#Collects jobs output to log file, increments the "done jobs" count, 
#Then rebuilds the $jobs array to contain only running jobs.
#Modifies variables in the script scope.
	$djs = @(); #Completed jobs array
	$djs += $script:jobs | ? {$_.State -match "Completed|Stopped"} ;
	[string]$('$djs.count = ' + $djs.count + ' ; Possible number of jobs completed in this collection cycle.') | Out-File $logfile -Append;
	if ($djs[0] -ne $null) { #First item in done jobs array should not be null.
		$script:dc += $djs.count; #increment job count
		[string]$('$script:dc = ' + $script:dc + ' ; Total number of completed jobs.') | Out-File $logfile -Append;
		$djs | Receive-Job | Out-File $logfile -Append; #log job output to file
		$djs | Remove-Job -Force;
		Remove-Variable djs;
		$script:jobs = @($script:jobs | ? {$_.State -eq "Running"}) ; #rebuild jobs arr
		[string]$('$script:jobs.count = ' + $script:jobs.Count + ' ; Exiting function...') | Out-File $logfile -Append
	} else {
		[string]$('$djs[0] is null.  No jobs completed in this cycle.') | Out-File $logfile -Append
	}
}
	
# Loop though the source directories:
foreach ($rootPath in $srcShares1) {
    [string] $srcPath = "\\files\" + $rootPath # Full Source Directory path.  
	#Switch maps the source directory to a destination volume stored in $target 
    switch ($rootPath) {
        shared {[string] $target = "S:\shared"}
        software {[string] $target = "S:\software"}
        mca {[string] $target = "S:\mca"}
        sis {[string] $target = "S:\sis"}
        adfs$ {[string] $target = "S:\adfs"}
        tsFlexConfig {[string] $target = "s:\tsFlexConfig"}
        JMP$ {[string] $target = "s:\JMP"}
        R25 {[string] $target = "S:\R25"}
    }
    #Enumerate directories to copy:
	$dirs1 = @()
	$dirs1 += gci $srcPath | sort-object -Property Name `
		| ? {$_.Attributes.tostring() -match "Directory"} `
		| ? {$_.Name -notmatch "~snapshot"}
	#Copy files in the root directory:
	[string] $sd = '"' + $srcPath + '"';
	[string] $dd = '"' + $target + '"';
	[Array[]] $q += ,@($sd,$dd,'"/COPY:DATSO"','"/LEV:1"' )
	# Add to queue:
	if ($dirs1[0] -ne $null) {
		foreach ($d in $dirs1) {
			[string] $sd = '"' + $d.FullName + '"';
	    	[string] $dd = '"' + $target + "\" + $d.Name + '"';
			$q += ,@($sd,$dd,'"/COPY:DATSO"','"/e"')
		}
	}
}
foreach ($rootPath in $srcShares2) {   
    [string] $srcPath = "\\files\" + $rootPath # Full Source Directory path.
	#Switch maps the source directory to a destination volume stored in $target 
    switch ($rootPath) {
        uvol_t1_1$\q-home {[string] $target = "H:\homes1"}
        uvol_t1_2$\q-home {[string] $target = "I:\homes1"}
        uvol_t1_3$\q-home {[string] $target = "J:\homes1"}
        uvol_t1_4$\q-home {[string] $target = "K:\homes1"}
        uvol_t1_5$\q-home {[string] $target = "L:\homes1"}
        uvol_t2_1$\q-home {[string] $target = "M:\homes1"}
        vol1$\qtree-home {[string] $target = "J:\homes2"}
    }
    #Enumerate directories to copy:
	[array]$dirs1 = gci -Force $srcPath | sort-object -Property Name `
		| ? {$_.Attributes.tostring() -match "Directory"}
	if ($dirs1[0] -ne $null) {
		foreach ($d in $dirs1) {
			[string] $sd = '"' + $d.FullName + '"'
			[string] $dd = '"' + $target + "\" + $d.Name + '"'
			$q += ,@($sd,$dd,'"/COPY:DAT"','"/e"')
		}
	}
}

[string] $queueFile = "s:\files_to_local_queue.csv"
Remove-Item -Force $queueFile
foreach ($i in $q) {[string]$($i[0]+", "+$i[1]+", "+$i[2]+", "+$i[3]) >> $queueFile }

New-Variable -Name dc -Option AllScope -Value 0
[int] $dc = 0			#Count of completed (done) jobs.
[int] $qc = $q.Count	#Initial count of jobs in the queue
[int] $qi = 0			#Queue Index - current location in queue
[int] $jc = 0			#Job count - number of running jobs
$jobs = @()

while ($qc -gt $qi) { # Problem here as some "done jobs" are not getting captured.
	while ($jobs.count -lt 10) {
		[string] $('In ($jobs.count -lt 10) loop...') | out-file -Append $logFile
		[string] $('$jobs.count is now: ' + $jobs.count) | out-file -Append $logFile
		[string] $jobName = 'qJob_' + $qi + '_';
		[string] $sd = $q[$qi][0]; [string]$dd = $q[$qi][1];
		[string] $cpo = $q[$qi][2]; [string] $lev = $q[$qi][3]; 
		[string]$cmd = "& robocopy.exe $lev,$cpo,`"/dcopy:t`",`"/purge`",`"/nfl`",`"/ndl`",`"/np`",`"/r:0`",`"/mt:4`",`"/b`",$sd,$dd";
		[string] $('Starting job with source: ' + $sd +' and destination: ' + $dd) | out-file -Append $logFile
		$jobs += Start-Job -Name $jobName -ScriptBlock ([scriptblock]::create($cmd))
		[string] $('Job started.  Incrementing $qi to: ' + [string]$($qi + 1)) | out-file -Append $logFile
		$qi++
	}
	[string] $("About to run collectJobs function...") | out-file -Append $logFile
	collectJobs
	[string] $('Function done.  $jobs.count is now: ' + $jobs.count)| out-file -Append $logFile
	[string] $('$jobs.count = '+$jobs.Count+' ; Sleeping for three seconds...') | out-file -Append $logFile
	Start-Sleep -Seconds 3
}
#Wait up to two hours for remaining jobs to complete:
[string] $('Started last job in queue. Waiting up to two hours for completion...') | out-file -Append $logFile
$jobs | Wait-Job -Timeout 7200 | Stop-Job
collectJobs

# Complete logging:
[datetime] $endTime = Get-Date
[string] "End Time: " + $endTime | Out-File $logfile -Append 
$elapsedTime = $endTime - $startTime
[string] $out =  "Elapsed Time: " + [math]::floor($elapsedTime.TotalHours)`
	+ " hours, " + $elapsedTime.minutes + " minutes, " + $elapsedTime.seconds`
	+ " seconds."
$out | out-file -Append $logfile

#Create an error log from the session log.  Convert error codes to descriptions:
[string] $errFile = 's:\files_to_local.err'
remove-item $errFile -force
[string] $out = "Failed jobs:"; $out | out-file -Append $logfile
$jobs | out-file -Append $errFile
$jobs | % {$jobs.command} | out-file -Append $errFile
[string] $out = "Failed files/directories:"; $out | out-file -Append $errFile
Get-Content $logfile | Select-String -Pattern "\\\\files"`
	| select-string -NotMatch -pattern "^   Source" `
	| % {
		$a = $_.toString(); 
		if ($a -match "ERROR 32 ")  {[string]$e = 'fileInUse:        '};
		if ($a -match "ERROR 267 ") {[string]$e = 'directoryInvalid: '};
		if ($a -match "ERROR 112 ") {[string]$e = 'notEnoughSpace:   '};
		if ($a -match "ERROR 5 ")   {[string]$e = 'accessDenied:     '};
		if ($a -match "ERROR 3 ")   {[string]$e = 'cannotFindPath:   '};
		$i = $a.IndexOf("\\f");
		$f = $a.substring($i);
		Write-Output "$e$f" | Out-File $errFile -Force -Append
	}