As part of my day job I am responsible for managing backups, which means interacting with Veeam support (our backup vendor) and providing logs. While most of the log gathering is automated through the Veeam software, that isn't the case for Veeam Agent for Windows (which we only use for physical machines). And while Veeam has a very helpful KB2404 describing how to gather those logs, the work is mindless and boring, which makes it a perfect thing to script away!
Over the course of this series I'll walk through my process of scripting these steps, provide the updated script, and note any issues I encounter.
This Veeam KB has four steps, and we'll automate them in the order listed in the KB. So without further delay, the first step is: Gathering Veeam Agent Guest Logs.
I wrote the following PowerShell script to do just that:
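The original script isn't reproduced here, but based on the line-by-line walkthrough below, a reconstruction might look like the following. The destination path for the archive is an assumption (adjust it to wherever you stage files for upload), and note that Compress-Archive's highest compression setting is named Optimal:

```powershell
$ProgramDataLocation = $env:ProgramData
$EndpointPath = Join-Path -Path $ProgramDataLocation -ChildPath "Veeam\Endpoint"
$ServerName = $env:COMPUTERNAME
$Date = Get-Date -Format "yyyyMMdd"
$EndpointDestination = "C:\Temp\$ServerName-VeeamEndpointLogs-$Date.zip"

Compress-Archive -Path $EndpointPath -DestinationPath $EndpointDestination -CompressionLevel Optimal
```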
Line 1: defines the variable $ProgramDataLocation by reading the value of the "ProgramData" environment variable.
Line 2: defines the location of the Veeam Endpoint folder by joining the location from line 1 with the standard folder structure used by Veeam.
Line 3: defines the variable $ServerName by reading the value of the "ComputerName" environment variable.
Line 4: defines the variable $Date by using the Get-Date cmdlet and storing the date in yyyyMMdd format.
Line 5: defines the variable $EndpointDestination, which is the destination for the zipped archive to upload to Veeam.
Line 6: whitespace for formatting.
Line 7: runs the Compress-Archive cmdlet, specifying the input path as the variable $EndpointPath (defined in line 2) and the destination path as $EndpointDestination (defined in line 5), with the compression level set to Optimal (the cmdlet's highest setting).
But when I run this script, I currently encounter the following error: ZipArchiveHelper : The process cannot access the file 'C:\ProgramData\veeam\endpoint\Endpoint.Transport.log' because it is being used by another process.
So in the next blog post we'll determine how to address this issue and modify the script to accommodate it.