I have structured this post by breaking the code down according to the requirements. I hope this will allow everyone to better understand how it's done.

Reusable Function

The main part of this is generating the hash code of individual files. As this portion needs to be executed many times (once per file in the folder), the cleanest way to code it is to write a function for generating the hash code so that it can be reused again and again. An explanation of the code is available in my earlier post.

function gethash ($hashcheck){
    # Open the file as a stream
    $stream = ([IO.StreamReader]"$hashcheck").BaseStream
    # Compute the SHA256 hash and format each byte as two hex characters
    [string]$hashcode = -join ([Security.Cryptography.HashAlgorithm]::Create("SHA256").ComputeHash($stream) | ForEach { "{0:x2}" -f $_ })
    # Rewind and close the stream
    $null = $stream.Seek(0,0)
    $stream.Close()
    return $hashcode
}
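As a quick check, you can call the function with the full path of any file. The path below is only a placeholder for illustration (it reuses the example path from later in this post), so substitute a file that actually exists on your machine.

# Illustrative call only; replace the path with a real file
$testhash = gethash "C:\Users\Yarnthen\happy\fileA"
$testhash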

Consolidation of all files

Next is to gather all the files present in the folder so that we can find the individual hash code of each file. We will use the Get-ChildItem cmdlet to walk the entire folder.

The command will be like the below:

$filenames = gci -Path $source -Recurse | Where-Object {$_.Mode -notmatch "d"} | select -ExpandProperty FullName

The first part of the pipeline uses Get-ChildItem (short form gci) to search for all the items in $source recursively (including subfolders). The result is piped to the Where-Object statement, which omits all the directories (we only need hash codes of files). Finally, this is piped to the select statement, which gets the full path and name of each file. All of these are written to the $filenames string array.
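As a side note, if you are running PowerShell 3.0 or later, Get-ChildItem has a -File switch that returns only files, so the directory filter can be done without Where-Object. This is just an alternative sketch and not the code used in this post:

# Alternative on PowerShell 3.0+: -File returns only files, so no Where-Object filter is needed
$filenames = Get-ChildItem -Path $source -Recurse -File | Select-Object -ExpandProperty FullName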

Getting the full filename is necessary because the gethash function we created requires the full filename in order to generate the hash code. But the final output of this step is to write all the filenames and their hash codes to the xml file, and the full filename is no good for that because it changes depending on where you place the folder. What we need is the path and filename relative to the location where the folder resides.

For example, if you place the folder named happy in C:\Users\Yarnthen, a full filename will be something like C:\Users\Yarnthen\happy\fileA. We can't write the full filename to the xml file because it will keep changing depending on where you place the folder (e.g. C:\Users\Cohos\happy\fileA if you put the folder in C:\Users\Cohos). Thus what we need here is the relative path and filename (in the example shown, something like \fileA). This stays the same no matter where we place the folder. To do this, we will strip the source path from the full filename.

The command will be like the below:

# Name of the folder being hashed (e.g. happy)
$folder = Split-Path $source -Leaf
# Strip the source path (plus the trailing backslash) to get the relative filename
$file = $filename.Substring($Source.Path.Length + 1)
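
To make the stripping concrete, here is a rough illustration using the example paths from earlier, with literal strings standing in for $source and $filename:

# Illustration only: literal strings stand in for $source and $filename
$folder = Split-Path "C:\Users\Yarnthen\happy" -Leaf                                      # gives "happy"
$file = "C:\Users\Yarnthen\happy\fileA".Substring("C:\Users\Yarnthen\happy".Length + 1)   # gives "fileA"
# The record written later will therefore carry the relative path happy\fileA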

Create an object

We will need an array to hold all the information. To achieve this, we will declare a PSObject for each record and store it in the objects array. A PSObject is a custom object which allows you to add custom properties to the object itself. In this case, for example, I can create a PSObject to house the name of the file and then add a custom SHA256 hash code property to it.

$objects = @()
$object = New-Object PSObject -Property @{Path=$folder + "\" + $file}
$object = Add-Member -InputObject $object -MemberType NoteProperty -Name "SHA256" -Value $hash -PassThru
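
Once those two statements have run, both pieces of information sit on the same record and can be read back directly:

# Both custom properties are now available on the record
$object.Path      # e.g. happy\fileA
$object.SHA256    # the hash code produced by gethash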

Generate the data

Once we know how to create the object, we can start generating all the information required to form the XML. The code just loops through all the files and stores each one's relative filename and hash code (using the reusable function) in a PSObject, as below:

$objects = @()
foreach ($filename in $filenames){
    $file = $filename.Substring($Source.Path.Length + 1)
    $object = New-Object PSObject -Property @{Path=$folder + "\" + $file}
    $hash = gethash $filename
    $object = Add-Member -InputObject $object -MemberType NoteProperty -Name "SHA256" -Value $hash -PassThru
    $objects += $object
}
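
If you want to sanity-check the loop before moving on, you can simply list what was collected. This is only a quick inspection step, not part of the final script:

# Quick look at the collected records (optional)
$objects | Format-Table Path, SHA256 -AutoSize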

Create the xml file

Once we have all the information in the PSObjects, we can use it to create the XML file (I am naming it hash.xml). Of course, the xml file must have some header content, as other xml files have. I have taken the cue for the headers from the xml file that fciv can generate; after all, this is still doing what fciv does, albeit with a more advanced hashing algorithm.

"<?xml version=""1.0"" encoding=""utf-8""?>" >hash.xml
"<FILEHASH>" >>hash.xml
foreach ($object in $objects){
"`t<FILE_ENTRY>" >>hash.xml
"`t`t<name>" + $object.path + "</name>" >>hash.xml
"`t`t<SHA256>" + $object.SHA256 + "</SHA256>" >>hash.xml
"`t</FILE_ENTRY>" >>hash.xml
}
"</FILEHASH>" >>hash.xml

Hashing the hash.xml

The last part is to hash the hash.xml we created, so as to achieve our end state of one single hash code.

$hash = gethash "hash.xml"
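
To see the single hash code, just output the variable. If you want to compare it against a value from an earlier run, something like the second line below works; $previoushash is a hypothetical variable standing in for a hash you saved previously:

# Display the final hash code
$hash
# Illustration only: $previoushash is hypothetical, a hash saved from an earlier run
$hash -eq $previoushash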

Try it out!

Do some testing with the program. Move the folder around and generate the hash again to see if it is still the same. Make some changes to the files and check the hash again. That is all for this post. Remember, you can get the full code in the introduction post.