Here is a script that reads a couple of nodes out of an XML file, then copies the file into a directory built from the country and date values read from the file. If the directory does not exist, the script creates it first and then copies the file.
This came out of the need to automatically file around 40,000 XML files every single day based on their country and date. PowerShell was a better fit than a batch file because of the XML-reading requirement.
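For context, the script assumes each file contains roughly the following structure; the element names (product, productMain, subsidiary, updateTime) come from the property paths used in the script, while the sample values here are purely hypothetical:

```xml
<product>
  <productMain>
    <subsidiary>DE</subsidiary>
    <updateTime>2011-06-15 08:30:00</updateTime>
  </productMain>
</product>
```

The script takes the first ten characters of updateTime, which yields just the date portion in a format like this.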
cls
#
# global declarations
#
$data_rootDir = $(get-location)
#
# logic
#
write-host "Iterating XML files in directory $data_rootDir"
$files = get-childitem $data_rootDir *.xml | where-object { !($_.psiscontainer) }
foreach ($file in $files) {
    $xml = [xml](get-content $file.fullname)
    # variable declarations
    $product_country = $xml.product.productMain.subsidiary
    $update_date = $xml.product.productMain.updateTime.substring(0,10)
    $data_targetDir = "$data_rootDir\$product_country\$update_date"
    write-host "-------------------------------------"
    write-host "Processing file $($file.name)"
    write-host `n"Country origin: $product_country"
    write-host "Update date: $update_date"
    write-host "Target XML directory: $data_targetDir"
    # create the daily directory if needed, then copy the file into it
    if (!(test-path $data_targetDir)) {
        write-host "Directory not found, creating..."
        new-item -itemtype directory -path $data_targetDir | out-null
    }
    write-host "Copying file to directory"
    copy-item $file.fullname $data_targetDir
    write-host `n"-------------------------------------"
}