As a preface, I wrote a three-part blog series on TeraCopy logs and parsing them, so you may want to read those first to understand the underlying files and queries.
I had a case recently where I used SQLECmd (via KAPE) to parse a collected TeraCopy folder. The only problem was that the user must have used TeraCopy often: there were over 50 history database files, which with the --hunt option produces over 50 separate CSV files.
That many files isn't reasonable to review, so I pivoted to writing my own Python script instead. It also adds correlation between the "main.db" file and the multiple history files, producing one larger CSV file that can be filtered much more easily.
Figure 1: TeraLogger help menu
The script is simple in that it only takes two arguments: an input path to the TeraCopy folder (-i) and the output folder where you'd like the report written (-o). Because all it's really doing on the backend is some SQLite queries and correlation, it runs very fast; it will most likely finish before you even notice.
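The core idea, combining every history database into a single CSV, can be sketched in a few lines. This is a minimal illustration, not TeraLogger's actual code: the "Files" table and its Source/State/Size columns are assumptions about the history schema (check your own TeraCopy folder, as column names vary by version), and the main.db correlation step is omitted for brevity.

```python
import csv
import sqlite3
from pathlib import Path

# Assumed schema: each history database holds a "Files" table with
# Source, State, and Size columns. Adjust to match your own data.
HISTORY_QUERY = "SELECT Source, State, Size FROM Files"

def merge_history_dbs(teracopy_dir, output_csv):
    """Combine every history *.db under teracopy_dir into one CSV,
    tagging each row with the database it came from."""
    rows = []
    for db_path in sorted(Path(teracopy_dir).glob("*.db")):
        if db_path.name == "main.db":
            continue  # main.db is the job index, not a history file
        conn = sqlite3.connect(db_path)
        try:
            for source, state, size in conn.execute(HISTORY_QUERY):
                rows.append({"HistoryFile": db_path.name,
                             "Source": source,
                             "State": state,
                             "Size": size})
        finally:
            conn.close()
    with open(output_csv, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["HistoryFile", "Source", "State", "Size"])
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

One CSV with a "HistoryFile" column keeps the provenance of each row while still letting you sort and filter everything in a single view, which is the whole point of collapsing 50+ exports into one.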
Download the Python script or the release executable here:
Happy parsing!