We training professionals struggle, with good reason, to measure the impact of our training (and thus to justify our continued employment). Unfortunately, we often (and by "we" I mean, for a change, "other people in my profession" — no wait, no dashes: I make many, many mistakes, but this is not one of them) reach not so much for low-hanging fruit as for the composted remains of fruit that fell off the tree a long time ago.
I speak, of course (of course?), of post-training written tests. When trainers lack the opportunity, resources, or understanding needed to actually measure the impact of the training but need to point to *something*...well, they often just throw a written test at it. The lovely thing is, we can easily evaluate knowledge with a post-training test. The problem is...knowledge doesn't really tell us anything helpful when what we're trying to do is improve job performance.
But hey...the final score on the Erotic Reading Comprehension test is measurable. So there's that, I guess.