As we enjoy these calm few days before the frenzy of free-agent mania hits, I thought I'd provide an update on the hottest and coldest shooters of the season just passed. Our guide here is the concept of Shot Quality, pioneered by Alan Ryder; as implemented here, it expresses the probability that a given shot will result in a goal, based on distance from the net, shot type (slap, wrist, etc.) and situation (power play, even strength, shorthanded). Basically, 200 shots from a defenseman firing from the point aren't going to produce as many goals as 200 shots from a winger who fires more often from the slot.
It's important to remember that the accuracy of the information recorded during NHL games is often less than we'd like, and while I'm not (in this post) going to the extreme of introducing rink-by-rink effects to try to "correct" the data, Alan's latest piece on these problems is well worth a read. I'll use it here mainly to qualify some of the results at the end.
Our criterion here is shooters with a minimum of 100 shots, measuring Actual Goals against Expected Goals and excluding empty-netters. Expected Goals is calculated by summing the Shot Quality factor for each individual shot over the course of a season. For instance, if a 20-foot wrister on the power play has a 19.13% chance of scoring, that shot is worth 0.1913 Expected Goals. The difference between Actual Goals and Expected Goals tells us how "hot" or "cold" a shooter was, without correcting for the opposing goaltenders.
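The bookkeeping above is simple enough to sketch in a few lines of Python. The per-shot probabilities below are illustrative stand-ins, not Ryder's actual model values (only the 0.1913 power-play wrister comes from the example in the text):

```python
# Each shot contributes its scoring probability to Expected Goals.
shots = [
    0.1913,  # 20-foot power-play wrister (the example above)
    0.0450,  # hypothetical even-strength point slap shot
    0.0950,  # hypothetical even-strength shot from the slot
]

expected_goals = sum(shots)                           # season xG is just the sum
actual_goals = 2                                      # goals scored on those shots
goals_above_expected = actual_goals - expected_goals  # positive = "hot", negative = "cold"

print(round(expected_goals, 4))        # 0.3313
print(round(goals_above_expected, 4))  # 1.6687
```

Over a real season the list would hold a hundred-plus entries per shooter, but the arithmetic is the same.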
And the coldest of the cold?
So now, some of the qualifications. The data quality problems mentioned above appear to indicate that games at Madison Square Garden, for example, have shots recorded with much higher shot quality than in other rinks (what the MSG scorer records as a 25-footer may really have been 30 feet, etc.). That inflates the Expected Goals of Jagr and Cullen and makes them appear much worse in these standings than they likely were; the effect is probably on the order of 15-20% of their total Expected Goals, which would move them out of this top 10 list, though it would still leave them underperforming.
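To see how much that rink effect matters, here is a minimal sketch of the deflation, using made-up season totals (these are not Jagr's or Cullen's actual numbers; only the 15-20% range comes from the text above):

```python
# Hypothetical season totals for an MSG-based shooter.
raw_expected_goals = 40.0   # xG computed from the raw (inflated) shot logs
actual_goals = 30
rink_inflation = 0.175      # midpoint of the 15-20% range cited above

# Deflate Expected Goals before judging hot/cold.
corrected_xg = raw_expected_goals * (1 - rink_inflation)

print(round(corrected_xg, 2))                      # 33.0
print(round(actual_goals - raw_expected_goals, 2)) # -10.0 (looks very cold)
print(round(actual_goals - corrected_xg, 2))       # -3.0 (still under, but far less so)
```

The correction doesn't flip the sign here, which matches the point above: the MSG shooters would leave the bottom-10 list but would still be underperformers.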
On the other extreme, Tampa Bay's scorekeeping led to low Expected Goals values, so Vinny Prospal looks even worse once that factor is taken into account, probably adding a couple more goals he should have scored. The same effect makes Vincent Lecavalier's results look a little less "hot", and it would also apply to Buffalo's Chris Drury and Jason Pominville, whose Expected Goals figures would be revised upwards by about 10% based on Ryder's analysis. By and large, however, the players on this list had either "lucky" or "unlucky" seasons, and it is entirely possible that their fortunes will change dramatically next year. Few shooters consistently over- or under-perform against expectations, which should give hope to some fans and perhaps instill caution in others.