One of the things that drives me crazy is managers publicly busting out the old “Stats are meaningless, I hate statisticians and everything they believe in!” stuff. For one, most top-flight managers employ performance analysts, some of whom I’m sure go beyond video review and into something resembling statistical analysis (though for ease of identification, clubs should establish two job titles: Head of Performance Analysis and Head of Statistical Analysis). So one would hope these bombastic quotes would be followed by a nice gift of single malt to the person working on the spreadsheets in the office downstairs.
I also think the problem is that many managers assume statistics will always make either themselves or their players look bad. They refer to statistics in the “lies and damned lies” sense, but that old Mark Twain quote (which Twain attributed to British PM Benjamin Disraeli) is best viewed in light of the kind of numbers abuse which unfortunately persists to this day (see the 100% pass completion rate for the player who attempted 12 passes). “Stats can’t tell you about how the game is played.” Well, not in isolation, no. But they can offer evidence in support of an assertion about how the game is played. And they might even help a manager struggling to convince their chief exec to spend money on some heavy-hitters.
One wonders, for example, if David Moyes, whose respect for analytics was evident during his time at Everton, read Sean Ingle’s column for the Guardian this past Monday, which details how Moyes’ United have come up short this season. Amid all the damning numbers on United’s impotent attack and concession of chances, we get this short paragraph:
In some ways Moyes is a touch unlucky. Last season United didn’t dominate as much as the final Premier League table suggests. In only 50% of games did they have more shots on target and possession than their opponents, and they came from behind nine times to win. It was will, as much as skill, that earned them their 20th league title.
I think there may be more to this than Ingle lets on. Ingle’s article was the subject of an interesting Twitter conversation yesterday, which also touched on how several models this year predicted a poor finish for Man United before Moyes even led a training session. This led to a brief conversation on the same models that predicted a much lower finish for United in the preceding season, the one in which they won the title on 89 points.
How could this be?
Readers of this column will know that the great conundrum of the 2012-13 Premier League season was the title-winning season of Man United, whose shots ratio—based on regression analysis done by James Grayson using over a decade of Premier League matches—should have put them somewhere around fifth place.
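For readers unfamiliar with the metric, the shots ratio (usually called Total Shots Ratio, or TSR) is simply the share of all shots in a team’s matches that the team itself takes. The figures below are a minimal sketch with made-up numbers, not United’s actual 2012-13 totals:

```python
def tsr(shots_for: int, shots_against: int) -> float:
    """Total Shots Ratio: SF / (SF + SA).

    Values above 0.5 indicate shot dominance; title-winning sides
    typically post ratios well above that.
    """
    return shots_for / (shots_for + shots_against)

# A genuinely dominant side out-shooting opponents ~60/40 over a season:
print(round(tsr(600, 400), 3))  # 0.6

# A roughly even side -- the sort of profile that usually maps to a
# mid-table or fringe top-five finish rather than a title:
print(round(tsr(520, 480), 3))  # 0.52
```

The point of Grayson’s regression is that over a decade of Premier League seasons, a team’s TSR tracks its finishing position reasonably well, which is why United’s modest ratio made their 89-point season such a puzzle.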
I’ve already bored you all to death with theories as to why this might be the case—United were efficient in their chance creation; United spent a lot of time protecting a one-goal lead, which meant they took fewer shots and conceded more of them. But this still doesn’t get us to a statistically satisfactory answer.
So what then was the reason United won the league despite their inability to dominate teams with shots?
Luck. Random chance. Sure, better chances gave them a boost. Good defense limiting their opponents’ chances in front of goal. There was the magic Ferguson touch and whatever else Gary Neville thinks. But it still isn’t enough to explain the considerable statistical gap.
So what does this have to do with Moyes? Well, contrary to the notion that stats are never helpful to the manager, I think Moyes’ claims about United’s struggles in the early part of the season were not as off-base as some might think:
“To win the Champions League, you need five or six world-class players.
“Look at Bayern Munich, they have it. Look at Barcelona, who had it in the past and Real Madrid, who have maybe got it now. That’s the level you have to be at to win it. We’ve not got that yet but what we have got is experience.”
If you think this is completely crazy, check Grayson’s graph from this post showing the drop-off in United’s Total Shots Ratio toward the end of the 2012 season (NB: Grayson kindly sent over one which included Man United’s TSR since then and attendant signings):
You’ll of course note which events it seems to coincide with. So did Grayson:
I’m not sure how much of a swing we can pin on the purchase or sale of one player. In the NHL it looks like losing a star player can cost the team about 2% of its TSR (link). Those players typically play ~40% of the time on teams that are half the size of a football team so I wouldn’t be surprised if the effect was at least as large in football as in hockey.
With that in mind, a combination of introducing Rooney and Ronaldo into the line-up around the same time seems to coincide well with the rapid rise in ’03-04. I doubt that those two introductions explain all of the improvement but it would certainly contribute some amount. Conversely, losing Ronaldo and Tevez together at the end of the ’09 season seems to coincide quite well with the start of the decline.
The temptation after the 2012-13 season, in which United regained the title, was to immediately cast aspersions on shot dominance (TSR) as a reliable predictive metric. Except what needs to be understood here is that TSR doesn’t perfectly correlate with end-of-season points totals (otherwise football would be pretty boring, and weird). There is room for random variation, in addition to a lack of drive in the final third, or whatever. And that messy soup can give a team like United, who otherwise weren’t a dominant team, enough to take the title.
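The role of variance here can be sketched with a toy simulation. This is not Grayson’s model—the linear mapping from TSR to points and the noise scale below are invented for illustration—but it shows how a team with a fifth-place shots profile can still land a title-level points total in some seasons:

```python
import random

random.seed(42)

def expected_points(tsr: float) -> float:
    # Hypothetical linear fit: TSR 0.50 -> ~52 pts, TSR 0.65 -> ~85 pts.
    # Coefficients are made up for illustration, not taken from any study.
    return 52 + (tsr - 0.50) * 220

def simulated_points(tsr: float, noise_sd: float = 8.0) -> float:
    # Season outcome = expectation plus Gaussian noise standing in for
    # finishing luck, game states, deflections, and the rest of the soup.
    return expected_points(tsr) + random.gauss(0, noise_sd)

# A "fifth-place-ish" shots profile, expected around 63 points:
seasons = [simulated_points(0.55) for _ in range(1000)]
over = sum(p > 80 for p in seasons)
print(f"{over} of 1000 simulated seasons cleared 80 points")
```

The exact numbers are meaningless; the shape of the argument is the point. A metric can be a good predictor on average and still leave room for an occasional season that lands far above (or below) what the underlying numbers suggest.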
What does this matter? United won the title! Your numbers are stupid. Except that the very use of these numbers is to spot trouble ahead of time, for an enterprising manager or executive or chairperson to address.
As for Moyes, he would have a strong case here to take to the board to start spending money on key players in the next few transfer windows. And for fans, perhaps reason to start looking at Malcolm Glazer, rather than the former Everton manager, to direct their ire.
Analytics as diagnosis, rather than cure
This little addendum is just a reason for me to link to Colin Trainor’s work on Statsbomb this week, myth busting the idea that Stewart Downing creates poor chances for other players.
I often get my knickers in a twist over some of this stuff, perhaps because of the implication: “Well, if you don’t do that, Stewart Downing, you’ll be a better player!” But I think this kind of analysis is actually extremely useful if we regard it as a diagnostic tool, rather than a prescriptive recommendation or a predictive model. Another manager might have listened to conventional wisdom and moved Downing on—this kind of analysis, however, reveals the issue is not necessarily with Downing, but with some poor finishing, which itself may have more to do with luck than anything else. Paging Prozone nerd Sam Allardyce.
And Trainor’s work here is excellent. This to me is what value-added football analytics look like.