It was probably the worst thing he could say. “We’ve got to look at the data” is not a great line for a cricket coach to say to the press after going out of a World Cup.
It’s especially bad when that team has been previously criticised for being over-reliant on data, and sucking the spontaneity out of players.
But let’s be fair to Peter Moores: whatever he had said would not have been good enough. The question whenever a country flounders in a big event is: “what went wrong?”. The answer is usually complicated, and rarely immediately evident. And so the reply is often of the form: “we will try to work it out and learn from it.”
So how do teams work it out? Using anecdotes? Talking to fans? Plucking a theory out of thin air? No, they look at the (whisper it quietly) data.
Of course they do. If looking at data is now taboo, English cricket will suffer. What England need to do is look at the right data, put it in context, and work out what to do next.
But several media commentators have latched on to data as the culprit. We have our new bogeyman, and he is armed with a spreadsheet.
“English cricket kills itself”. That’s the headline in the Spectator. From the piece by Alex Massie:
“We’ve got to look at the data.” If ever there was an appropriate epitaph for this era of English cricket this is it. England have, under Moores, known the price of everything but the value of nothing. The data has given them heaps of information; they’ve had no idea what to do with it.
But why would they? Cricket is a complex game but not a mysterious one. It has changed much less than most people think…
England, however, think there’s some magic sauce that can unlock the mysteries of cricket. So they crunch numbers and discover that x percent of games are won by a score of y or that when z then b and if c then a+b = d. Is it any wonder then they play like humans impersonating robots?
And so on, until this conclusion: “The problem is the bloody data.”
Really? What’s the alternative?
And so the bandwagon starts to roll.
Yet… When articles are written about England’s loss to Bangladesh, they will cite numbers such as:
Between the 21st and 31st overs, only 40 runs were scored for the loss of 3 wickets.
(I just came up with that. I looked at the data.)
Or commentators will look at some other stat that will be framed as the moment the match was won or lost. That Bangladesh were allowed to plunder 78 off the last 10 overs. That England had them at 32 for 2 after 10 overs, but let a good start get away from them. You can take your pick.
England will have more sophisticated numbers at their disposal, such as which kinds of deliveries produced more dot balls, or how different field placings fared. Should they ignore them? Simply say it was a “bad day at the office”, or some other sporting cliché?
As the Guardian’s Andy Bull put it recently:
The laptop is just another tool in the box, useless unless the players understand the value of the information it provides, and no more valuable than their own ability to adapt and improvise during a match.
Interestingly, the word that Massie and Bull both use is “value”. If we consign data to a marginal or even zero role, then we will miss valuable insights.
The statistics are there. They lend themselves to being crunched. That’s not a bad thing, per se. Nor is it a good thing. But to say that the numbers are the problem is madness.