Rob Minto

Sport, data, ideas

Month: March 2015

6 Nations finale: saving the tries to last

I’m still reeling from the 6 Nations final weekend. I was lucky enough to be at Twickenham, and rugby matches like that are remarkable. But the stats from the weekend are remarkable too.

For starters, the England-France match equalled the highest total number of tries in a 6N match (12) and was the second highest points total with 90. The only match that surpasses it in points (and equals it in tries) is the England-Italy match from 2001, but that was an 81-23 thrashing, mid-tournament.

Part of the reason the last day was so dramatic was how each match set the bar higher than the one before. Wales did their best with eight tries against Italy; Ireland scored 40 points against Scotland, winning by 30. And so the England-France match was set up perfectly, and it delivered right to the last moment.

And while the numbers can’t convey the excitement, they do back up the idea that it was the most dramatic finish ever.

In total, 27 tries were scored on the last day of the 2015 6N. That’s 44 per cent of the tries scored in the whole competition. Compare that to 2012, when just four tries were scored on the last day.
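As a sanity check on that percentage, the 44 per cent figure implies a tournament total of around 61 tries. That total is inferred here from the article’s own numbers, not independently sourced:

```python
# Share of tries scored on the final day of the 2015 6 Nations.
# The tournament total (61) is inferred from 27 last-day tries being
# roughly 44 per cent of the whole competition; it is an assumption,
# not a sourced figure.
def last_day_share(last_day_tries: int, total_tries: int) -> float:
    """Last-day tries as a percentage of the tournament total."""
    return 100 * last_day_tries / total_tries

print(round(last_day_share(27, 61)))  # → 44
```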

The only other years that come close are 2005, 2007 and 2014, which also saw 20 or more tries on the last day (also all above 30 per cent of the total tries).

[Chart: 6N tries and last round]

In those years, the outcome of the tournament was also in the balance on the final weekend. Wales won in 2005, securing the Grand Slam, although France had racked up seven tries against Italy to keep their title hopes alive. In 2007, France scored a last-minute try to take the title over Ireland by just 4 points. And in 2014, Ireland’s narrow 2-point victory over France (again with last-minute drama) gave them the title on points difference over England.

Recent advocates of changing the scoring system to include bonus points might be right in the long run, but with the last two competitions being decided right at the last, it might be a while before anyone signs up to a new set of rules.

Chelsea’s lack of penalties is completely normal – here’s why

Chelsea took the unusual step of publishing an official moan about their lack of penalties this season. It has been widely reported (Guardian, BBC), but without anyone really taking them to task on the data. But a little statistical digging might have shown that they have nothing to complain about.

The Chelsea article said:

It is in our 28 Premier League games this season where we have been awarded just two penalties. Both were for infringements on the league’s most-fouled player, Eden Hazard, and both were in home London derbies, against Arsenal and QPR respectively. The most recent was four-and-a-half months ago.

Historically, this figure seems abnormally low.

In the Double-winning 2009/10 campaign, when we were the country’s outstanding attacking team, we were awarded 12 league penalties.

So let’s look at the evidence. The numbers that Chelsea point to cover only their own penalties awarded. Statistically, that’s selection bias, but you don’t need to know the term to see that it is a bunch of numbers out of context.

What we really care about is a few things: how many penalties should a team expect over a season? Are better teams given more penalties? And how do the league winners compare? The only way to know this is to (with apologies to Peter Moores) look at the data.

Chelsea did indeed get 12 penalties in 2009-10. But this is an outlier – in fact, for all the penalties data I could get from the 1998-99 season onwards, it is the highest number given to one team in a single season.

Two other teams have also been awarded 12 penalties in one campaign. Can you guess which teams they are? Have a go. Other league winners? Nope. In fact, it was Liverpool, in 2013-14 when they finished second; and Crystal Palace, in 2004-05, finishing in 18th place!

That might give a clue as to whether league position and penalties are connected. Basically, they are not: they are only very weakly correlated, with a coefficient of -0.28. *
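For readers who want to check that kind of number themselves, here is a minimal sketch of the Pearson correlation calculation. The mini-league below is entirely made up for illustration; only the method matches what is described here:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical mini-league: position 1 is best, so if better-placed teams
# really won more penalties, the coefficient would be strongly NEGATIVE.
positions = [1, 2, 3, 4, 5, 6]
penalties = [5, 2, 6, 3, 4, 3]     # made-up penalty counts
r = pearson(positions, penalties)  # weakly negative for this toy data
```

The sign convention explains the footnote: because a better finish is a lower number, attacking success showing up in penalties would pull the coefficient towards -1, and a value like -0.28 is a long way from that.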

Over a season, the average number of penalties per team has varied between two and six. And in the 16 years of available data, the Premier League winners have had a lower penalty count than the average team five times. That leaves 11 times when it has been higher (see chart below). Yes, you would expect the league winners to play attacking football and get more penalties than the league average, as Chelsea suggest – but for Chelsea to fall below the average this season is hardly unprecedented.

[Chart: EPL penalties]

Put another way: only four times in the sixteen years of data has the team winning the league also been awarded the most penalties (Arsenal 2001-02, Manchester United 2002-03, 2007-08 and Chelsea 2009-10). Penalties are not some divine right of the best team. History shows that a team can be given a lot of penalties and still finish low down the league. Just ask Palace. Or Sunderland (6 penalties, 14th place last year). Or Blackpool (8 penalties, 19th place in 2010-11). Or West Ham (9 penalties, 17th place in 2009-10).

In other words: Chelsea’s current lack of penalties is nothing strange. It’s just… football.

* A negative number should be expected here, as a better league position is a lower number. For penalties and league position to be strongly correlated, a score much closer to -1 would be needed. For those wondering, penalties are also very weakly positively correlated with the points a team gets over a season, with a score of 0.32.

In defence of Moores and data


It was probably the worst thing he could say. “We’ve got to look at the data” is not a great line for a cricket coach to say to the press after going out of a World Cup.

It’s especially bad when that team has been previously criticised for being over-reliant on data, and sucking the spontaneity out of players.

But let’s be fair to Peter Moores: whatever he said would not have been good enough. The question whenever a country flounders in a big event is: “what went wrong?”. The answer is usually complicated and not immediately evident. And so the reply is often of the form: “we will try to work it out and learn from it.”

So how do teams work it out? Using anecdotes? Talking to fans? Plucking a theory out of thin air? No, they look at the (whisper it quietly) data.

Of course they do. If looking at data is now taboo, English cricket will suffer. What England need to do is look at the right data, put it in context, and work out what to do next.

But several media commentators have latched on to data as the culprit. We have our new bogeyman, and he is armed with a spreadsheet.

“English cricket kills itself”. That’s the headline in the Spectator. From the piece by Alex Massie:

“We’ve got to look at the data.” If ever there was an appropriate epitaph for this era of English cricket this is it. England have, under Moores, known the price of everything but the value of nothing. The data has given them heaps of information; they’ve had no idea what to do with it.

But why would they? Cricket is a complex game but not a mysterious one. It has changed much less than most people think….

England, however, think there’s some magic sauce that can unlock the mysteries of cricket. So they crunch numbers and discover that x percent of games are won by a score of y or that when z then b and if c then a+b = d. Is it any wonder then they play like humans impersonating robots?

And so on, until this conclusion: “The problem is the bloody data.”

Really? What’s the alternative?

And so the bandwagon starts to roll:

Yet… When articles are written about England’s loss to Bangladesh, they will cite numbers such as:

Between the 21st and 31st overs, only 40 runs were scored for the loss of 3 wickets.

(I just came up with that. I looked at the data.)

Or commentators will look at some other stat which will be seen as where the match was won or lost. That Bangladesh were allowed to plunder 78 off the last 10 overs. That England had them at 32 for 2 after 10 overs, but let a good start get away from them. You can take your pick.

England will have more sophisticated numbers at their disposal, such as what kinds of deliveries produced more dot-balls, or about field placings. Should they ignore them? Simply say it was a “bad day at the office”, or some other sporting cliche?

As the Guardian’s Andy Bull put it recently:

The laptop is just another tool in the box, useless unless the players understand the value of the information it provides, and no more valuable than their own ability to adapt and improvise during a match.

Interestingly, the word that Massie and Bull both use is “value”. If we consign data to a marginal or even zero role, then we will miss valuable insights.

The statistics are there. They lend themselves to being crunched. That’s not a bad thing, per se. Nor is it a good thing. But to say that the numbers are the problem is madness.

© 2024 Rob Minto
