Thursday, October 6, 2011

Appreciating how the Old Ones played. Or: Joe Namath in the 2000s, continuing...


by Jim Glass

If you haven't read Part I of this, you probably should, to get the context and some details about the numbers being used here.

[] Passing game changes, 1970s to 2000s. These league averages and standard deviations for passing statistics show how passing norms have changed since the 1970s.

("SD%" = the standard deviation divided by the average for the stat. For instance, for yards-per-completion the SD% of 12.8% is the SD of 1.47 divided by the average of 11.53.)
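(For anyone who wants that arithmetic in code, here is a minimal sketch of the average / SD / SD% calculation. The sample values below are made up purely for illustration; the table that follows uses the actual league figures described in Part I.)

    # Minimal sketch of the average / SD / SD% calculation.
    # The sample yards-per-completion values are made up for illustration only.
    import statistics

    ypc_values = [10.1, 11.0, 11.5, 12.2, 13.4, 11.8, 10.9, 12.3]  # hypothetical

    avg = statistics.mean(ypc_values)
    sd = statistics.pstdev(ypc_values)  # population standard deviation
    sd_pct = sd / avg                   # "SD%" as defined above

    print(f"average {avg:.2f}, SD {sd:.2f}, SD% {sd_pct:.3f}")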

The bulk of the difference between eras results from rule changes. In 1978 the NFL adopted major rule changes to favor the passing game (see "The Top Ten Things that Changed the Game") and has steadily followed up on them since. The result has been shorter, safer, higher-percentage passing, and more of it.

                 1970s     2000s     Change
Comp%            0.518     0.606     17.0%
  SD             0.048     0.041     -13.6%
  SD%            9.3%      6.8%

Yards/C          11.53     10.70     -7.2%
  SD             1.47      0.93      -36.8%
  SD%            12.8%     8.7%

TDs/attempt      0.043     0.041     -4.7%
  SD             0.015     0.012     -20.9%
  SD%            35.7%     29.6%

INTs/attempt     0.054     0.030     -44.4%
  SD             0.016     0.009     -40.1%
  SD%            29.0%     31.2%

Sacks/attempt    0.089     0.067     -24.7%
  SD             0.030     0.023     -22.2%
  SD%            33.7%     34.9%

Yards/attempt    5.97      6.48      8.5%
  SD             0.92      0.80      -13.4%
  SD%            15.4%     12.3%

ANY/attempt      3.51      5.17      47.3%
  SD             1.50      1.30      -13.2%
  SD%            42.7%     25.2%



The result of more, safer, less-aggressive passing is seen in the numbers for the average game.

[] Average game passing:

                 1970s    2000s    Change
Attempts         25.8     32.9     27.5%
Completions      13.4     20.0     49.3%
Yards            153      214      39.9%
TDs              1.1      1.4      27.3%
Interceptions    1.4      1.1      -21.4%



With the norm for pre-1978 passing being 22% fewer attempts than today ... 33% fewer completions ... a meager 51.8% completion percentage ... and a TD-to-INT ratio of 15 to 19, it is easy to see how many fans today take even the middling passers of the modern game to be better than the top performers of yesteryear.

But there have been more changes than just shifting averages.

[] What wins, and what won.

Regression analysis relating team winning percentage to performance measured in standard deviations - on both offense and defense, for rushing yards per attempt, pass completion percentage, yards per completion, interception percentage, and sack percentage - gives these notable coefficients for passing offense...

            1970s     2000s
Comp%       0.0343    0.0394
Yards/C     0.0699    0.0425




In the 1970s, yards per completion contributed twice as much to winning as completion percentage did. In the 2000s they contribute nearly equally.
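For readers who like to see the machinery, here is a minimal sketch of how a regression of this general shape can be set up in Python, with team winning percentage regressed on stats standardized to standard-deviation units. The file name and column names are placeholders, not the actual dataset behind the coefficients above.

    # Sketch only: regress team winning percentage on standardized (per-SD)
    # offensive and defensive stats. The CSV file and column names below are
    # hypothetical stand-ins for the team-season data actually used.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("team_seasons.csv")  # hypothetical file of team-season stats

    predictors = [
        "rush_ypa_off", "comp_pct_off", "yards_per_comp_off", "int_pct_off", "sack_pct_off",
        "rush_ypa_def", "comp_pct_def", "yards_per_comp_def", "int_pct_def", "sack_pct_def",
    ]

    # Standardize each predictor so the fitted coefficients read as
    # "change in winning percentage per one standard deviation of the stat".
    X = (df[predictors] - df[predictors].mean()) / df[predictors].std()
    X = sm.add_constant(X)
    y = df["win_pct"]

    fit = sm.OLS(y, X).fit()
    print(fit.params[["comp_pct_off", "yards_per_comp_off"]])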

Now we are getting to why those who actually watched Unitas and Starr play voted Unitas All-Pro five times to Starr's once, while many who know only today's game think Starr's passing numbers look a lot better.

[] Yards-per-attempt: less than it seems. Yards-per-completion: more!

Yards-per-attempt (in its many variations) has become the standard for measuring QB performance for those who don't use passer rating or a proprietary stat (such as WPA or DVOA). And only Y/A and passer rating are easily applied back through history.

But there is a problem with Y/A: it is a compound stat equaling completion percentage multiplied by yards-per-completion. It is accurate only to the extent its two components contribute equally to winning, and their relative values don't change over time.

Take two QBs with an identical 8.0 yards per attempt: Sam Slinger gets his by throwing for 16 yards per completion while completing 50% of his passes, while Percentage Pete completes two-thirds of his passes at 12 yards per completion. With all other passing numbers at league average, in the 2000s the 8.0 Y/A corresponds to about a 63% winning percentage for both QBs.

But in the 1970s, Slinger's numbers correspond to a 70% winning level of play versus 63% for Pete's. That 70% is 20 points over 50%, versus only 13 for the 63%, so Sam's play contributes 54% more to winning than Pete's (20/13 = 1.54) due to his throwing longer completions - and that 54% is a lot.
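Those winning levels can be roughly reconstructed from the numbers already given: convert each QB's completion percentage and Y/C into standard deviations from the era average, apply the two per-SD coefficients from the table above, and hold everything else at league average so the remaining terms drop out. A back-of-envelope sketch, not the full regression:

    # Back-of-envelope reconstruction of the Slinger/Pete comparison using the
    # era averages, SDs, and per-SD regression coefficients quoted above.
    # Everything other than completion percentage and Y/C is assumed to be
    # league average, so those terms drop out and the baseline is .500.

    ERAS = {
        #        (comp% avg, SD)  (Y/C avg, SD)   (comp% coef, Y/C coef) per SD
        "1970s": ((0.518, 0.048), (11.53, 1.47), (0.0343, 0.0699)),
        "2000s": ((0.606, 0.041), (10.70, 0.93), (0.0394, 0.0425)),
    }

    QBS = {
        "Sam Slinger":     (0.50, 16.0),   # 8.0 Y/A the long way
        "Percentage Pete": (2 / 3, 12.0),  # 8.0 Y/A the short way
    }

    for era, ((c_avg, c_sd), (y_avg, y_sd), (c_coef, y_coef)) in ERAS.items():
        for name, (comp_pct, ypc) in QBS.items():
            z_comp = (comp_pct - c_avg) / c_sd
            z_ypc = (ypc - y_avg) / y_sd
            win = 0.500 + c_coef * z_comp + y_coef * z_ypc
            print(f"{era} {name}: ~{win:.0%} winning level")

    # 1970s: Slinger comes out near .70 and Pete near .63; in the 2000s both
    # land within a point or two of the .63 quoted above.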

In the 1970s and earlier a strong-armed QB could win more games by reducing his completion percentage in the course of throwing longer - an idea pretty much unheard of in the 2000s.

The low completion percentages of Unitas and Namath in their best years, held against them today, were not a "bug" - they were a feature. A high yards-per-completion was worth much more than a high completion percentage, and this is something that Y/A applied backward misses completely.

Unitas had MVP seasons in 1958 and 1959 with completion percentages of only 51.7% and 52.6% - well below his career average of 54.6%, and thus his "poor" years to modern eyes - but with top Y/C numbers of 14.8 and 15.0.

In 1964 Unitas and Starr had pretty much "mirror image" rate stats. Unitas' 51.8% completion percentage was well below average both for him and the league, only 9th of 14. Starr was first by completion percentage at 59.9% (and also first by today's passer rating). But Unitas was first by Y/C at 17.9, while Starr was 11th at 13.2.

The voters picked Unitas as NFL MVP. Applying the regression developed for 1971-75 (a bit of a fudge, but a lot better than using the passer rating of the 2000s), the results say the voters were right.

Unitas' mere 51.8% completion rate, good 2% interception rate, and top 17.9 Y/C produce play at a fantastic .910 winning level!

Starr's top 59.9% completion rate, superior 1.5% interception rate, and lowish 13.2 Y/C produce play at a .760 winning level - very good, but not even close to a match.

In the 21st century we don't get QBs repeatedly winning MVP awards while posting career-bottom completion percentages. That's the difference that yards-per-completion used to make.

For a bit more perspective, the career high Y/C numbers of today's top QBs don't reach even the career average Y/C numbers of the top QBs of the past.

Career average Y/C              Career high Y/C
Graham           16.1           P. Manning      13.6
van Brocklin     15.2           Rivers          13.4
Namath           14.7           Brady           12.8
Unitas           14.2           Rodgers         12.7
Bradshaw         13.8           Brees           12.4





That's how the passing game has changed, and it is no accident - it's happened because what wins the game has changed.

Yards-per-completion may be considered the great overlooked statistic of NFL football, certainly in historical analysis - the "killer stat" of the pre-1978 game, the single offensive stat that contributed by far the most to winning. After 1978, it has become steadily less so, but even during 2006-10 it has contributed a little more to winning than completion percentage.


[] Passer rating: Poor enough for today, awful for historical evaluations.

If yards-per-attempt neglects the importance of yards-per-completion, the official NFL QB passer rating system is openly hostile to it.

A completion that loses yards can increase a QB's passer rating. Taken as a single play, every completed pass that loses yards rates 79.2. So if a QB's rating is below 79, he can improve it by intentionally completing passes that lose yards.

On the 2010 passer rating list eight QBs were in this situation - a quarter of the league's starters. That's bad enough, but back in 1970 when the average rating was 62.5, fully 19 of the 24 rated QBs (79%) had ratings below 79 and could have improved their ratings by intentionally losing yards. (This is like a batter in baseball increasing his batting average by intentionally striking out.)
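For the record, here is the standard NFL passer rating formula and the single-play arithmetic behind that 79.2 figure:

    # The NFL passer rating formula: four components, each clamped to 0..2.375,
    # averaged and scaled to a 0-158.3 range.
    def passer_rating(att, comp, yards, tds, ints):
        def clamp(x):
            return max(0.0, min(2.375, x))
        a = clamp((comp / att - 0.3) * 5)      # completion percentage component
        b = clamp((yards / att - 3) * 0.25)    # yards-per-attempt component
        c = clamp((tds / att) * 20)            # touchdown-rate component
        d = clamp(2.375 - (ints / att) * 25)   # interception-rate component
        return (a + b + c + d) / 6 * 100

    # One completed pass that loses yards: the completion-percentage and
    # no-interception components max out, the yardage and TD components floor
    # at zero, and the play grades out at 79.2 all by itself.
    print(round(passer_rating(att=1, comp=1, yards=-3, tds=0, ints=0), 1))  # 79.2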

One can hardly think of a worse system to rate strong-armed QBs who help their teams most by throwing long passes for high Y/C while reducing their completion percentage.

Passer rating is always biased against long-throwers and in favor of short-throwers. This is not too bad when kept to the short-throwing era of today. But applying it backward in time produces ratings that are ever more distorted by (1) the increase in completion percentage over time, compounded by (2) the greater importance of long throwing, and lesser importance of completion percentage, to winning in earlier eras - even in then-current terms. It produces the totally absurd historical rankings noted in the first part of this story.

Applied back to the 1970s and 1960s it especially penalizes the deep throwers like Unitas and Namath, while giving a big boost to their short-throwing competitors such as Starr and Len Dawson.

Time to break again. Everything I write runs longer than expected.

Next: Joe Namath in the 2000s .... and the end.

2 comments:

James said...

I don't think it ever would have occurred to me that the relative importance of Y/C and C% would change over time. This has definitely opened my eyes.

Chase Stuart said...

Good stuff, Jim. But I think it's important to not treat history as a linear tale. The '60s were a lot different than the '70s, and in many ways, the '50s were more like the '00s. I think using "pre-1978" as a proxy for "1971 to 1975" is something to avoid. In my opinion, you'll be hard pressed to find a more unusual five-year period over the last 60 years than the period from 1971 to 1975. That, to me, is the big outlier in pro football history. The passing game plummeted in the early '70s, which is what led to the rule changes and the steady increases in passing production we have seen since then. But it's important not to confuse the early '70s with the 20 year period (even ignoring the AFL) from 1950 to 1969.
