Thursday, April 16, 2020

2019 Yards Per Play: Mountain West

This week we head west to try to shed some of our east coast bias. Welcome to the Mountain West review.

Here are the Mountain West standings.
So we know what each team achieved, but how did they perform? To answer that, here are the Yards Per Play (YPP), Yards Per Play Allowed (YPA) and Net Yards Per Play (Net) numbers for each Mountain West team. These numbers include conference play only; the championship game is not included. The teams are sorted by division, then by Net YPP, with overall conference rank in parentheses.
College football teams play either eight or nine conference games. Consequently, their record in such a small sample may not be indicative of their quality of play. A few fortuitous bounces here or there can be the difference between another ho-hum campaign and a special season. Randomness and other factors outside of our perception play a role in determining the standings. It would be fantastic if college football teams played 100 or even 1000 games. Then we could have a better idea about which teams were really the best. Alas, players would miss too much class time, their bodies would be battered beyond recognition, and I would never leave the couch. As it is, we have to make do with the handful of games teams do play.

In those games, we can learn a lot from a team's YPP. Since 2005, I have collected YPP data for every conference. I use conference games only because teams play such divergent non-conference schedules and the teams within a conference tend to be of similar quality. By running a regression analysis between a team's Net YPP (the difference between their Yards Per Play and Yards Per Play Allowed) and their conference winning percentage, we can see if Net YPP is a decent predictor of a team's record. Spoiler alert: it is. For the statistically inclined, the correlation coefficient between a team's Net YPP in conference play and their conference record is around .66.

Since Net YPP is a solid predictor of a team's conference record, we can use it to identify which teams had a significant disparity between their conference record as predicted by Net YPP and their actual conference record. I used a difference of .200 between predicted and actual winning percentage as the threshold for 'significant'. Why .200? It is a little arbitrary, but .200 corresponds to a difference of 1.6 games over an eight-game conference schedule and 1.8 games over a nine-game one. Over- or under-performing by more than a game and a half in a small sample seems significant to me. In the 2019 season, which teams in the Mountain West met this threshold? Here are Mountain West teams sorted by performance over what would be expected from their Net YPP numbers.
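If you want to try this at home, here is a minimal sketch of the calculation in Python. The numbers are made up for illustration; the real version would use every team's actual Net YPP and conference winning percentage going back to 2005.

```python
# A minimal sketch of the methodology described above, with made-up numbers.
# The real analysis uses every team's Net YPP and conference winning
# percentage, for every conference, since 2005.
from scipy.stats import linregress

# (Net YPP, conference winning percentage) pairs -- illustrative values only
teams = [
    (1.8, 0.875), (1.1, 0.750), (0.6, 0.625), (0.2, 0.500),
    (-0.1, 0.500), (-0.5, 0.375), (-1.0, 0.250), (-1.6, 0.125),
]

fit = linregress([t[0] for t in teams], [t[1] for t in teams])
print(f"correlation coefficient: {fit.rvalue:.2f}")  # about .66 on the real data

# Flag teams whose actual record differs from the YPP-predicted record
# by more than the .200 threshold discussed above.
for net_ypp, actual in teams:
    predicted = fit.intercept + fit.slope * net_ypp
    diff = actual - predicted
    marker = "  <-- significant" if abs(diff) > 0.200 else ""
    print(f"Net YPP {net_ypp:+.1f}: predicted {predicted:.3f}, "
          f"actual {actual:.3f}, diff {diff:+.3f}{marker}")
```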
Seven teams saw their actual record differ significantly from their expected record based on YPP. Boise State, Nevada, and Utah State exceeded their expected record while Colorado State, Fresno State, New Mexico, and San Jose State under-performed relative to their YPP numbers. Close-game record does a good job of explaining the over-performance. Boise State, Nevada, and Utah State combined to go 8-1 in one-score conference games. And while the Broncos went undefeated, the Wolfpack and Aggies were blown out in most of their league losses. Three of Nevada's four conference defeats came by at least 26 points, while both of Utah State's conference losses came by at least 24.

For the underachievers, Fresno State and San Jose State can blame close games, as they went a combined 2-6 in one-score conference games. New Mexico didn't play any conference games decided by fewer than eleven points, but they did post the worst in-conference turnover margin at -11. However, Colorado State is the real odd duck, or Ram if you will. They had the third-best per-play differential in the conference, but won just three of their eight league games. They were just 0-1 in one-score conference games and their turnover margin was underwater (-4), but hardly debilitating. I couldn't really come up with an explanation for their struggles. As it stands, Steve Addazio will likely be the beneficiary of their positive regression while Mike Bobo will have to settle for working for Will Muschamp.
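As an aside, checking those close-game claims is easy enough to do yourself. Here is a quick hypothetical sketch, with a one-score game defined as one decided by eight points or fewer.

```python
# A quick sketch of tallying one-score records (games decided by eight
# points or fewer) from a list of results. The games here are hypothetical.
from collections import defaultdict

# (winner, loser, winning score, losing score) -- illustrative only
games = [
    ("Team A", "Team B", 27, 24),
    ("Team C", "Team A", 35, 10),
    ("Team B", "Team C", 21, 17),
]

record = defaultdict(lambda: [0, 0])  # team -> [close wins, close losses]
for winner, loser, win_pts, lose_pts in games:
    if win_pts - lose_pts <= 8:  # one-score margin
        record[winner][0] += 1
        record[loser][1] += 1

for team, (wins, losses) in record.items():
    print(f"{team}: {wins}-{losses} in one-score games")
```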

Largest Average Discrepancy
You may have noticed this past season's Mountain West featured an abnormally large number of teams that saw their actual record differ significantly from their expected record based on YPP. With YPP data going back to 2005, I wanted to see if it had the largest average discrepancy (by absolute value). It did, narrowly edging out a conference from fifteen years ago. Before we get to splitting decimals though, here are the other conferences with the largest average disparity between their teams' actual record and expected record based on YPP.
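For the curious, here is roughly how such a leaderboard comes together: for each conference season, average the absolute gap between every team's actual and expected winning percentage, then rank the seasons. The expected records below are made up for illustration; the real inputs come from the regression described earlier.

```python
# A rough sketch of building the leaderboard: for each conference season,
# average the absolute gap between actual and YPP-expected winning
# percentage. Expected values here are made up for illustration.
from collections import defaultdict

# (conference, year, team, actual win pct, expected win pct)
rows = [
    ("Mountain West", 2019, "Boise State", 1.000, 0.780),
    ("Mountain West", 2019, "Colorado State", 0.375, 0.660),
    ("Sun Belt", 2011, "Arkansas State", 1.000, 0.740),
]

gaps = defaultdict(list)
for conference, year, team, actual, expected in rows:
    gaps[(conference, year)].append(abs(actual - expected))

leaderboard = sorted(
    ((sum(g) / len(g), conf, yr) for (conf, yr), g in gaps.items()),
    reverse=True,
)
for avg, conference, year in leaderboard:
    print(f"{year} {conference}: average discrepancy {avg:.4f}")
```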
The Sun Belt looked a lot different in 2011 than it does today. The conference had only nine teams, making it the smallest of our top five. With only nine teams, the high variance is also slightly less impressive, as a few large outliers can have an outsized influence on the average. However, even though the conference was only nine deep, more than half the teams saw their actual record differ from their expected record by more than .200 (the standard I use to rate a difference as 'significant').
If your memory of college football seasons runs together, 2015 was the year Michigan State stole the Big Ten title and a playoff bid from Ohio State. You can read the YPP recap here.
The Mountain West holds two of the top three spots on our list. This one is also recent enough that you can actually read the YPP recap here.
Then known as the Pac-10, the conference of champions is our surprise runner-up. The average difference was just .0001 less than this past year's Mountain West, our overall winner for largest average discrepancy.
In its twenty-year history, the Mountain West has had better years, but none where the standings and per-play differentials were so mismatched. Put that on a trophy!
