2015 Cal Football Season in Review: Did the Bears Meet Expectations?

We revisit our preseason predictions to determine whether Cal football met our expectations during the 2015 season.

Tom Pennington/Getty Images

Cal football has underachieved more often than not over the past decade.  After the waning Tedford years and the abysmal campaigns of 2012 and 2013, the program has started doing something quite unusual: meeting expectations.  During the summer of 2014 I was quite surprised to find that the CGB community predicted a whopping 5 wins for the 2014 season.  After the eye-gougingly bad 2013 season, 5 wins would have been a tremendous improvement.  Equally impressive was that the team delivered that improvement and met our preseason expectations.

This past summer we forecast another substantial improvement: 7 regular-season wins.  And for the second year in a row we met expectations.  Sure, we could have done better and avoided that midseason slide, just like the 2014 team could have played a bit better and achieved bowl eligibility.  But those would have been overachievements.  After the past decade, I'm happy enough when the team meets expectations.

With the 2015 season behind us, it's time to reflect on the season.  Was it a very good year? A mediocre year? A bad year (get outta here!)?

Let's go back to this past summer.  Those were simpler times: the Dow was above 18,000, Donald Trump seemed like a fringe candidate, and the men's basketball team was projected to be a top-15 team (too soon?).  Back then I asked you all to predict the Bears' chance of success in each of their games.  From 0% to 100%, you all indicated how likely you thought a Cal victory was.  Below I list the results for each game along with the outcome on the field.

Opponent Win Chance Outcome
Grambling State (9-3) 97.0% W (73-14)
San Diego State (11-3) 83.8% W (35-7)
at Texas (5-7) 53.6% W (45-44)
at Washington (7-6) 61.1% W (30-24)
Washington State (9-4) 75.7% W (34-28)
at Utah (10-3) 47.4% L (24-30)
at UCLA (8-5) 41.6% L (24-40)
USC (8-6) 36.8% L (21-27)
at Oregon (9-4) 26.6% L (28-44)
Oregon State (2-10) 75.0% W (54-24)
at LSJU (0-14) 56.1% L (22-35)
Arizona State (6-7) 49.9% W (48-46)
Total 7.041 wins 7 wins

7.04 projected wins followed by 7 realized wins!  Notice that the Bears won every single game in which we favored them to win, except for the Big Game, whose predictions are usually biased upwards due to the abundance of 100% win chance predictions. We also notched the most minor upset possible with the victory over ASU.  I'm not complaining, however--I'll take any upset we can get!

When revisiting our preseason predictions we see that the four-game slide wasn't completely unforeseen.  We forecast losses in each of those four games, although I don't think we would have predicted that UCLA and USC would combine for 11 losses.  On the other hand, I don't think we expected 30 wins among San Diego State, Wazzu, and Utah.  Still, failing to score more than 3 TDs per game in 6 of our 9 conference games is a disappointment.  Fortunately we were able to rely on a much-improved defense that allowed the team to stay competitive even when the offense struggled.  Good and bad.  Ups and downs. Everything generally evened out to match our preseason expectations.  And once again, after the brutal 2010-2013 stretch, meeting expectations is something worth celebrating.

While it's fun to revisit these preseason predictions, evaluating these predictions with binary win-loss outcomes is a bit simplistic.  Let's look at these predictions with a bit more precision.

Predictions vs. Reality

Wiser men (and women) than I have long pondered the nature of reality.  What is reality? What is? What? If we want to move beyond comparing our predictions to the win-loss outcome, we need a more precise measure of reality.  Fortunately I have one answer to the question "What is reality?".  Enter the Pythagorean projection.

One could make a very reasonable argument that large margins of victory suggest that the winning team is more likely to defeat its opponent in a rematch than if the victor wins narrowly.  The Pythagorean projection is based on this argument and uses final scores to estimate win probabilities.

                               (POINTS SCORED)^2.37
Win likelihood =~  ---------------------------------------------
                    (POINTS SCORED)^2.37 + (POINTS ALLOWED)^2.37
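As a quick sketch of the arithmetic (this Python snippet is my illustration, not part of the original analysis), plugging in the 45-44 final from the Texas game reproduces the ~0.513 "reality" value that shows up in the tables below:

```python
def pythagorean_win_likelihood(points_scored, points_allowed, exponent=2.37):
    """Estimate win likelihood from a game's final score via the
    Pythagorean projection (college football exponent of 2.37)."""
    ps = points_scored ** exponent
    pa = points_allowed ** exponent
    return ps / (ps + pa)

# Cal's 45-44 win at Texas: a one-point margin barely nudges past a coin flip
print(round(pythagorean_win_likelihood(45, 44), 3))  # -> 0.513
```

Note how the exponent rewards margin: the 54-24 Oregon State blowout works out to about 0.872, while the one-point Texas thriller is treated as a near-tossup.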

For each of our games I calculated the Pythagorean projection and compared it to our preseason predictions.  In the plot below I compare our preseason predictions for each game (blue) to the results from the Pythagorean projection (gray).  Anything above the black line at 50% represents a win while anything below that line represents a loss.  If a blue point (prediction) overlaps a gray point (reality), then the outcome matched our predictions.  If the gray point is higher than the blue point, then reality exceeded our predictions and the team overachieved.  Likewise, if the gray point is lower than the blue point, then the team underachieved relative to our predictions.

Preseason Predictions

Over the first four games, the team roughly matched or exceeded expectations.  Then we underachieved against Wazzu (although we wouldn't have guessed that the Cougs would be a 9-win team).  The team underachieved in losses to Utah and UCLA but met expectations in losses to USC and Oregon.  Of course, that didn't make stomaching the losses any easier.  Fortunately the Bears righted the ship with an overachieving win against Oregon State.  We then suffered the most disappointing loss of the year against the Lobsterbacks.  A win against ASU ended the season on a high note and allowed the team to match our preseason win predictions.  Compared to the pregame predictions, the team did a great job of meeting expectations.  Of course, many of us likely recalibrated our expectations after a strong start to the season.  How well did the team meet our revised expectations?

Pregame Predictions

During our weekly report cards we asked readers to estimate Cal's chances of winning the next game.  While we mostly make educated guesses in the preseason predictions, we learned more about ourselves and our opponents over the course of the season.  As such, our weekly predictions may be more accurate reflections of our expectations of the team (for example, we probably all revised our expectations upwards after taking a 45-24 lead over the Longhorns).  That or our pregame expectations represent wild overreactions to our wins or losses because we're all emotionally unstable after years of debilitating Cal fandom.  In any case, we can compare the pregame expectations to the results on the field.

In the plot below I've added our pregame predictions (dark blue) to complement our preseason predictions (lighter blue) and reality (gray).

After destroying our first two opponents with an impressive combination of offense and defense, our expectations climbed notably.  Cathartic wins over Texas and UW further bumped our expectations upward.  Despite the 5-0 start, our pregame predictions for the Utah game dipped slightly thanks to the Utes' top-five ranking.  Following a narrow loss to the Utes, we were quite optimistic against UCLA.  And then things went terribly, terribly wrong.

After the loss to the Bruins, our expectations went into a death spiral and sank low enough for a 6-point loss to USC to count as a slight overachievement.  A still-sloppy Ducks team bolstered our expectations of beating Oregon, and another lousy performance (this time against the Ducks) caused our pregame predictions against OSU to dip substantially.  Despite lowering our expectations against the Lobsterbacks, we still failed to meet them.  Fortunately that pessimism allowed us to exceed expectations against ASU.  Despite a pleasant finish to the season, we consistently failed to meet pregame expectations throughout much of the year.  Compared to our pregame predictions we met expectations once, overachieved four times, and underachieved seven times.  Victims of our own success, we never quite lived up to the expectations set in September.  No wonder the discussions in November about Dykes' potential extension became so contentious...

How Well Did Vegas Predict Our Season?

Finally, we compare our predictions to those from Vegas, as derived from the betting lines.  Many thanks to PhilaBear for calculating the Vegas predictions and sending them along to me.  I've plotted the Vegas predictions below in gold.

Vegas was surprisingly close to our pregame predictions over the first half of the season.  Then, after overrating our chances against the Bruins, Vegas was remarkably close to reality over the final five games.  After regularly underrating the Bears during the 2014 season (which was reasonable following a 1-11 campaign), Vegas was much closer to reality this season.  In addition to the charts above, I've provided the data in a table below.

Reality CGB Preseason CGB Pregame Vegas
Grambling 0.980 0.970 0.970 1.000
SDSU 0.978 0.838 0.864 0.820
Texas 0.513 0.536 0.714 0.714
UW 0.629 0.611 0.691 0.667
WSU 0.613 0.757 0.845 0.880
Utah 0.371 0.474 0.440 0.317
UCLA 0.230 0.416 0.655 0.403
USC 0.355 0.368 0.240 0.345
Oregon 0.255 0.266 0.502 0.357
OSU 0.872 0.750 0.597 0.933
LSJU 0.250 0.561 0.492 0.222
ASU 0.525 0.499 0.444 0.644
Total Wins 6.573 7.045 7.453 7.303

While it's fun to compare the various predictions with reality, this would be more satisfying if we could crown a winner.  Three predictions entered; only one can reign supreme.  So let's pick a winner.  The method to determine a winner is pretty simple: I measure each prediction's deviation from reality and add those deviations up over the course of the season.  To penalize predictions that are further from reality, I square the difference (so if a prediction is off by 5 it's penalized by 25, while if it's off by 10 it's penalized by 100).  In the table below I list reality and the deviation for each prediction.  For each of the predictions, smaller values indicate predictions that were closer to reality.  For each game I've bolded and starred the prediction that was closest to reality.
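To make the scoring concrete, here's a small sketch (my illustration, using the Texas row's numbers, with win chances on a 0-100 scale):

```python
def squared_deviation(prediction, reality):
    """Penalty for a single game: the squared gap between a predicted
    win chance and the Pythagorean 'reality', both on a 0-100 scale."""
    return (prediction - reality) ** 2

# Texas game: reality 51.33; CGB preseason said 53.6, CGB pregame said 71.4.
# Squaring means the pregame miss (~9x larger) costs ~78x more.
print(round(squared_deviation(53.6, 51.33), 2))  # small miss -> ~5.15
print(round(squared_deviation(71.4, 51.33), 2))  # big miss -> ~402.8
```

The small differences from the table values (5.17 and 403.07) come from feeding in the rounded numbers shown above rather than the unrounded originals.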

Reality CGB Preseason Predictions CGB Pregame Predictions Vegas Predictions
Grambling (W, 73-14) 98.04 *1.18* *1.18* 3.83
SDSU (W, 35-7) 97.84 197.13 *131.79* 251.55
Texas (W, 45-44) 51.33 *5.17* 403.07 403.90
UW (W, 30-24) 62.92 *3.21* 37.59 14.03
WSU (W, 34-28) 61.30 *208.28* 538.94 713.90
Utah (L, 24-30) 37.08 106.13 47.96 *28.44*
UCLA (L, 24-40) 22.96 346.61 1812.49 *301.51*
USC (L, 21-27) 35.53 1.58 132.19 *1.11*
Oregon (L, 28-44) 25.52 *1.11* 607.15 103.98
OSU (W, 54-24) 87.24 150.87 756.49 *37.19*
LSJU (L, 22-35) 24.97 970.54 586.20 *7.53*
ASU (W, 48-46) 52.52 *6.94* 65.25 141.45
Total Deviation NA *1998.76* 5120.30 2008.42

Our preseason predictions were closest to reality 6 times, Vegas was closest to reality 5 times, and our pregame predictions were most accurate twice (the first two games, interestingly enough).  As I noted above, Vegas was remarkably accurate in October and November, as it was most accurate in 5 of our final 7 games. After adding up all the squared deviations, our preseason predictions narrowly beat Vegas as the most accurate set of predictions.  The volatility of our pregame predictions ensured that they were much, much less accurate than our preseason predictions or Vegas' predictions.  I don't know whether we should be proud or embarrassed that we were only slightly more accurate than a bunch of bookies who don't know much of anything about our team.

A chart of the above table helps indicate the magnitude of how close or far we were from reality over the course of the season (look at that UCLA game!).

Let's continue this trend of recognizing winners (and lovable losers) by handing out some awards!

Awards

While I was revisiting our preseason predictions, I calculated how far each participant was from reality.  The scores take into account the difference between predictions and reality for each game, and the final scores are based on your garden-variety standard deviation statistic.  Now let's recognize the best and the worst of the bunch.

Ursadamus Award

First we recognize those whose predictions were closest to reality. gamedaytribe was closest to reality, followed by daveman and Nick Kranz, who shows that we CGB writers aren't completely out of touch. Congrats to the rest of you who finished in the top ten.  Presumably you're all satisfied with how the season progressed, as it most closely matched your predictions.  Is that a reasonable assumption, or did your expectations climb (or decline) over the season?

Name Deviation
1. gamedaytribe 0.063
2. daveman 0.090
3. Nick Kranz 0.106
4. Mallrat92204 0.109
5. jodjeoa 0.111
6. CreepSwenson 0.118
7. gubear 0.120
8. Gandu Jee 0.120
9. shame on me 0.127
10. NoBetterThanSolid 0.132

Next we take a look at the other end of the spectrum.

Miss Cleo Award

Herzon_Alfaro(AzusaCA91702) leads the way, followed by Fiat Lux and DavidsonBear.  Herzon gave the Bears a 100% chance of winning each game, while most of the others posted similarly lofty predictions.  One of these days Cal will post a shutout in every single game and you 100%ers will win the Ursadamus Award.  Although it may be a while before that happens...

Name Deviation
1. Herzon_Alfaro(AzusaCA91702) 3.289
2. Fiat Lux 3.203
3. DavidsonBear 2.592
4. Maxdarkfire 2.005
5. Ghost of Joe Roth 1.806
6. Norton1982 1.730
7. Old Bear 71 1.579
8. Oski Disciple 1.561
9. So Cal Bear 23 1.539
10. willstevens 1.294
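As a sanity check on these deviation scores, here's a sketch of one plausible scoring rule (an assumption on my part, not necessarily the exact calculation used): summing squared per-game deviations on a 0-1 scale lands within rounding of the all-100% score at the top of this table.

```python
# Pythagorean "reality" values from the earlier table, on a 0-1 scale.
reality = [0.980, 0.978, 0.513, 0.629, 0.613, 0.371,
           0.230, 0.355, 0.255, 0.872, 0.250, 0.525]

def participant_score(predictions, reality):
    """Sum of squared per-game deviations; lower is better.
    (An assumed scoring rule, for illustration only.)"""
    return sum((p - r) ** 2 for p, r in zip(predictions, reality))

# A 100%er who predicted a certain Cal win in every game:
all_in = [1.0] * 12
print(participant_score(all_in, reality))  # lands within rounding of 3.289
```

Under this rule, blowout wins like Grambling cost a 100%er almost nothing, while the four losses predicted at 100% each contribute roughly half a point of penalty apiece.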

Final Thoughts

For the second year in a row the team came remarkably close to meeting our preseason expectations.  So why was there so much hand-wringing in October and November?  The team's overachievement during September and the 5-0 start caused many of us to recalibrate our expectations.  After such a strong start, a 2-5 finish was tough to accept, especially when the offense sputtered.  Once again, how satisfied you were with the season likely depended on how much you were willing to deviate from your preseason predictions.  If you kept expecting a 7-win season, even after a 5-0 start, then you were probably satisfied with the season.  If you started expecting 8, 9, 10, or 15 wins, then the 7-win season was probably tougher to accept.  Fortunately a bowl win seemed to cure most of our grumbling.  Thank Oski, as basketball season has been rough this year.  CGB would be a grim place if both basketball and football failed to meet expectations.

With spring ball only a few weeks away, the Bears will be back on the (practice) field soon.  And that means we'll be predicting the number of wins in the 2016 season soon.  Thanks to all of you who participated in this past season's predictions!  Check back in April for our first round of 2016 predictions!