Cal Football 2012 Season in Review: Revisiting Our Predictions

Not one of these fans predicted a 3-win season. (Photo: Thearon W. Henderson)

2012 was an absolute tire fire of a season. Did any of us see it coming? Let's take a look back at our preseason and pregame predictions to find out!

Remember the excitement, the anticipation, and the hope we all had on August 31st, 2012? The season was one day away, Memorial Stadium was about to reopen, and the Bears were on their way to a 7 or 8-win season...or so we thought. In our semi-annual predictions, we asked you all to predict the Bears' chances of winning each game in the 2012 schedule. The results showed that the Bears would improve upon the 2011 season with 7 or 8 regular-season wins in 2012. That would give us a chance to go to a bowl to get our 8th or 9th win and generate some momentum heading into 2013.

That obviously didn't happen.

Nine losses later, we clearly misjudged this team. Let's take a look at just how wrong we were. Below we have our preseason predictions coupled with the outcome on the field.

Opponent       Preseason prediction   Result
Nevada         82.3%                  Loss
Southern Utah  98.0%                  Win
Ohio State     38.9%                  Loss
USC            22.9%                  Loss
ASU            75.0%                  Loss
UCLA           73.6%                  Win
WSU            68.9%                  Win
Furd           58.6%                  Don't remind me
Utah           56.8%                  Loss
UW             64.3%                  Loss
Oregon         34.6%                  Loss
Oregon State   63.0%                  Loss
Total          7.37 wins              3-9

So much for seven wins. We only won three of our five easiest games...and no others. We lost all the toss-up games and did not notch a single upset.
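The "Total" row, by the way, is just the expected number of wins: the sum of the individual game probabilities. A quick sketch, using the preseason numbers from the table above:

```python
# Preseason win probabilities, taken from the table above
preseason = [0.823, 0.980, 0.389, 0.229, 0.750, 0.736,
             0.689, 0.586, 0.568, 0.643, 0.346, 0.630]

# Expected wins = sum of the per-game win probabilities
expected_wins = sum(preseason)
print(round(expected_wins, 2))  # 7.37
```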

Comparing the predictions to the win-loss outcome is interesting, but rather simplistic. What if we compare the predictions to the score discrepancy in each game?

Predictions vs. Reality

The challenge here is determining what is "reality." Let's get metaphysical: one school of thought says you can use the difference between points scored in a game to judge how much better one team was than the other. If team A outscores team B 40-10, you could argue that team A would win 80% of their games against team B, as team A scored 80% of the points in the game. These predictions get rather wonky in blowout wins, however. If team A plays team B 10 times and averages winning 40-10 in every game, would you expect team B to have a 20% chance of winning the next game? That's a generous prediction, especially since A likely went into cruise control late in the games.

To avoid these unusual results in blowout wins and losses, we need to modify our equation a bit. Instead of calculating the chance that A beats B as (points scored by A / total points scored by A and B), we're going to use the following equation: A's win probability = A^2.37 / (A^2.37 + B^2.37). This strategy of raising the points scored to a certain power is known as the Pythagorean Projection. To read more about why we do this, go here. I'll spare the rest of you from any more unnecessary math. I'd like to thank reader PaulThomas for bringing this to my attention in last year's edition of this post.
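For the curious, here's a quick sketch of both models in Python, using the Nevada game's final score (a 31-24 loss) to recover the "reality" number that shows up in the tables below:

```python
def pythagorean_win_prob(points_for: float, points_against: float,
                         exponent: float = 2.37) -> float:
    """Pythagorean projection: estimated win probability from points scored."""
    pf = points_for ** exponent
    pa = points_against ** exponent
    return pf / (pf + pa)

# Naive points-ratio model vs. the Pythagorean version for the Nevada game
print(round(24 / (24 + 31), 3))               # 0.436 -- the simple ratio
print(round(pythagorean_win_prob(24, 31), 3)) # 0.353 -- the 35.3% "reality" below
```

Raising the scores to the 2.37 power pushes lopsided scores toward the extremes, which is exactly the fix for the blowout problem described above.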

We did an awful job of predicting wins and losses; how well did we predict the actual outcome, as defined by the Pythagorean projection?

Opponent       Preseason prediction   Reality     Result
Nevada         82.3%                  35.3%       Loss
Southern Utah  98.0%                  75.6%       Win
Ohio State     38.9%                  37.1%       Loss
USC            22.9%                  6.9%        Loss
ASU            75.0%                  25.0%       Loss
UCLA           73.6%                  90.0%       Win
WSU            68.9%                  80.6%       Win
Furd           58.6%                  1.0%        Don't remind me
Utah           56.8%                  19.6%       Loss
UW             64.3%                  24.3%       Loss
Oregon         34.6%                  5.0%        Loss
Oregon State   63.0%                  2.9%        Loss
Total          7.37 wins              4.03 wins   3-9

We were spot on with our predictions for the Ohio State game. Otherwise, we ranged from misguided to woefully inaccurate.

We overrated the Bears in ten of their twelve games. We underrated the Bears in their wins against UCLA and WSU.

We managed to overrate the Bears' chances by about 50% in four games: Nevada, ASU, Big Game, and Oregon State. We overrated the Bears by about 40% against Utah and Washington.

Overall, Pythagoras says we should have won about 4 games. Even he overrated the Bears this season.

Let's take a look at these results in chart form. Our preseason predictions are in blue and reality is the black line.

[Chart: preseason predictions (blue) vs. reality (black), game by game]

We consistently and slightly overrated the Bears over the first half of the season. We were way off track during the second half, once the team went into a tailspin. Look at the difference between predictions and reality over those final five games!

We obviously suffered from delusions of grandeur during the preseason. Did we come back to reality as the season went on?

Preseason vs Pregame Predictions


In our weekly report cards, we predicted the Bears' chances of winning the upcoming game. These predictions allowed us to revisit our thoughts on the upcoming opponent and reassess the Bears' chances as the season progressed.

Let's compare the preseason and pregame predictions.

Opponent       Preseason prediction   Pregame prediction   Reality     Result
Nevada         82.3%                  82.3%                35.3%       Loss
Southern Utah  98.0%                  74.3%                75.6%       Win
Ohio State     38.9%                  15.5%                37.1%       Loss
USC            22.9%                  30.0%                6.9%        Loss
ASU            75.0%                  42.3%                25.0%       Loss
UCLA           73.6%                  19.3%                90.0%       Win
WSU            68.9%                  72.6%                80.6%       Win
Furd           58.6%                  56.9%                1.0%        Don't remind me
Utah           56.8%                  40.2%                19.6%       Loss
UW             64.3%                  11.2%                24.3%       Loss
Oregon         34.6%                  2.9%                 5.0%        Loss
Oregon State   63.0%                  19.3%                2.9%        Loss
Total          7.37 wins              4.67 wins            4.03 wins   3-9

Once we lost the grand reopening of Memorial Stadium, we seemed to recognize that the Bears would not quite live up to expectations in 2012.

The season began with a whimper as we let Colin Kaepernick (er, that other Nevada QB) run wild on us. Our expectations for the rest of the season took an immediate nosedive. Over the rest of the season, only twice were we more optimistic about the Bears' chances than we were during the summer: we were slightly more optimistic about facing a Khaled Holmes-less USC and a Wazzu team that had yet to get its sea legs under Mike Leach. Except for the Big Game, our expectations for every other game grew much, much worse as the season wore on. The worst declines were 50%+ reductions in our chances of winning against the surprisingly competitive UCLA and Oregon State. UCLA managed to outderp us en route to a blowout loss, while the Oregon State game was an unmitigated disaster...just as we expected.

Except for the Ohio State, USC and UCLA games, our pregame predictions were much more accurate than our preseason predictions.

Based on our pregame predictions, we should have won about 4 or 5 games--an obvious decline from our preseason predictions. Here is the above table in chart form: look at the difference between some of those preseason (dark blue) and pregame (light blue) predictions!

[Chart: preseason (dark blue) vs. pregame (light blue) predictions, game by game]

The only time our pregame predictions returned to our preseason levels was after wins or other competitive performances--and those were few and far between. After the loss to Nevada, we immediately expected the Bears to fare much, much worse over the first half of the season. After consecutive wins over UCLA and Washington State, our optimism climbed back to preseason levels. That came to a sudden end after the Big Game fiasco. From that point onward we steadily recognized that this season was going to end horrifically, as the difference between our preseason and pregame predictions grew to depressing levels.

How did our predictions compare to the predictions from the moneymakers in Vegas?

Vegas' Predictions

While we tend to focus on the point spread Vegas issues for each game, we can directly calculate a win probability from the moneyline. I'll spare you the details on the exact conversion, but go here to see the formula if you're interested. Below we have included Vegas' predictions along with our own. Many thanks to PhilaBear for gathering and calculating the Vegas predictions. Were the oddsmakers more accurate than we were?
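As a rough sketch of that conversion (ignoring the bookmaker's vig, and assuming it roughly matches the linked formula), the standard way to turn an American moneyline into an implied win probability looks like this:

```python
def moneyline_to_prob(ml: int) -> float:
    """Convert an American moneyline to an implied win probability.

    Negative lines (favorites): risk |ml| to win 100.
    Positive lines (underdogs): risk 100 to win ml.
    """
    if ml < 0:
        return -ml / (-ml + 100)
    return 100 / (ml + 100)

print(moneyline_to_prob(-150))  # 0.6 -- a modest favorite
print(moneyline_to_prob(+200))  # ~0.333 -- a 2-to-1 underdog
```

Note that the raw implied probabilities for both sides of a game sum to more than 100% because of the vig, so a careful version (presumably like PhilaBear's) would normalize the two sides.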

Opponent       Preseason prediction   Pregame prediction   Vegas' prediction   Reality     Result
Nevada         82.3%                  82.3%                79.8%               35.3%       Loss
Southern Utah  98.0%                  74.3%                100%                75.6%       Win
Ohio State     38.9%                  15.5%                11.0%               37.1%       Loss
USC            22.9%                  30.0%                13.1%               6.9%        Loss
ASU            75.0%                  42.3%                50.0%               25.0%       Loss
UCLA           73.6%                  19.3%                43.6%               90.0%       Win
WSU            68.9%                  72.6%                72.2%               80.6%       Win
Furd           58.6%                  56.9%                43.6%               1.0%        Don't remind me
Utah           56.8%                  40.2%                48.4%               19.6%       Loss
UW             64.3%                  11.2%                60.7%               24.3%       Loss
Oregon         34.6%                  2.9%                 2.3%                5.0%        Loss
Oregon State   63.0%                  19.3%                16.8%               2.9%        Loss
Total          7.37 wins              4.67 wins            5.41 wins           4.03 wins   3-9

The Vegas predictions tended to be pretty accurate. Only twice did the Bears lose when Vegas favored them (Nevada, UW), and only once did they win as underdogs. That's about as accurate as our pregame predictions, where we were upset twice (Nevada, Big Game) and pulled the upset once (UCLA). Those are some respectable predictions.

Overall, Vegas expected the Bears to win about 5 or 6 games.

Here are the predictions in chart form:

[Chart: preseason, pregame, and Vegas predictions vs. reality, game by game]

Overall, the Vegas predictions seemed to be close to our pregame predictions.

In retrospect, our preseason predictions were wildly inaccurate. As the season dragged on, our predictions became more in touch with reality. It clearly did not take us long to realize this season would take a turn for the Ohio.

Awards!

How could we possibly spend all this time on these predictions without handing out some awards? We would like to recognize those of you with the most accurate and least accurate predictions. First, a word on how I handed out grades.

In a perfect game, Cal would defeat its opponent in a shutout. The most accurate prediction of this would be a 100% prediction of a Cal victory; the least accurate would be 0%. As a result, the furthest one's prediction can be from reality in a single game is 1.00. Over a perfect twelve-game season of shutouts, the furthest one's predictions can be from reality is 12.00, or 1.00 per game. The closer one's cumulative deviation is to 12, the worse the grade. With some basic math, we can turn the cumulative deviation into a grade. I followed that basic strategy, but I took the square root of each game's deviation, so even modest misses carried a noticeable penalty. Let's see how the best and the worst of you fared:
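Here's a minimal sketch of that grading scheme as described, assuming each game's deviation is the square root of the absolute gap between prediction and reality, and the grade is one minus the cumulative deviation divided by 12 (which reproduces the grades below: a 4.91 cumulative deviation grades out to 59.1%):

```python
import math

def season_grade(predictions, reality):
    """Grade = 1 - (cumulative deviation / 12), where each game's
    deviation is the square root of |prediction - reality|."""
    deviations = [math.sqrt(abs(p - r)) for p, r in zip(predictions, reality)]
    return 1 - sum(deviations) / 12

# Perfect predictions earn 100%...
print(season_grade([1.0] * 12, [1.0] * 12))  # 1.0
# ...while being maximally wrong in every game earns 0%.
print(season_grade([1.0] * 12, [0.0] * 12))  # 0.0
# Sanity check against the leaderboard: 1 - 4.91/12 = 59.1%
print(round(1 - 4.91 / 12, 3))  # 0.591
```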

The Ursadamus Award

The first award goes to those with the most accurate predictions over the course of the season. Look at that: no one earned a passing grade!

Ursadamus Award
Name Cumulative Deviation Grade
1. Boset 4.91 59.1%
2. giobear 4.97 58.6%
3. Don'thavesbnationID 4.98 58.5%
4. Slb 5.13 57.2%
5. Old Blue 1975 5.26 56.2%
6. BearBack_Mtn 5.54 53.8%
7. bluehenbear 5.58 53.5%
8. Nasal Mucus Goldenbear 5.64 53.0%
9. SoCal Oski 5.68 52.6%
10. Jerry 5.69 52.6%

Boset leads the way, followed by giobear and Don'thavesbnationID.

These scores would have been among the worst last season, but they're downright accurate for the topsy-turvy 2012 season. You all didn't quite anticipate a 3-win season, but your predictions tended to be a little less sunny than the rest of ours.

The Miss Cleo Award

Our second award goes to the least accurate among you: those who foresaw anything but a 3-win season.

Miss Cleo Award
Name Cumulative Deviation Grade
1. jiggets 9.40 21.7%
1. solarise 9.40 21.7%
3. c98 8.73 27.3%
4. CruzinBears 8.06 32.8%
5. alpha1906 7.74 35.5%
6. texashaterforlife 7.55 37.1%
7. fatoski 7.50 37.5%
8. Cugel 7.44 38.0%
9. Marshawn Rodgers 7.39 38.4%
10. floridabear 7.36 38.6%

Giving the Bears a 100% chance to win each of the games, jiggets and solarise were, by far, the least accurate of the bunch.

Don't let these embarrassingly low scores keep you from participating in the next round of season predictions!

Writers and Moderators

Finally, we have selected the ballots of your beloved mods and writers for CGB. Did our countless hours of writing about our team allow us to be more accurate in our predictions?

Mods
Name Overall Deviation Grade
41. TwistNHook 6.22 48.2%
53. iVinshe (Vincent S) 6.30 47.5%
71. Ohio Bear 6.41 46.6%
87. HydroTech 6.53 45.6%
92. LeonPowe 6.58 45.2%
94. unclesam22 6.59 45.1%
98. atomsareenough 6.61 44.9%
106. Berkelium97 6.66 44.5%
126. Kodiak 6.84 43.0%
168. solarise 9.40 21.7%

No, we fared pretty badly. TwistNHook managed to have the most accurate predictions, followed by Vincent S and Ohio Bear. Overall, we all sat somewhere in the middle of the pack, except for solarise, whose undying optimism carried him to the lowest possible spot.

Collectively, it looks like none of us saw this coming. We had a 1-in-200 chance of doing this badly. Hopefully we swing to the other end next season and achieve that 1-in-200 shot at a 14-win season. The Sunshine Pumpers will rise again!

Thanks for participating, everyone! We'll see you in a couple months for our first round of 2013 season predictions!
