Not one of these fans predicted a 3-win season - Thearon W. Henderson
2012 was an absolute tire fire of a season. Did any of us see it coming? Let's take a look back at our preseason and pregame predictions to find out!
Remember the excitement, the anticipation, and the hope we all had on August 31st, 2012? The season was one day away, Memorial Stadium was about to reopen, and the Bears were on their way to a 7- or 8-win season...or so we thought. In our semi-annual predictions, we asked you all to predict the Bears' chances of winning each game on the 2012 schedule. The results suggested the Bears would improve upon the 2011 season with 7 or 8 regular-season wins in 2012. That would give us a chance to go to a bowl, get our 8th or 9th win, and generate some momentum heading into 2013.
That obviously didn't happen.
Nine losses later, we clearly misjudged this team. Let's take a look at just how wrong we were. Below we have our preseason predictions coupled with the outcome on the field.
|Opponent||Preseason prediction||Result|
|Furd||58.6%||Don't remind me|
So much for seven wins. We only won three of our five easiest games...and no others. We lost all the toss-up games and did not notch a single upset.
Comparing the predictions to the win-loss outcome is interesting, but rather simplistic. What if we compare the predictions to the score discrepancy in each game?
Predictions vs. Reality
The challenge here is determining what "reality" is. Let's get metaphysical: one school of thought says you can use the points scored in a game to judge how much better one team was than the other. If team A outscores team B 40-10, you could argue that team A would win 80% of its games against team B, since team A scored 80% of the points in the game. These predictions get rather wonky in blowouts, however. If team A plays team B 10 times and wins 40-10 every time, would you expect team B to have a 20% chance of winning the next game? That's a generous prediction, especially since A likely went into cruise control late in those games.
To avoid these unusual results in blowout wins and losses, we need to modify our equation a bit. Instead of calculating the chances that A beats B as (points scored by A / total points scored by A and B), we're going to use the following equation: A's win probability = A^2.37 / (A^2.37 + B^2.37). This strategy of raising the points scored to a certain power is known as the Pythagorean projection. To read more about why we do this, go here. I'll spare the rest of you from any more unnecessary math. I'd like to thank reader PaulThomas for bringing this to my attention in last year's edition of this post.
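For the spreadsheet-inclined, here's a minimal sketch of the projection described above. The 2.37 exponent comes straight from the formula; the function name is mine:

```python
def pythagorean_win_prob(points_a, points_b, exponent=2.37):
    """Estimate team A's chance of beating team B from the points
    each side scored, using the Pythagorean projection with the
    2.37 exponent used in this post."""
    a = points_a ** exponent
    b = points_b ** exponent
    return a / (a + b)

# The naive points ratio for a 40-10 blowout is 40 / (40 + 10) = 0.80.
# The Pythagorean projection instead credits the winner with roughly
# a 96% chance, leaving the blown-out team far less than the overly
# generous 20% the naive ratio implies.
blowout = pythagorean_win_prob(40, 10)
```

Note that a 40-10 result and a 4-1 result produce different probabilities under this formula, since the exponent cares about the ratio of points, not the margin.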
We did an awful job of predicting wins and losses; how well did we predict the actual outcome, as defined by the Pythagorean projection?
|Opponent||Preseason prediction||Reality||Result|
|Furd||58.6%||1.0%||Don't remind me|
|Total||7.37 wins||4.03 wins||3-9|
We were spot on with our predictions for the Ohio State game. Otherwise, we ranged from misguided to woefully inaccurate.
We overrated the Bears in ten of their twelve games. We underrated the Bears in their wins against UCLA and WSU.
We managed to overrate the Bears' chances by about 50% in four games: Nevada, ASU, Big Game, and Oregon State. We overrated the Bears by about 40% against Utah and Washington.
Overall, Pythagoras says we should have won about 4 games. Even he overrated the Bears this season.
Let's take a look at these results in chart form. Our preseason predictions are in blue and reality is the black line.
We consistently and slightly overrated the Bears over the first half of the season. We were way off track during the second half, once the team went into a tailspin. Look at the difference between predictions and reality over those final five games!
We obviously suffered from delusions of grandeur during the preseason. Did we come back to reality as the season went on?
Preseason vs Pregame Predictions
Each week in our report cards, we predicted the Bears' chances of winning the upcoming game. These predictions allowed us to revisit our thoughts on the upcoming opponent and reassess the Bears' chances as the season progressed.
Let's compare the preseason and pregame predictions.
|Opponent||Preseason prediction||Pregame prediction||Reality||Result|
|Furd||58.6%||56.9%||1.0%||Don't remind me|
|Total||7.37 wins||4.67 wins||4.03 wins||3-9|
Once we lost the grand reopening of Memorial Stadium, we seemed to recognize that the Bears would not quite live up to expectations in 2012.
The season began with a whimper as we let Colin Kaepernick, er, that other Nevada QB run wild on us. Our expectations for the rest of the season took an immediate nosedive. Over the rest of the season, only twice were we more optimistic about the Bears' chances than we were during the summer: we were slightly more optimistic about facing a Khaled Holmes-less USC and a Wazzu team that had yet to get its sea legs under Mike Leach. Except for the Big Game, our expectations grew much, much worse as the season wore on. The worst declines were 50%+ reductions in our chances of winning against the surprisingly competitive UCLA and Oregon State. UCLA managed to outderp us en route to a blowout loss, while the Oregon State game was an unmitigated disaster...just as we expected.
Except for the Ohio State, USC and UCLA games, our pregame predictions were much more accurate than our preseason predictions.
Based on our pregame predictions, we should have won about 4 or 5 games--an obvious decline from our preseason predictions. Here is the above table in chart form: look at the difference between some of those preseason (dark blue) and pregame (light blue) predictions!
How did our predictions compare to the predictions from the moneymakers in Vegas?
While we tend to focus on the point spread Vegas issues for each game, we can directly calculate a win probability from the moneyline. I'll spare you the details on the exact conversion, but go here to see the formula if you're interested. Below we have included Vegas' predictions along with our own. Many thanks to PhilaBear for gathering and calculating the Vegas predictions. Were the oddsmakers more accurate than we were?
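For the curious, here is a sketch of the standard conversion from an American moneyline to a win probability. This is the commonly used formula, not necessarily PhilaBear's exact spreadsheet, and the function names are mine:

```python
def moneyline_to_prob(ml):
    """Implied win probability from an American moneyline.
    Negative lines mark favorites (e.g. -150), positive lines
    mark underdogs (e.g. +130)."""
    if ml < 0:
        return -ml / (-ml + 100)
    return 100 / (ml + 100)

def no_vig_prob(ml_team, ml_opp):
    """The two sides' implied probabilities sum to more than 1
    because of the bookmaker's cut (the vig); normalizing them
    so they sum to 1 strips it out."""
    p = moneyline_to_prob(ml_team)
    q = moneyline_to_prob(ml_opp)
    return p / (p + q)
```

For example, a -150 favorite implies a 60% raw chance; pairing it with a +130 underdog and removing the vig nudges that down to roughly 58%.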
|Opponent||Preseason prediction||Pregame prediction||Vegas' prediction||Reality||Result|
|Furd||58.6%||56.9%||43.6%||1.0%||Don't remind me|
|Total||7.37 wins||4.67 wins||5.41 wins||4.03 wins||3-9|
The Vegas predictions tended to be pretty accurate. Only twice did the Bears lose when they were favored to win (Nevada, UW), and only once did they win a game in which they were underdogs. That's about as accurate as our pregame predictions, by which we were upset twice (Nevada, Big Game) and pulled off the upset once (UCLA). Those are some respectable predictions.
Overall, Vegas expected the Bears to win about 5 or 6 games.
Here are the predictions in chart form:
Overall, the Vegas predictions seemed to be close to our pregame predictions.
In retrospect, our preseason predictions were wildly inaccurate. As the season dragged on, our predictions came more in touch with reality. It clearly did not take us long to realize this season would take a turn for the Ohio.
How could we possibly spend all this time on these predictions without handing out some awards? We would like to recognize those of you with the most accurate and least accurate predictions. First, a word on how I handed out grades.
In a perfect game, Cal would defeat its opponent in a shutout. The most accurate prediction of this would be a 100% prediction of a Cal victory; the least accurate would be 0%. As a result, the furthest one's prediction could be from reality in any single game is 1.00. Over a 12-game season, then, the furthest one's predictions could be from reality is 12.00, 1.00 for each game. The closer one's cumulative deviation is to 12, the worse the grade. With some basic math, we can turn the cumulative deviation into a grade. I followed the basic strategy I explained, but I took the square root of the deviation for each game, so any prediction was penalized more harshly as it strayed from reality. Let's see how the best and the worst of you fared:
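Before the awards, here is the grading scheme in code. This is my reconstruction from the description above; the grade formula (12 minus the cumulative deviation, over 12) matches the score/grade pairs in the tables below, e.g. a 5.26 deviation mapping to a 56.2% grade:

```python
from math import sqrt

def season_deviation(predictions, realities):
    """Cumulative deviation over a season: for each game, take the
    square root of the gap between the predicted win probability and
    the Pythagorean 'reality'. Each game contributes at most 1.00,
    so a 12-game season caps out at 12.00."""
    return sum(sqrt(abs(p - r)) for p, r in zip(predictions, realities))

def grade(deviation, n_games=12):
    """Map cumulative deviation onto a percentage grade: zero
    deviation earns 100%, the worst possible (12.00) earns 0%."""
    return (n_games - deviation) / n_games
```

The square root makes even small per-game misses expensive: being off by 0.25 in one game costs 0.50 toward the cumulative total.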
The Ursadamus Award
The first award goes to those with the most accurate predictions over the course of the season. Look at that: no one earned a passing grade!
|5. Old Blue 1975||5.26||56.2%|
|8. Nasal Mucus Goldenbear||5.64||53.0%|
|9. SoCal Oski||5.68||52.6%|
Boset leads the way, followed by giobear and Don'thavesbnationID.
These scores would have been among the worst last season, but they're downright accurate for the topsy-turvy 2012 season. You all didn't quite anticipate a 3-win season, but your predictions tended to be a little less sunny than the rest of ours.
The Miss Cleo Award
Our second award goes to the least accurate among you: those who foresaw anything but a 3-win season.
|Miss Cleo Award|
|9. Marshawn Rodgers||7.39||38.4%|
Giving the Bears a 100% chance to win each of the games, jiggest and solarise were, by far, the least accurate of the bunch.
Don't let these embarrassingly low scores keep you from participating in the next round of season predictions!
Writers and Moderators
Finally, we have selected the ballots of your beloved mods and writers for CGB. Did our countless hours of writing about our team allow us to be more accurate in our predictions?
|53. iVinshe (Vincent S)||6.30||47.5%|
|71. Ohio Bear||6.41||46.6%|
No, we fared pretty badly. TwistNHook managed to have the most accurate predictions, followed by Vincent S and Ohio Bear. Overall, we all sat somewhere in the middle of the pack, except for solarise, whose undying optimism carried him to the lowest possible spot.
Collectively, it looks like none of us saw this coming. We had a 1-in-200 chance of doing this badly. Hopefully we swing to the other end next season and achieve that 1-in-200 shot at a 14-win season. The Sunshine Pumpers will rise again!
Thanks for participating, everyone! We'll see you in a couple months for our first round of 2013 season predictions!