
How Well Did Minitab Predict Fantasy Football Player Performance? Part 1

Written by Kevin Rudy | Jan 20, 2012 8:28:00 PM

Five weeks into the NFL season, I used Minitab's regression analysis to predict player performance for the rest of the year. Now that the regular season is over, it's time to go back and see how accurate the regression analysis was! I'm going to use a fitted line plot to compare my predictions and the players' final averages.

Back in October, I ranked the top 25 players at each position and predicted the average number of fantasy points per game they would finish the season with. So how accurate were those predictions? Minitab’s fitted line plot will let us see how close my predictions were, and will also give us a regression equation relating my predictions and the players’ final averages. If my predictions were accurate, the slope should be close to 1 and the y-intercept should be close to 0. So it should look like:

Final Average = 0 + 1 * Predicted Average

If that’s the equation, then the final average equals my prediction exactly! But of course I don’t expect them to be exact. So how close were they? Here are the results.

Note: There were 23 players who had played in fewer than 5 games at the time of the study. Because the predictive model is based on a player’s first 5 games, I went back to those players and took their averages from the first 5 games they played. Also, the final averages are the players’ fantasy points scored per game through week 16; they do not include week 17.
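
If you don't have Minitab handy, you can reproduce the same kind of fit yourself: a fitted line plot is just an ordinary least squares regression of one column on another, drawn with its fitted line. Here's a minimal Python sketch of that idea; the file name and column names are placeholders, not my actual data.

import pandas as pd
import statsmodels.api as sm
import matplotlib.pyplot as plt

# Placeholder file and column names -- swap in your own predicted and final averages
data = pd.read_csv("fantasy_averages.csv")   # columns: "Predicted", "Final"

# Ordinary least squares: Final Average = b0 + b1 * Predicted Average
X = sm.add_constant(data["Predicted"])       # adds the intercept term
fit = sm.OLS(data["Final"], X).fit()

print(fit.params)       # intercept and slope of the fitted line
print(fit.rsquared)     # r-squared (Minitab reports this as a percentage)

# The fitted line plot itself: the data plus the regression line
plt.scatter(data["Predicted"], data["Final"])
plt.plot(data["Predicted"], fit.fittedvalues, color="red")
plt.xlabel("Predicted Average (points/game)")
plt.ylabel("Final Average (points/game)")
plt.show()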

The r-squared value is 86.7%, which means that my predictions account for 86.7% of the variation in the players’ final averages. The other 13.3% is most likely accounted for by injuries, player trades, and just random variation. The equation is:

Final Average = -1.69 + 1.045 * Predicted Average

So the slope is just about 1, which is good. However, the intercept isn’t quite 0; it’s -1.69. We can plug numbers into the equation to see how much we’d expect my predictions to differ from the actual average.

• Prediction of 10 points per game: Final Average = -1.69 + 1.045 * 10 = 8.8 points/game
• Prediction of 15 points per game: Final Average = -1.69 + 1.045 * 15 = 14 points/game
• Prediction of 20 points per game: Final Average = -1.69 + 1.045 * 20 = 19.2 points/game
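
These are just the fitted equation evaluated at a few predictions. If you want to check the arithmetic yourself, a few lines of Python will do it; the function below is simply the equation above, not anything generated by Minitab.

def final_average(predicted):
    # Fitted equation: Final Average = -1.69 + 1.045 * Predicted Average
    return -1.69 + 1.045 * predicted

for points in (10, 15, 20):
    print(points, round(final_average(points), 1))   # about 8.8, 14.0, and 19.2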

So my regression model was high by about 1 point per game. Over the course of a 15-game season (again, no week 17 was included), that means I would overestimate a player’s final fantasy score by about 15 points. For example, my projection said Cam Newton would finish the season (through week 16) with about 358 points. He actually finished with 341. No, it wasn’t exact, but I’d say that’s close enough!

And more important than the actual projection is where the players were ranked. The regression model had Newton as the 2nd best quarterback. People were questioning whether Cam Newton could keep up his high-scoring pace for the entire season and finish as a top quarterback. The regression model showed that he would. And sure enough, Newton had one of the best rookie seasons ever and finished as the 3rd best quarterback.

But our model wasn't accurate for everybody. Remember when I said Ryan Fitzpatrick would be a top 10 quarterback, or when I said you should trade Roddy White before the rest of the league realized how badly he was doing? Oops. So next I'm going to break down the top 10 players at each position and see how many I was able to correctly predict.

Photograph by RonAlmog, licensed under Creative Commons Attribution-ShareAlike 2.0.