# Who Has the Best Fantasy Football Projections? 2017 Update

In prior posts, we demonstrated how to download projections from numerous sources, calculate custom projections for your league, and compare the accuracy of different sources of projections (2013, 2014, 2015, 2016). In the latest installment of our annual series, we hold the forecasters accountable and see who had the most and least accurate fantasy football projections over the last five years.

## The R Script

You can download the R script for comparing the projections from different sources here. You can download the historical projections and performance using our Projections tool.

To compare the accuracy of the projections, we use the following metrics:

- R-squared (R^2) – higher is better
- Mean absolute scaled error (MASE) – lower is better
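Our scripts are written in R, but the two metrics are simple enough to sketch in a few lines. Here is a minimal Python version; the sample numbers are made up, and the MASE benchmark shown (scaling by the mean absolute deviation around the mean of actual points) is one common choice for cross-sectional data, not necessarily the exact benchmark our R script uses.

```python
import numpy as np

def r_squared(projected, actual):
    """Squared Pearson correlation between projected and actual points."""
    corr = np.corrcoef(projected, actual)[0, 1]
    return corr ** 2

def mase(projected, actual):
    """Mean absolute scaled error: MAE of the projections divided by the
    MAE of a naive benchmark (here, the mean of the actual points)."""
    mae_model = np.mean(np.abs(actual - projected))
    mae_naive = np.mean(np.abs(actual - np.mean(actual)))
    return mae_model / mae_naive

# Hypothetical season-point projections vs. actuals for four players.
projected = np.array([280.0, 210.0, 150.0, 95.0])
actual = np.array([260.0, 230.0, 140.0, 110.0])

print(round(r_squared(projected, actual), 3))  # 0.947 (higher is better)
print(round(mase(projected, actual), 3))       # 0.271 (lower is better)
```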

## Whose Predictions Were the Best?

| Source | 2012 (R^2 / MASE) | 2013 (R^2 / MASE) | 2014 (R^2 / MASE) | 2015 (R^2 / MASE) | 2016 (R^2 / MASE) | Average (R^2 / MASE) |
|---|---|---|---|---|---|---|
| Fantasy Football Analytics: Average | .670 / .545 | .567 / .635 | .618 / .577 | .626 / .553 | .645 / .535 | .625 / .569 |
| Fantasy Football Analytics: Robust Average | .667 / .549 | .561 / .636 | .613 / .581 | .628 / .554 | .644 / .536 | .623 / .571 |
| Fantasy Football Analytics: Weighted Average | | | | .626 / .553 | .645 / .535 | .636 / .544 |
| CBS Average | .637 / .604 | .479 / .722 | .575 / .632 | .500 / .664 | .559 / .625 | .550 / .649 |
| ESPN | .576 / .669 | .500 / .705 | .498 / .723 | .615 / .585 | .630 / .551 | .564 / .647 |
| FantasyData | | | | | .531 / .639 | .531 / .639 |
| FantasyFootballNerd | | .370 / .785 | .281 / .767 | .501 / .641 | | .384 / .731 |
| FantasyPros | | | .613 / .572 | .608 / .585 | .610 / .561 | .610 / .573 |
| FantasySharks | | | | .529 / .673 | .606 / .592 | .568 / .633 |
| FFtoday | .661 / .551 | .550 / .646 | .530 / .659 | .546 / .626 | .574 / .618 | .572 / .620 |
| NFL.com | .551 / .650 | .505 / .709 | .518 / .692 | .582 / .632 | .605 / .584 | .552 / .653 |
| WalterFootball | | | .472 / .713 | .431 / .724 | .483 / .718 | .462 / .718 |
| Yahoo | | | .547 / .645 | .635 / .554 | .624 / .562 | .602 / .587 |

Here is how the sources ranked in average accuracy, from most to least accurate:

- Fantasy Football Analytics: Average (or Weighted Average)
- Fantasy Football Analytics: Robust Average
- FantasyPros
- Yahoo (ProFootballFocus)
- FFtoday
- FantasySharks
- FantasyData
- ESPN
- CBS
- NFL.com
- WalterFootball
- FantasyFootballNerd

Notes: CBS estimates were averaged across Jamey Eisenberg and Dave Richard. FantasyFootballNerd projections include only their free projections (not their full subscription projections). We did not calculate the weighted average prior to 2015. The accuracy estimates may differ slightly from those provided in prior years because a) we now use standard league scoring settings (you can see the league scoring settings we used here) and b) we are only examining the following positions: QB, RB, WR, and TE. The weights for the weighted average were based on historical accuracy (1-MASE). For the analysts not included in the accuracy calculations, we calculated the average (1-MASE) value and subtracted 1/2 the standard deviation of (1-MASE). The weights in the weighted average for 2016 were:

- CBS Average: .344
- ESPN: .329
- FantasyData: .428
- FantasySharks: .327
- FFToday: .379
- NFL.com: .329
- WalterFootball: .281
- Yahoo Sports: .400
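The weighted-average calculation described in the notes can be sketched as follows. The weights are the published 2016 values above; the per-player projections are hypothetical, and the function is an illustrative weighted mean, not the exact code from our R script.

```python
# Published 2016 weights (based on historical 1 - MASE).
weights = {
    "CBS Average": 0.344, "ESPN": 0.329, "FantasyData": 0.428,
    "FantasySharks": 0.327, "FFToday": 0.379, "NFL.com": 0.329,
    "WalterFootball": 0.281, "Yahoo Sports": 0.400,
}

# Hypothetical season-point projections for one player (made-up numbers).
projections = {
    "CBS Average": 255.0, "ESPN": 240.0, "FantasyData": 262.0,
    "FantasySharks": 248.0, "FFToday": 251.0, "NFL.com": 244.0,
    "WalterFootball": 235.0, "Yahoo Sports": 258.0,
}

def weighted_average(projections, weights):
    """Weight each source's projection by its historical accuracy,
    normalizing by the total weight of the sources present."""
    total = sum(weights[s] for s in projections)
    return sum(p * weights[s] for s, p in projections.items()) / total

print(round(weighted_average(projections, weights), 1))  # 250.1
```

Normalizing by the total weight of the available sources means a player still gets a sensible consensus even when some sources don't project him.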

Here is a scatterplot of our average projections in relation to players’ actual fantasy points scored in 2016:

## Interesting Observations

- Projections that combined multiple sources of projections (FFA Average, Weighted Average, Robust Average) were more accurate than all single sources of projections (e.g., CBS, NFL.com, ESPN) every year. This is consistent with the wisdom of the crowd.
- FFA projections were more accurate than projections from FantasyPros. This may be because we include more sources of projections.
- The simple average (mean) was more accurate than the robust average. The robust average gives extreme values less weight in the calculation of the average. This suggests that outliers may reflect meaningful sources of variance (i.e., they may help capture a player’s ceiling/floor) and may not just be bad projections (i.e., error/noise).
- The weighted average was equally accurate compared to the simple average. Weights were based on historical accuracy. If the best analysts are consistently more accurate than other analysts, the weighted average will likely outperform the mean. If, on the other hand, analysts don’t reliably outperform each other, the mean might be as or more accurate. Given the mean and weighted average were equally accurate each year, the evidence suggests that analysts don’t consistently outperform (or underperform) each other.
- The FFA Average explained 57–67% of the variation in players’ actual performance. That means that the projections are somewhat accurate but have much room for improvement in terms of prediction accuracy. 1/3 to 1/2 of the variance in actual points is unexplained by projections. Nevertheless, the projections are likely more accurate than pre-season rankings.
- The R-squared of the FFA average projection was .67 in 2012, .57 in 2013, .62 in 2014, .63 in 2015, and .65 in 2016. This suggests that players are more predictable in some years than others.
- There was little consistency in performance across time among sites with single sources of projections (CBS, NFL.com, ESPN). In 2012, CBS was the most accurate single source of projections, but they were the least accurate in 2013. Moreover, ESPN was among the least accurate in 2014 but among the most accurate in 2015. This suggests that no single source reliably outperforms the others. While some sites may do better than others in any given year (because of fairly random variability, i.e., chance), it is unlikely that they will continue to outperform the other sites.
- Projections were more accurate for some positions than others. Projections were much more accurate for QBs and WRs than for RBs. Projections were the least accurate for Ks, DBs, and DSTs. For more info, see here. Here is how positions ranked in accuracy of their projections (from most to least accurate):
- QB: R^2 = .73
- WR: R^2 = .57
- TE: R^2 = .55
- LB: R^2 = .53
- RB: R^2 = .48
- DL: R^2 = .45
- K: R^2 = .38
- DB: R^2 = .37
- DST: R^2 = .24

- Projections overestimated players’ performance by about 4–10 points per year across most positions (based on mean error). It will be interesting to see whether this pattern holds in future seasons. If it does, we could correct for this overestimation in players’ projections. In a future post, I hope to explore the types of players for whom this overestimation occurs.
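The article doesn't specify which robust estimator the Robust Average uses, but a trimmed mean is one common choice and illustrates the trade-off in the observation above: a robust average mutes extreme projections, which helps if they are noise but hurts if they carry real ceiling/floor information. This sketch uses made-up numbers.

```python
def simple_average(values):
    """Plain mean of all source projections."""
    return sum(values) / len(values)

def trimmed_mean(values, trim=0.2):
    """A stand-in robust average: drop the lowest and highest `trim`
    fraction of projections before averaging, down-weighting outliers."""
    v = sorted(values)
    k = int(len(v) * trim)
    kept = v[k:len(v) - k] if k else v
    return sum(kept) / len(kept)

# Five hypothetical projections for one player; 310 is an outlier that
# might reflect a real "ceiling" scenario rather than noise.
points = [244.0, 248.0, 251.0, 255.0, 310.0]

print(round(simple_average(points), 1))  # 261.6
print(round(trimmed_mean(points), 1))    # 251.3
```

The robust estimate ignores the 310-point outlier entirely; if that outlier encoded genuine upside, the simple mean retains some of that signal, which is consistent with the simple average being slightly more accurate in our results.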

## Conclusion

Fantasy Football Analytics had the most accurate projections over the last five years. Why? We average across sources. Combining sources of projections removes some of their individual judgment biases (error) and gives us a more accurate fantasy projection. No single source (CBS, NFL.com, ESPN) reliably outperformed the others or the crowd, suggesting that differences between them are likely due in large part to chance. In sum, crowd projections are more accurate than individuals’ judgments for fantasy football projections. People often like to “go with their gut” when picking players. That’s fine—fantasy football is a game. Do what is fun for you. But, crowd projections are the most reliably accurate of any source. Do with that what you will! But don’t take my word for it. Examine the accuracy yourself with our Projections tool and see what you find. And let us know if you find something interesting!

## Comments

Yahoo is the same as Pro Football Focus, right?

Yes, Yahoo uses the same projections as PFF.

Isaac, what is your email address? I would like to send you a comment rather than posting here. Is that possible?

You’re using the FantasyPros composite, right? Can you list which composite of theirs you are using? If you’re double-using some of the other sources in this composite, that introduces issues of multicollinearity, right?

We compared our projections to the FantasyPros composite. We did not include FantasyPros projections in our composite because of the issue of double-counting. For more info, see here:

http://fantasyfootballanalytics.net/2014/06/custom-rankings-and-projections-for-your-league.html

trying out your projection tool 🙂

Isaac, since I have not heard from you regarding an email address, I will now make the comment I wanted to make via an email to you. While I remain a fan, I would have to say that my belief in your data took a fall this past year, as my draft choices based on your projections did not meet my expectations. Perhaps I placed your data on too high a pedestal, given that there is only a low-60s percentage of predictiveness in your or others’ projections. FYI

Hi Brian,

As we discuss in the article, there is considerable room for improvement in projection accuracy. About one-third to one-half of the variance in players’ performance is unexplained by projections. Predicting complex things like behavior (athletic performance) is hard because behavior is over-determined (https://en.wikipedia.org/wiki/Overdetermination). We take a Wisdom of the Crowd approach, with the consequence that our data are only as good as that of our sources. Unlike other sites, we are also transparent about our (in)accuracy. First, we provide estimates of prediction error in the form of a floor and ceiling to give people a sense of the (un)certainty of our estimates. Second, we calculate our historical accuracy with multiple advanced statistical metrics so that users can examine our accuracy and compare it to others’. In any case, we’ve shown that our projections are more accurate than individual sources and other sources that aggregate projections. We hope our projections continue to improve; at the same time, the evidence suggests that these are the most accurate projections available.

Thanks for your considered reply!

I see that 4for4 was not included in the study. Any particular reason for that? I’ve used them for a few years now, and the past 2 seasons have been spotty imo. If there is a more accurate service, surely I’d like to try them out.

4for4 is a subscription service. That’s why we didn’t include them. We’ve already compared subscription sources of projections to free sources of projections. Subscription sources are not more accurate than free sources of projections, and may actually be less accurate:

http://fantasyfootballanalytics.net/2016/04/subscription-sources-accurate.html

Are certain positions on certain teams more apt to get hurt and miss time because of training techniques or training staff?

Couple questions:

1. When will the 2017 season projections be up? At least ESPN and Fantasy Sharks have projections up already.

2. Have you ever considered dropping a couple sources from your consensus to make a sort of “discerning consensus”? From my quick studies, it seems like including CBS and Walter Football in the consensus drops accuracy every year.

3. Will we be able to run the analysis only on “fantasy-relevant” players if we’d like? Could you possibly include an option to “include only above-baseline players,” so that only the top X players according to projections are used in the accuracy analysis? This would help me do analysis that fits most closely with what I do practically (looking at accuracy only among players I might legitimately draft).

Regarding 2), it would seem to me that unless you are considering CBS and Walter to be categorically INaccurate, you always want to include as many sources as possible, as they will tend to balance out outliers and reinforce consensuses. Even the “best” individual projections have their biases.

Yes, that is what I am saying. Given that they have a track record of diluting the consensus accuracy year in and year out, my guess is that they are categorically inaccurate.

We give users the flexibility to choose whatever weights they want to give each source.

Which I love! And I’ve been doing that…I’m just saying, my weighted consensus beats yours every year…so booya!

We are always looking for users to submit their projections (with whatever weights they want) to see if they are more accurate. Feel free to submit your projections this year, and we’ll see!

Hi Isaac – are you using the same n of players when calculating the metrics for all sources? I assume R^2 gets much higher the more players you include, since lower-scoring players would be easier to predict.

I would be very interested in seeing an analysis of accuracy for only the top 50 and top 100 players from each source’s draft board.

Hi Matt,

Yes, the accuracy depends on which positions and players are examined. Different sites provide projections for different players. We examine the accuracy of all available players for whom we can find projections. Nevertheless, you can examine our (and others’) accuracy for other positions using our tools:

http://fantasyfootballanalytics.net/2015/07/accuracy-of-fantasy-football-projections-interactive-scatterplot-in-r.html

We plan on building the functionality to allow users to limit the calculation of accuracy to the top players.

Hope that helps,

Isaac

Hi, when are you going to update your rankings?

By rankings, I mean projections for all players after the 2017 NFL draft.