Who Has the Best Fantasy Football Projections? 2017 Update
In prior posts, we demonstrated how to download projections from numerous sources, calculate custom projections for your league, and compare the accuracy of different sources of projections (2013, 2014, 2015, 2016). In the latest version of our annual series, we hold the forecasters accountable and see who had the most and least accurate fantasy football projections over the last 5 years.
The R Script
You can download the R script for comparing the projections from different sources here. You can download the historical projections and performance using our Projections tool.
To compare the accuracy of the projections, we use the following metrics:
- R-squared (R2) – higher is better
- Mean absolute scaled error (MASE) – lower is better
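For readers who want to check the math, here is a minimal sketch of how these two metrics can be computed. This is in Python for brevity (our scripts are in R), and the function names and the mean-of-actuals benchmark for MASE are illustrative assumptions, not necessarily the exact implementation in our script:

```python
import numpy as np

def r_squared(actual, projected):
    """Squared Pearson correlation between projected and actual points."""
    r = np.corrcoef(actual, projected)[0, 1]
    return r ** 2

def mase(actual, projected, benchmark=None):
    """Mean absolute scaled error: MAE of the projections divided by the
    MAE of a naive benchmark (here, hypothetically, the mean of actuals)."""
    actual = np.asarray(actual, dtype=float)
    projected = np.asarray(projected, dtype=float)
    if benchmark is None:
        benchmark = np.full_like(actual, actual.mean())
    mae = np.abs(actual - projected).mean()
    naive_mae = np.abs(actual - benchmark).mean()
    return mae / naive_mae
```

A MASE below 1 means the projections beat the naive benchmark; that is why lower is better, and why the table's values all fall between 0 and 1.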
Whose Predictions Were the Best?
| Source | 2012 R2 | 2012 MASE | 2013 R2 | 2013 MASE | 2014 R2 | 2014 MASE | 2015 R2 | 2015 MASE | 2016 R2 | 2016 MASE | Avg R2 | Avg MASE |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Fantasy Football Analytics: Average | .670 | .545 | .612 | .573 | .618 | .577 | .626 | .553 | .645 | .535 | .634 | .557 |
| Fantasy Football Analytics: Robust Average | .667 | .549 | .612 | .573 | .613 | .581 | .628 | .554 | .644 | .536 | .633 | .559 |
| Fantasy Football Analytics: Weighted Average | | | | | | | .626 | .553 | .645 | .535 | .636 | .544 |
| CBS Average | .637 | .604 | .479 | .722 | .575 | .632 | .500 | .664 | .559 | .625 | .550 | .649 |
| ESPN | .576 | .669 | .500 | .705 | .498 | .723 | .615 | .585 | .630 | .551 | .564 | .647 |
| FantasyData | | | | | | | | | .531 | .639 | .531 | .639 |
| FantasyFootballNerd | | | .370 | .785 | .281 | .767 | .501 | .641 | | | .384 | .731 |
| FantasyPros | | | | | .613 | .572 | .608 | .585 | .610 | .561 | .610 | .573 |
| FantasySharks | | | | | | | .529 | .673 | .606 | .592 | .568 | .633 |
| FFtoday | .661 | .551 | .550 | .646 | .530 | .659 | .546 | .626 | .574 | .618 | .572 | .620 |
| NFL.com | .551 | .650 | .505 | .709 | .518 | .692 | .582 | .632 | .605 | .584 | .552 | .653 |
| WalterFootball | | | | | .472 | .713 | .431 | .724 | .483 | .718 | .462 | .718 |
| Yahoo | | | | | .547 | .645 | .635 | .554 | .624 | .562 | .602 | .587 |
Here are the sources ranked from most to least accurate (based on average MASE across available years):
- Fantasy Football Analytics: Average (or Weighted Average)
- Fantasy Football Analytics: Robust Average
- FantasyPros
- Yahoo (ProFootballFocus)
- FFtoday
- FantasySharks
- FantasyData
- ESPN
- CBS
- NFL.com
- WalterFootball
- FantasyFootballNerd
Notes:
- CBS estimates were averaged across Jamey Eisenberg and Dave Richard.
- FantasyFootballNerd projections include only their free projections (not their full subscription projections).
- We did not calculate the weighted average prior to 2015.
- The accuracy estimates may differ slightly from those provided in prior years because a) we now use standard league scoring settings (you can see the league scoring settings we used here) and b) we are only examining the following positions: QB, RB, WR, and TE.
- The weights for the weighted average were based on historical accuracy (1 - MASE). For the analysts not included in the accuracy calculations, we calculated the average (1 - MASE) value and subtracted 1/2 the standard deviation of (1 - MASE).

The weights in the weighted average for 2016 were:
CBS Average: .344
ESPN: .329
FantasyData: .428
FantasySharks: .327
FFToday: .379
NFL.com: .329
WalterFootball: .281
Yahoo Sports: .400
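As a concrete illustration of how such weights can be applied, here is a short sketch using the 2016 weights listed above. The function names and the two-source example are hypothetical; the actual R script may combine sources differently:

```python
import statistics

# 2016 weights from the article; each reflects historical (1 - MASE)
weights = {
    "CBS Average": 0.344, "ESPN": 0.329, "FantasyData": 0.428,
    "FantasySharks": 0.327, "FFToday": 0.379, "NFL.com": 0.329,
    "WalterFootball": 0.281, "Yahoo Sports": 0.400,
}

def fallback_weight(existing):
    """Weight for a source with no accuracy history, per the note above:
    mean(1 - MASE) minus half the standard deviation of (1 - MASE)."""
    vals = list(existing.values())
    return statistics.mean(vals) - statistics.stdev(vals) / 2

def weighted_projection(projections, weights):
    """Weighted average of one player's projected points across sources.

    projections: dict mapping source name -> projected points.
    Weights are renormalized over whichever sources projected the player.
    """
    total = sum(weights[s] for s in projections)
    return sum(p * weights[s] for s, p in projections.items()) / total
```

For example, a player projected at 200 points by ESPN and 220 by Yahoo would land closer to 220, since Yahoo's historical accuracy earns it the larger weight.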
Here is a scatterplot of our average projections in relation to players’ actual fantasy points scored in 2016:
Interesting Observations
- Projections that combined multiple sources of projections (FFA Average, Weighted Average, Robust Average) were more accurate than all single sources of projections (e.g., CBS, NFL.com, ESPN) every year. This is consistent with the wisdom of the crowd.
- FFA projections were more accurate than projections from FantasyPros. This may be because we include more sources of projections.
- The simple average (mean) was more accurate than the robust average. The robust average gives extreme values less weight in the calculation of the average. This suggests that outliers may reflect meaningful sources of variance (i.e., they may help capture a player’s ceiling/floor) and may not just be bad projections (i.e., error/noise).
- The weighted average was equally accurate compared to the simple average. Weights were based on historical accuracy. If the best analysts are consistently more accurate than other analysts, the weighted average will likely outperform the mean. If, on the other hand, analysts don’t reliably outperform each other, the mean might be as or more accurate. Given the mean and weighted average were equally accurate each year, the evidence suggests that analysts don’t consistently outperform (or underperform) each other.
- The FFA Average explained 61–67% of the variation in players’ actual performance. That means the projections are somewhat accurate but have much room for improvement in terms of prediction accuracy: roughly one-third to two-fifths of the variance in actual points is unexplained by projections. Nevertheless, the projections are likely more accurate than pre-season rankings.
- The R-squared of the FFA average projection was .67 in 2012, .61 in 2013, .62 in 2014, .63 in 2015, and .65 in 2016. This suggests that players are more predictable in some years than others.
- There was little consistency in performance across time among sites that provide a single source of projections (CBS, NFL.com, ESPN). In 2012, CBS was the most accurate single source, but it was the least accurate in 2013. Moreover, ESPN was among the least accurate in 2014 but among the most accurate in 2015. This suggests that no single source reliably outperforms the others. While some sites may do better than others in any given year because of fairly random variability (i.e., chance), it is unlikely that they will continue to outperform the other sites.
- Projections were more accurate for some positions than others. Projections were much more accurate for QBs and WRs than for RBs. Projections were the least accurate for Ks, DBs, and DSTs. For more info, see here. Here is how positions ranked in accuracy of their projections (from most to least accurate):
- QB: R2 = .73
- WR: R2 = .57
- TE: R2 = .55
- LB: R2 = .53
- RB: R2 = .48
- DL: R2 = .45
- K: R2 = .38
- DB: R2 = .37
- DST: R2 = .24
- Projections over-estimated players’ performance by about 4–10 points every year across most positions (based on mean error). It will be interesting to see if this pattern holds in future seasons. If it does, we could account for this over-expectation in players’ projections. In a future post, I hope to explore the types of players for whom this over-expectation occurs.
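The averaging schemes compared in these observations can be sketched as follows. This is a minimal illustration; the trimmed mean shown here is a stand-in for the site's actual robust estimator, which may be a different statistic:

```python
import statistics

def simple_average(points):
    """Plain mean across sources; keeps every source's full signal."""
    return statistics.mean(points)

def robust_average(points, trim=1):
    """Down-weights extremes by dropping the lowest and highest
    projections before averaging. (Illustrative stand-in only; the
    site's robust estimator may differ.)"""
    s = sorted(points)
    if len(s) > 2 * trim:
        s = s[trim : len(s) - trim]
    return statistics.mean(s)

# One source projects a breakout (30 pts) that the others do not see:
projections = [8, 10, 11, 12, 30]
```

On this example the simple average (14.2) retains some of the outlier's signal while the robust average (11.0) discards it. If outliers carry real information about a player's ceiling or floor, that discarding can cost accuracy, which is consistent with the simple average edging out the robust average above.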
Conclusion
Fantasy Football Analytics had the most accurate projections over the last five years. Why? We average across sources. Combining sources of projections removes some of their individual judgment biases (error) and gives us a more accurate fantasy projection. No single source (CBS, NFL.com, ESPN) reliably outperformed the others or the crowd, suggesting that differences between them are likely due in large part to chance. In sum, crowd projections are more accurate than individuals’ judgments for fantasy football projections. People often like to “go with their gut” when picking players. That’s fine—fantasy football is a game. Do what is fun for you. But, crowd projections are the most reliably accurate of any source. Do with that what you will! But don’t take my word for it. Examine the accuracy yourself with our Projections tool and see what you find. And let us know if you find something interesting!
Yahoo is the same as Pro Football Focus, right?
Yes, Yahoo uses the same projections as PFF.
Isaac what is your email address as I would like to send you a comment rather than posting here? Is that possible?
You’re using the FantasyPros composite, right? Can you list which composite of theirs you are using? If you’re double-using some of the other sources in this composite, that introduces issues of multicollinearity, right?
We compared our projections to the FantasyPros composite. We did not include FantasyPros projections in our composite because of the issue of double-counting. For more info, see here:
https://fantasyfootballanalytics.net/2014/06/custom-rankings-and-projections-for-your-league.html
trying out your projection tool 🙂
Isaac, since I have not heard from you regarding an email address, I will now make the comment I had wanted to send via email. While I remain a fan, I have to say that my belief in your data took a fall this past year, as my draft choices based on your projections did not meet my expectations. Perhaps I placed your data on too high a pedestal, given that there is only a low-60s percentage of predictiveness in yours or others’ projections. FYI
Hi Brian,
As we discuss in the article, there is considerable room for improvement in projection accuracy: about one-third to two-fifths of the variance in players’ performance is unexplained by projections. Predicting complex things like behavior (athletic performance) is hard because behavior is over-determined (https://en.wikipedia.org/wiki/Overdetermination). We take a Wisdom of the Crowd approach, with the consequence that our data are only as good as that of our sources. Unlike other sites, however, we are transparent about our approach and calculate our (in)accuracy. First, we provide estimates of prediction error in the form of a floor and ceiling to give people a sense of the (un)certainty of our estimates. Second, we calculate our historical accuracy with multiple advanced statistical metrics so users can examine our accuracy and compare it to others’. In any case, we’ve shown that our projections are more accurate than individual sources and other sources that aggregate projections. We hope our projections continue to improve in the future; at the same time, the evidence suggests that these are the most accurate projections available.
Thanks for your considered reply!
I see that 4for4 was not included in the study. Any particular reason for that? I’ve used them for a few years now, and the past 2 seasons have been spotty imo. If there is a more accurate service, surely I’d like to try them out.
4for4 is a subscription service. That’s why we didn’t include them. We’ve already compared subscription sources of projections to free sources of projections. Subscription sources are not more accurate than free sources of projections, and may actually be less accurate:
https://fantasyfootballanalytics.net/2016/04/subscription-sources-accurate.html
Are certain positions on certain teams more apt to get hurt and miss time because of training techniques or training staff?
Couple questions:
1. When will the 2017 season projections be up? At least ESPN and Fantasy Sharks have projections up already.
2. Have you ever considered dropping a couple sources from your consensus to make a sort of “discerning consensus”? From my quick studies, it seems like including CBS and Walter Football in the consensus drops accuracy every year.
3. Will we be able to do analysis only on “Fantasy relevant” players if we like to? Could you possibly include a selection for “include only above-baseline players” so that only the top X players according to projections are selected for the accuracy analysis? This would help me to do analysis that actually fits most closely with what I do practically (looking for accuracy only among those players who I might legitimately draft)
Regarding 2), it would seem to me that unless you are considering CBS and Walter to be categorically INaccurate, you always want to include as many sources as possible, as they will tend to balance out outliers and reinforce consensuses. Even the “best” individual projections have their biases.
Yes, that is what I am saying. Given that they have a track record of diluting the consensus accuracy year in and year out, my guess is that they are categorically inaccurate.
We give users the flexibility to choose whatever weights they want to give each source.
Which I love! And I’ve been doing that…I’m just saying, my weighted consensus beats yours every year…so booya!
We are always looking for users to submit their projections (with whatever weights they want) to see if they are more accurate. Feel free to submit your projections this year, and we’ll see!
Hi Isaac – are you using the same n of players when calculating the metrics for all sources? I assume r2 gets much higher the more players you include – since lower scoring players would be easier to predict.
I would be very interested in seeing an analysis of accuracy for only the top 50 and top 100 players from each source’s draft board.
Hi Matt,
Yes, the accuracy depends on which positions and players are examined. Different sites provide projections for different players. We examine the accuracy of all available players for whom we can find projections. Nevertheless, you can examine our (and others’) accuracy for other positions using our tools:
https://fantasyfootballanalytics.net/2015/07/accuracy-of-fantasy-football-projections-interactive-scatterplot-in-r.html
We plan on building the functionality to allow users to limit the calculation of accuracy to the top players.
Hope that helps,
Isaac
Hi, when are you going to update your rankings?
By rankings, I mean projections for all players post 2017.
All players post draft 2017
Seconded! Drafts coming up, valuable resource in the past
Just wanted to say thanks for this site. Super useful for my research and visuals. Helped me win best overall record and most points scored in regular season the last 2 years. Very useful to help me ID undervalued players in my auction draft.
I think the R2 and MASE values for FFA average in 2013 might be flipped.
Fixed, thanks!
Hi Isaac,
Great stuff as usual. I wondered if there was a reason why Numberfire fell off the list of sites?
Reason being that I generally am a big fan of their content and processes for analysis (both site and podcasts) and I was surprised when you did one of these a couple of years ago and noted that they had some of the worst projections. They were quite a new site at the time, so I was interested in seeing if they were improving over time.
If I recall, we had trouble scraping them because they were behind a login wall. If someone can figure out how to scrape their projections, we’d absolutely be willing to include them!
Thanks to this site. I am now a football fanatic. I was originally doing this to learn R.
Almost won the championship last year too; landed 2nd place with 12th pick of 12.
Will you scrape ESPN’s IDP projections this year?
Why are players missing? Example: Leonard Fournette
Here are some of the missing QBs, RBs, and WRs:
QBs missing: *Derek Carr, *Marcus Mariota, A.J. McCarron, *Deshaun Watson
RBs missing: *Fournette, *McCaffrey, Pumphrey, Damien Williams
WRs missing: *C. Brown, *M. Bryant, C. Davis, M. Goodwin, *M. Thomas, D. Westbrook, T. Williams
I imagine as the season approaches you’ll update the site more often, but until then do you have a game plan on how often you will be updating it? It seems that it hasn’t been updated since early June.
Thanks Isaac
We have updated as of today!
Thank you Val. However, can you give me some idea as to what your game plan will be going forward as it pertains to updating these rankings? Again, it has been 3+ weeks…
Will you be doing it weekly, bi-weekly, daily? Starting when? Thank you!
In a recent article on FantasyPros, it looks like 4for4 has the most accurate ranks, yet you don’t use them in this comparison?
Those are subscription projections, so we can’t make them available to the public. We tested subscription sources of projections and actually found them to be *less* accurate than free sources of projections:
https://fantasyfootballanalytics.net/2016/04/subscription-sources-accurate.html
What is the cadence for scraping updated projections?
Isaac – Do you plan on updating your average projections for 2017 any time soon?
We are adding a script to scrape the projections daily!
If I sent you an email about the fantasy projections from the 4for4 website, could you look at them and factor them into your projections for the 2017 season?
Hey Vinit, sure, send them along to admin@fantasyfootballanalytics.net and we can take a look.
Why haven’t you used RotoGrinders or Numberfire in your projections?
Fantasy Index provides a custom scoring system. Have you ever looked at comparing the accuracy of its projections?
What is the link to their publicly available projections?
Hi Isaac, the projection tool doesn’t save my league and scoring settings beyond the session in which I’m using it, e.g., I have to reconfigure the tool every time I log in. Is this by design or a bug? For troubleshooting purposes: yes, I am logged in. Yes, I’ve tried logging out and logging back in. Yes, I’m a paid subscriber.
Did you select the scoring settings you saved under the “League Settings” page (“League Scoring” option)?
How do you determine the weights you use for each source?
The article above explains that (they’re based on historical accuracy).
Working under the assumption you used the standard R^2 calculation, I’m wondering — have you ever considered using a variant on that? I’d be interested in the outcome of a weighted correlation (I use the wCorr package) and/or a Spearman correlation in place of the Pearson.
We also calculate MASE (see above).
Under our specific league settings, the 49ers defense and the Redskins defense are coming in at picks 29 and 33, respectively. What am I doing wrong?
adjust your VOR
https://fantasyfootballanalytics.net/about-the-site/faq#incorrectProjection
Thanks for the site! I found it 2 years ago when I first started playing and I’ve never finished lower than 3rd using your projections.
I have noticed that the projections for the upcoming year seem to strongly reflect last year’s performance. This makes some sense because previous years’ data is all we have to go on when formulating projections. But, it makes me curious how accurate last year’s stats are as a predictor of this year’s stats. Have you computed R^2 and MASE using e.g. the 2015 stats as a source of prediction for 2016 performance? I would be interested to see how a “Previous Year Performance” source stacks up against the other sources you evaluate.
I’ve noticed a glitch in your current projections. Kenneth Dixon is currently 3rd in VOR, Ryan Tannehill is 5th and other injured and cut players are in the top 20.
Are these seasonal or weekly? From context I’m guessing seasonal. I’d be interested in seeing an analysis of weekly projections.
Also, I would be interested in seeing two things:
1. An analysis/discussion of which sources (free and subscription) are responsive to game-day updates, e.g., find some examples of late-breaking depth chart changes and see which sites responded. (There was an infamous-in-DFS-circles “Rawls week” in 2015, week 11 I think, when Marshawn Lynch’s injury was reported just before lock; getting Rawls into your lineups was key.) For weekly projections this will be a major source of error, and it is important for DFS players when choosing a projection provider.
2. An analysis/discussion of which projection providers include any kind of uncertainty metric (e.g., Numberfire’s CI, 4for4’s floor/ceiling), and a review of which of these is most “accurate”. For my DFS methodology, I use probability distributions that I construct from these metrics in various ways. I am not interested in metrics based on the distribution of aggregated projections (as your floors/ceilings are), but rather the actual outcome distribution.
Obviously, this is personally motivated by my own methodology, which requires some kind of uncertainty metric and is updated ideally until the last game starts. Slight differences in season-long accuracy are not very interesting to me. I understand that this is your bread and butter, but getting into more advanced topics like creating accurate probability distributions (even joint probability distributions, another element of my methodology, though no one provides that, so I use a copula to build it myself) would really make you stand out. These are important issues for mathematically inclined DFS players and no one discusses them; instead, the fantasy analytics community rehashes the same stale points about wisdom of the crowds and small improvements in accuracy. It’s time for something new, and I’m posting here because I believe you are one of the few sites that could pull it off.
I want to add, with regard to #1 above, that I think a big advantage for subscription projection sources is that they put in the effort to tweak projections by hand weekly in response to depth chart changes. While this is something that certainly could be automated in a model, I’m inclined to doubt that anyone does it that way.
There are other elements that some weekly projectors adjust by hand rather than model, I’m sure, and I imagine some of the paid sources put more effort into this. These kinds of tweaks are not going to show up in season-long accuracy, however.
The above analysis is for seasonal projections. You can examine the accuracy of seasonal and weekly projections in our webapps with many different accuracy metrics:
https://fantasyfootballanalytics.net/2015/07/accuracy-of-fantasy-football-projections-interactive-scatterplot-in-r.html
Or you can download our historical data and analyze them however you’d like:
https://fantasyfootballanalytics.net/2016/06/download-projections.html
Just out of curiosity: Have you checked to see if projections are more or less accurate in different parts of the range (e.g., top 50 or so players vs. players ranked below 100)? I’m wondering if the potentially more consistent playing time / playing opportunities of the former group relative to the latter may result in differences in the ability to predict performance in different parts of the range.
Hi Isaac,
First off, thank you. Your site and apps are awesome. Have you gotten any users to submit weights for each source? I struggle each year with choosing the best weighting methodology. I plan to just use the simple weights this year, but I’m wondering if any users have submitted a weighting methodology that improves the R2 or MASE values beyond the simple and weighted averages above. Thanks so much, and I really appreciate your site and info. Best out there for sure!
I’ve noticed Yahoo and Fantasy Sharks aren’t included in the weightings for week 3 projections yet. I noticed the same thing for week 2 (last week, Friday), however pulling up week 2 weightings today, Yahoo and Fantasy Sharks are both included. Is this a timing thing for week 3? i.e., will Yahoo and Fantasy Sharks projections be included in the weightings closer to Sunday?
2018 projections for the web app, when are they coming out?
How come the list has not been updated since 2016? It would be nice to see the data if possible; thanks!
You can access this historical accuracy anytime in our webapps, see here:
https://fantasyfootballanalytics.net/2015/07/accuracy-of-fantasy-football-projections-interactive-scatterplot-in-r.html
How do you keep from going postal when people ask the same questions over and over?
Isaac! Are you doing the experiment for 2018? What caused you to exclude the Footballguys projections (Dodds & Henry)?
Isaac, do you have a new projections ranking as of late? Could you post for us to evaluate before/during draft season? Much appreciated!
Hi! Here’s how to examine the accuracy of projections:
https://fantasyfootballanalytics.net/2015/07/accuracy-of-fantasy-football-projections-interactive-scatterplot-in-r.html
Any chance we can get updated comparisons from the last couple of seasons?
As we continue to build and update our app, this is something we will be looking at developing. For now you can select past seasons in the app to view how players have performed historically.
I think you have the best fantasy football projections out there!