2015-16 Team win projections
Re: 2015-16 Team win projections
BR win projections now available. http://www.basketball-reference.com/fri ... f_prob.cgi
Re: 2015-16 Team win projections
So these win projections have an R^2 of 0.9986 with what I've been posting using the Kubatko method (update here: http://i.imgur.com/O1kZrSp.png), except that B-Ref's numbers are heavily regressed to the mean (which doesn't disrupt the correlation). The Warriors are a historically good team so far, yet B-Ref has them winning under 60 games, for instance.
This looks too heavily regressed to the mean based on what I've seen. I add 9.8 dummy games to each team's SRS to get their rating going forward (essentially adding 9.8 games of league average performance). That's the best fit based on the last 20 years of data. B-Ref is essentially adding 17.4 dummy games here, resulting in 15 Pyth + Regression once again kicking everyone's butts.
Of course, maybe I'm just defensive since the other method has me leading right now.
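The dummy-game shrinkage described above is easy to sketch. The functional form below (observed SRS scaled by G/(G + k), i.e. padding with k games of league-average, zero-SRS play) is my assumption of what the method means, not the poster's actual code:

```python
# Sketch of dummy-game regression to the mean (assumed form):
# pad a team's observed SRS with k games of league-average (0.0) play.

def regressed_srs(srs: float, games_played: float, dummy_games: float = 9.8) -> float:
    """Shrink an observed SRS toward zero by adding dummy league-average games."""
    return srs * games_played / (games_played + dummy_games)

# A team at +12.0 SRS through 12 games:
print(round(regressed_srs(12.0, 12, 9.8), 2))   # ~6.61 with 9.8 dummy games
print(round(regressed_srs(12.0, 12, 17.4), 2))  # ~4.90 with 17.4 dummy games
```

The gap between those two outputs is the whole disagreement in the thread: 17.4 dummy games pulls an early-season outlier much harder toward average than 9.8 does.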

Re: 2015-16 Team win projections
Maybe it is not much different, but maybe one chart using both would be good. I think several references were used last season.
My understanding is that the BR projection simulates the entire season game by game 7,500 times. Does your method simulate the season game by game? If yes, is it once or more?
Re: 2015-16 Team win projections
kmedved wrote:So these win projections have an R^2 of 0.9986 with what I've been posting, using the Kubatko method (update here: http://i.imgur.com/O1kZrSp.png). Except, B-Ref's numbers are super regressed to the mean (which doesn't disrupt the correlation). The Warriors are a historically good team so far, and are going to win under 60 games for instance.
This looks too heavily regressed to the mean based on what I've seen. I add 9.8 dummy games to each team's SRS to get their rating going forward (essentially adding 9.8 games of league average performance). That's the best fit based on the last 20 years of data. B-Ref is essentially adding 17.4 dummy games here, resulting in 15 Pyth + Regression once again kicking everyone's butts.
Of course, maybe I'm just defensive since the other method has me leading right now.

Agreed, yours "looks" better to me; I don't agree with the super-regressed projections. Seems unrealistic.
Re: 2015-16 Team win projections
Here's a link to another projection.
Don't know how it's done but it looks the best that I've seen so far.
https://www.teamrankings.com/nba/projections/standings/
Re: 2015-16 Team win projections
I added the B-Ref ratings and the TeamRankings ratings to the list. RMSE is now based on the average of the 3.


Crow wrote:My understanding is that the BR projection simulates the entire season game by game 7,500 times. Does your method simulate the season game by game? If yes, is it once or more?

I don't do any simulations. Just generate a win probability for each game, and then sum them up.
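For what it's worth, the no-simulation approach works because of linearity of expectation: expected wins are just the sum of per-game win probabilities, no Monte Carlo needed. The logistic rating-to-probability conversion below is a placeholder assumption for illustration, not the poster's actual formula:

```python
# Minimal sketch: season win projection as a sum of per-game win
# probabilities. The logistic mapping and its 0.4 scale are assumptions.
import math

def win_prob(rating_diff: float) -> float:
    """Win probability from a point-spread-style rating edge."""
    return 1.0 / (1.0 + math.exp(-rating_diff * 0.4))

schedule = [3.0, -1.5, 7.0, 0.5]  # hypothetical rating edges, one per game
expected_wins = sum(win_prob(d) for d in schedule)
print(round(expected_wins, 2))
```

A simulation would converge to this same number as the trial count grows, which is why summing probabilities directly is both faster and noise-free.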
Re: 2015-16 Team win projections
Thanks for the reply and scoresheet enrichment.
Re: 2015-16 Team win projections
kmedved wrote:So these win projections have an R^2 of 0.9986 with what I've been posting, using the Kubatko method (update here: http://i.imgur.com/O1kZrSp.png). Except, B-Ref's numbers are super regressed to the mean (which doesn't disrupt the correlation). The Warriors are a historically good team so far, and are going to win under 60 games for instance.
This looks too heavily regressed to the mean based on what I've seen. I add 9.8 dummy games to each team's SRS to get their rating going forward (essentially adding 9.8 games of league average performance). That's the best fit based on the last 20 years of data. B-Ref is essentially adding 17.4 dummy games here, resulting in 15 Pyth + Regression once again kicking everyone's butts.
Of course, maybe I'm just defensive since the other method has me leading right now.

I looked at all the teams in the 3-point era with a point differential of at least +12 per game through their first 12 games (the Warriors are currently at +15 MOV and +12 SRS). 15 teams matched that criteria. Results:
First 12 games:
10.8 average wins (.900 win%) (Teams ranged from 9 wins to 12 wins)
+13.9 Point Differential
Next 70 games:
57.8 win pace
+6.19 Point Differential
Full season stats:
60.2 wins per 82 games
6.90 Average SRS
13 of the 15 teams finished with 55+ wins, and 11 of 15 with 60+. Two finished with under 50 wins: the '83 Sonics with 48, and the lockout-shortened 2011-12 Sixers with 35 (44 prorated to 82 games). Three title teams are in that mix. The Celtics or Lakers did it four times in the '80s, but in every one of those instances the other team won the title.
The 15 teams:
1996-97 CHI
2002-03 DAL
2011-12 PHI
1993-94 SEA
1985-86 LAL
1984-85 BOS
1980-81 PHO
2008-09 LAL
1990-91 POR
1979-80 BOS
1997-98 LAL
2007-08 BOS
1981-82 BOS
2000-01 SAC
1982-83 SEA
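As a sanity check, the splits reported above hang together arithmetically: the full-season figure follows from the first-12 average plus the next-70 pace. This sketch just recombines the two numbers:

```python
# Consistency check on the reported splits for the 15 hot-start teams.
first12_wins = 10.8   # average wins over the first 12 games (.900 win%)
next70_pace = 57.8    # wins per 82 games over the remaining 70

next70_wins = next70_pace / 82 * 70       # ~49.3 actual wins in those 70 games
full_season = first12_wins + next70_wins  # ~60.1, matching the reported 60.2
print(round(full_season, 1))
```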
Re: 2015-16 Team win projections
colts18 wrote:..
First 12 games:
10.8 average wins (.900 win%) (Teams ranged from 9 wins to 12 wins)
+13.9 Point Differential
Next 70 games:
57.8 win pace
+6.19 Point Differential
Full season stats:
60.2 wins per 82 games
6.90 Average SRS

So basically, we expect more than 50% regression to .500 from the first 12 games to the next 70.
And over the full 82 games, those first-12 ratings end up just about exactly 50% regressed (6.9/13.9 ≈ 0.50).
Excellent work and reporting!
Re: 2015-16 Team win projections
Crow wrote:BR win projections now available. http://www.basketball-reference.com/fri ... f_prob.cgi

At last. I'd been badgering them about this.
Using unsquared errors from the b-r.com projections, we get quite a different race:
Code: Select all
tzu 6.0 yoop 7.1 taco 7.6
KF 6.5 bbs 7.1 BD 7.6
km 6.7 snd 7.1 nr 7.9
Cal 6.8 itca 7.2 EZ 8.0
Crow 7.0 AJ 7.2 rsm 8.6
DSM 7.0 fpli 7.3 DrP 8.8
DF 7.0 MG 7.3 Dan 9.0
'15 Pyth again last at 9.7
The projection has both East and West winning 50% of 1230 games.
Re: 2015-16 Team win projections
Added unsquared error here.


Re: 2015-16 Team win projections
Mike G wrote:At last. I'd been badgering them about this.
Using unsquared errors from the b-r.com projections, we get quite a different race:
Code: Select all
tzu 6.0 yoop 7.1 taco 7.6
KF 6.5 bbs 7.1 BD 7.6
km 6.7 snd 7.1 nr 7.9
Cal 6.8 itca 7.2 EZ 8.0
Crow 7.0 AJ 7.2 rsm 8.6
DSM 7.0 fpli 7.3 DrP 8.8
DF 7.0 MG 7.3 Dan 9.0
'15 Pyth again last at 9.7
The projection has both East and West winning 50% of 1230 games.

hey hey, not bad! Tarrazu with the commanding lead.
Re: 2015-16 Team win projections
Semi-daily update:


Re: 2015-16 Team win projections
Just looking at the differences in RMSE between our projections vs. '15 Pyth + Regression and our projections vs. bbrf.com's projections. Don't the bbrf projections use Pyth win% too? Scoring against those projections gave every one of us a smaller RMSE. I'm no math or RAPM guru, so my simple question is: why?
The Bearded Geek
Re: 2015-16 Team win projections
Why do we prefer squaring the errors to begin with?
A hypothetical -- For simplicity, suppose we're predicting a league of 8 teams:
Predictor AB has 7 teams exactly right: errors 0, and squared errors are 0. On the 8th team, he's off by 20.
YZ has missed those same 7 teams by 5, and the 8th he misses by 15.
On each of those 7 teams, YZ has a squared error of 25, vs a total of zero for AB. This gives AB an advantage of 175.
On that 8th team, the difference in squared errors is (400-225) also 175.
Despite AB being (equally) better on 7 of the 8 predictions, both predictors get the same RMSE: sqrt(400/8) = 7.07.
In (unsquared) average absolute error, it's 2.50 vs 6.25.
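The hypothetical above is easy to verify numerically: the two error sets have identical RMSE but very different mean absolute error.

```python
# AB nails 7 of 8 teams and misses one by 20; YZ misses all 8 moderately.
import math

errors_AB = [0, 0, 0, 0, 0, 0, 0, 20]
errors_YZ = [5, 5, 5, 5, 5, 5, 5, 15]

def rmse(errs):
    return math.sqrt(sum(e * e for e in errs) / len(errs))

def mae(errs):
    return sum(abs(e) for e in errs) / len(errs)

print(round(rmse(errors_AB), 2), round(rmse(errors_YZ), 2))  # 7.07 7.07
print(mae(errors_AB), mae(errors_YZ))                        # 2.5 6.25
```

Which is the point of the question: RMSE punishes the single big miss so hard that it erases AB's edge on the other seven teams, while unsquared error preserves it.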