2014 Sloan Research Papers Posted
Posted: Wed Feb 05, 2014 4:44 am
by steveshea
The 2014 finalists (or most of them) for the Sloan Sports Analytics Conference Research Paper Competition are now posted on Sloan's website:
http://www.sloansportsconference.com/?page_id=462
There are a few basketball papers, covering topics such as the hot hand and machine recognition of on-ball screens.
Re: 2014 Sloan Research Papers Posted
Posted: Wed Feb 05, 2014 5:08 am
by mtamada
If that's the final list, it's interestingly lopsided: of the 8 papers currently listed, 4 are about basketball, 3 about baseball, and 1 seems to be about soccer/football (for some reason I can't download the files so I'm not able to actually read any of the papers). No football, hockey, track, auto racing, skiing, etc.
The new video data has created a new frontier for research into NBA data, so perhaps there is more pathbreaking research being done in basketball these days? Two, and perhaps three, of the basketball papers appear to use video data.
Re: 2014 Sloan Research Papers Posted
Posted: Wed Feb 05, 2014 5:18 am
by Dr Positivity
I've only read the hot hand one so far, but it's basically what I feel as well. The "hot hand effect" exists in the sense that players believe in it: when a player hits a few in a row, both the offensive player's and the defender's behaviour shifts. But because of that, it's impossible to tell from the stats whether a player can actually get hot, e.g. have a surge of confidence. Since both offensive and defensive behaviour have shifted, the output is statistically tainted. For example, a player may have extra confidence from getting hot (effect A), but the defender may be playing him tighter (effect B). If effect B outweighs effect A, the offensive player is now more likely to miss despite being mentally "more hot". The net statistical result says the hot hand doesn't exist, but the reality may be that the hot hand existed and the defender blew a fire extinguisher on it too quickly for it to show up in the stat sheet.
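That offsetting-adjustments argument can be sketched with a quick Monte Carlo simulation. This is a toy model I made up, not anything from the paper: the shooter gets a genuine hot-hand boost after two straight makes, the defense cancels exactly that much, and the conditional FG% in the "stat sheet" comes out flat.

```python
import random

def simulate(n_shots, base=0.45, boost=0.08, defense=0.08, seed=0):
    """Shooter with a *real* hot hand (probability +boost after two straight
    makes) whose defender fully cancels it (-defense when he looks hot).
    Returns (FG% after 2+ straight makes, FG% otherwise)."""
    rng = random.Random(seed)
    hot_makes = hot_shots = other_makes = other_shots = 0
    streak = 0
    for _ in range(n_shots):
        hot = streak >= 2
        p = base + (boost - defense) if hot else base
        made = rng.random() < p
        if hot:
            hot_shots += 1
            hot_makes += made
        else:
            other_shots += 1
            other_makes += made
        streak = streak + 1 if made else 0
    return hot_makes / hot_shots, other_makes / other_shots

hot_fg, cold_fg = simulate(200_000)
print(round(hot_fg, 3), round(cold_fg, 3))  # roughly equal: effect A masked by effect B
```

With boost and defense set equal, no conditional-probability test on makes and misses alone can see the hot hand, even though it is real in the model.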
What I find interesting is that the hot hand debate's conclusion is usually "it's a trick of the mind; a bunch of shots go in by statistical chance and we assume the player is hot". But I wonder how they'd explain OJ Mayo shooting over 50% from 3 for a month and a half to start last year. That's too many games and too many hot shots in a row to be just chance. And there have been other examples, of course: the outlier Aaron Brooks and Mike James seasons, even Arron Afflalo and Paul George this year, who've cooled down a ton in 2014. George, fascinatingly, had an uglier shot distribution than last season even while he was putting up superstar offensive stats; his ratio of midrange jumpers to shots at the restricted area had gone in the wrong direction. He was simply Durant/Dirk-ing it as a perimeter shooter for about the first two months of the season. Now that he is not as hot, with that level of midrange jacking, his efficiency has gotten quite ugly in January and February. There are smaller examples too, like Michael Beasley's "breakout" in his first year in Minnesota, where for a few weeks he was dropping 25-point games with ease, largely on red-hot midrange shooting. Or DeMar DeRozan's five-game stretch this December where he averaged 4.8 three-point attempts a game on 50% shooting; then the switch flipped off and he went back to being the sub-30% shooter he's been the rest of his career.
Ultimately there have been enough stretches where a player has been hot over extended games or weeks to make me doubt the claim that it can't happen. I presume a player's caliber of play can ebb and flow for physical and mental reasons. After all, this is a sport where a team waking up in its own beds and having fans cheer for it has a large statistical impact on games, so who knows about mental effects.
Re: 2014 Sloan Research Papers Posted
Posted: Wed Feb 05, 2014 3:42 pm
by Mike G
Do other sports have 'hot hand' studies? Golfers have good rounds and bad rounds, comprising a couple hundred strokes, which will be significantly better or worse than their norms. And there's no defense in golf -- you just hope your opponent misses more often.
Tennis players have droughts and hot streaks, and there's only so much a player can do about his opponent's play. In baseball, a hot hitter can be walked a lot, and that's about it. In football, a hot receiver can get double coverage, but this exposes the defense somewhere else.
In basketball, the immediate response to a player's heating up is to put a better defender on him, or 2-3 defenders situationally. If you don't, as a player or coach, you aren't doing your job.
An army of statistics-minded people have convinced themselves, via studies of their own design, that a hot hand in basketball cannot be detected. One person opted to treat Kobe's 9 consecutive 3-pointers as a single event, which allowed the foregone conclusion that his hot first half was just random.
Today, among the academic crowd, the Hot Hand is almost universally considered a “fallacy.”
Such a gulf between 'the academic crowd' and the real world would seem to be a disservice to real analytic advancement. It amazes me that anyone even feels the need to prove something so obvious, let alone that they then arrive at an impossible conclusion. I am glad this paper is different:
... our findings cast doubt on the overwhelming consensus that the Hot Hand is a fallacy. Everyone has good days and bad days. Why and how could basketball players be the one exception?
Re: 2014 Sloan Research Papers Posted
Posted: Wed Feb 05, 2014 4:59 pm
by xkonk
I think it's important to distinguish between feeling hot and shooting hot, which it sounds like this paper does. Obviously shooters, and people doing pretty much anything, will have good days and bad days. The questions at hand are (a) are they actually having a good day, or do they just feel good, and (b) is their day so good that it's statistically unusual? I don't think there's any question that people feel like they have good days; on the other hand, I've gone to the gym feeling great, with my shot feeling good, and then missed everything. But the research has naturally focused on question (b), and the evidence appears to be shaky across the range of studies that I know of.
I think part of the reason people find a 'real' hot hand so appealing is that there isn't much intuition about what sample size you need to distinguish probabilities from each other. You mention OJ Mayo, who's a career 38% three-point shooter. Six weeks of 50% shooting seems pretty hot. But he only took 84 threes in that span. There's about half a percent chance of him hitting exactly that many if he's a 38% shooter. That sounds low, but he could have also hit a couple more or a couple fewer and would still have looked like a hot shooter (so a streak like his had maybe a 1% chance of happening), and there were however many guys who had a chance to be that hot shooter (so maybe we had more like a 50% chance, probably higher, of seeing someone get that hot, and we just happened to see Mayo). So I think it's pretty reasonable to expect to see a 38% shooter go for 50% for a month or so.
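Those back-of-the-envelope numbers are easy to check with exact binomial arithmetic. The 84 attempts and 38% career rate are taken from the post above; the 100-player field is an arbitrary round number for the multiple-comparisons step.

```python
from math import comb

def binom_pmf(k, n, p):
    """Probability of exactly k makes in n attempts at hit rate p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binom_sf(k, n, p):
    """Probability of k or more makes in n attempts."""
    return sum(binom_pmf(i, n, p) for i in range(k, n + 1))

n, p = 84, 0.38   # Mayo's attempts over the stretch, at his career 3P%
k = 42            # 50% of 84

exactly = binom_pmf(k, n, p)   # the "about half a percent" figure
at_least = binom_sf(k, n, p)   # a stretch at least that hot

# If ~100 rotation players each had a shot at such a stretch,
# the chance that *somebody* runs that hot is much larger:
anybody = 1 - (1 - at_least) ** 100

print(exactly, at_least, anybody)
```

The last number makes the selection effect concrete: an individual streak is rare, but seeing some streak somewhere in the league is close to a coin flip or better.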
Re: 2014 Sloan Research Papers Posted
Posted: Wed Feb 05, 2014 7:41 pm
by steveshea
I recently blogged on the hot hand
http://www.basketballanalyticsbook.com/ ... hot-hands/
The blog also contains a link to my recent publication on the topic, so I won't say anything more here than that the new Sloan paper agrees with the results and theories laid out in that paper and blog.
I thought the Cervone et al. article was interesting. They define expected possession value. From the article
"Using player-tracking data, we develop a coherent, quantitative representation of a whole possession that summarizes each moment of the possession in terms of the number of points the offense is expected to score – a quantity we call expected possession value, or EPV (see Figure 1 for an illustration of EPV). We accomplish this by specifying and fitting a probabilistic model that encodes how ball handlers make decisions based on the spatial configuration of the players on the court. "
They have high hopes for the new metric:
"The clearest application of EPV is quantifying a player's overall offensive value, taking into account every action he has performed with the ball."
Is a player's offensive value restricted only to his actions with the ball in his hand? What about screens and decoys and...?
Unfortunately, they do not share all of the details of their model, and they share only a limited sample of outputs (a top-10 and bottom-10 list from 2012-13), so it's hard to fully interpret the value of the statistic. Chris Paul led the league last season; that seems reasonable. The bottom 10 included Paul George, Kevin Love, Russell Westbrook, Jrue Holiday and Roy Hibbert; that's confusing.
At this point, I think they have a great idea with loads of potential, but some details need to be cleaned up; the authors acknowledge some of these limitations in the paper. One of the biggest problems, in my mind, is the Markovian assumption: many offensive plays involve several steps, and some passes reflect a player's trust in a play designed to produce a score several passes later.
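To make the EPV concept concrete, here is a deliberately toy version of the idea. This is my own sketch, not Cervone et al.'s actual model: the states and transition probabilities below are invented and far coarser than theirs. The possession is treated as a Markov chain, and each state's EPV is the expected points scored before the possession ends, found by value iteration.

```python
# Toy EPV: a possession as a Markov chain over a few coarse states.
# All states and transition probabilities are invented for illustration.

# terminal outcomes and their point values
points = {"turnover": 0.0, "made_2": 2.0, "made_3": 3.0, "miss": 0.0}

# made-up transition probabilities for the non-terminal states
trans = {
    "perimeter":   {"drive": 0.35, "open_3": 0.25, "perimeter": 0.30, "turnover": 0.10},
    "drive":       {"rim_attempt": 0.50, "perimeter": 0.35, "turnover": 0.15},
    "open_3":      {"made_3": 0.37, "miss": 0.63},
    "rim_attempt": {"made_2": 0.60, "miss": 0.40},
}

def solve_epv(iters=200):
    """Value iteration: repeatedly set EPV(s) = sum over s' of P(s -> s') * EPV(s')."""
    v = dict(points)                   # terminal states have fixed values
    v.update({s: 0.0 for s in trans})  # non-terminal states start at 0
    for _ in range(iters):
        for s, dests in trans.items():
            v[s] = sum(p * v[d] for d, p in dests.items())
    return v

v = solve_epv()
for s in trans:
    print(s, round(v[s], 3))  # e.g. open_3 -> 0.37 * 3 = 1.11
```

The Markov objection shows up directly in this sketch: the chain only "remembers" the current state, so a pass made purely to set up a score two passes later gets no special credit.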
Re: 2014 Sloan Research Papers Posted
Posted: Wed Feb 05, 2014 9:27 pm
by nileriver
I find it interesting that there is a paper on how to interpret SportVU data. I am sure that will be helpful for teams who have yet to do much analysis with it.
Re: 2014 Sloan Research Papers Posted
Posted: Thu Feb 06, 2014 7:29 pm
by Crow
The boxscoregeeks.com writers hint at something coming at or after Sloan.
http://www.boxscoregeeks.com/articles/m ... -mid-range
Will look for it.
Has anyone got thoughts on the rebound paper? I will read it later.
Re: 2014 Sloan Research Papers Posted
Posted: Thu Feb 06, 2014 8:24 pm
by steveshea
Goldsberry writes about his EPV paper (with Cervone and others):
http://grantland.com/features/expected- ... analytics/
Goldsberry's group uses advanced economic/mathematical models and a tremendous amount of computing power to arrive at their metric EPV. This leads Goldsberry to wonder if NBA teams are currently equipped to handle this type of analysis:
"How many NBA teams have employees who even know what a competing risk model is, let alone people who know how to design and implement one? How many NBA teams have computational clusters or employees who know how to use them? Those answers may not be zero, but they are surely much closer to zero than they are to 30."
There is so much potential in SportVU's data. At the same time, I agree with Goldsberry that it provides new challenges for NBA teams (and maybe eventually more college teams). I hope these teams invest more money in their analytics in the future. Will this paper help persuade them or will owners scoff at an offensive metric that puts players like Kevin Love and Paul George in the NBA's bottom 10? I look forward to the conversations in Boston in a few weeks.
Re: 2014 Sloan Research Papers Posted
Posted: Thu Feb 06, 2014 8:32 pm
by steveshea
Crow wrote:
Has anyone got thoughts on the rebound paper? I will read it later.
I like their "crashing and blocking out" subcomponent. It seems to have potential to identify good defensive rebounding guards and small forwards (players that are not typically in a great rebounding position when the shot goes up). Their top 10 in this category includes Lance Stephenson, Paul George and Reggie Jackson.
Re: 2014 Sloan Research Papers Posted
Posted: Sat Mar 22, 2014 7:56 pm
by wilq
Random SSAC question, will there be a video of each or any panel this year?
AFAIR they were available in the past but I can't find anything for 2014 discussions...
Re: 2014 Sloan Research Papers Posted
Posted: Sat Mar 22, 2014 9:51 pm
by Crow
I recall that the videos come out some months after the event. I don't recall if it's more than 3 months or 6+, but it's not right away, probably to protect ticket sales for the event, or at least the sense of inside insight for those who paid and attended.
Re: 2014 Sloan Research Papers Posted
Posted: Sat Mar 22, 2014 9:59 pm
by Crow
Arturo G. submitted a paper to Sloan. It was not selected but I just found this article on it:
http://www.boxscoregeeks.com/articles/c ... or-stretch