Optical tracking of player performance and in-game action is powering broadcasts, sports betting and training activities

Track160 captures player movement data from a single camera mount, the only system of its kind certified by FIFA. (Courtesy of Track160)

A discerning eye watching an elite game of football or rugby can spot a slight stretch in the fabric of a player’s shirt between the shoulder blades. This is the telltale sign of a GPS device tracking their physical performance during the match: distance covered, maximum speed, maximum acceleration, number of sprints, etc.

NFL and NHL players have their own tracking sensors affixed to their pads, but NBA and MLB athletes, for example, still go without a device. The same tracking data is collected on everyone via the leagues’ advanced stats partners, Second Spectrum and Hawk-Eye, respectively, which use a series of arena-mounted cameras to triangulate each player’s position and orientation.
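Second Spectrum and Hawk-Eye do not publish their pipelines, but the geometric core of any multi-camera system is triangulation: the same point on a player, seen from two or more calibrated cameras, pins down a 3D position. A minimal sketch of that single step, using OpenCV with placeholder projection matrices and pixel coordinates (illustrative values only, not drawn from any real arena installation), might look like this:

```python
# Two-view triangulation sketch with OpenCV. All numbers are illustrative
# placeholders, not values from an actual Second Spectrum or Hawk-Eye rig.
import numpy as np
import cv2

# Shared intrinsics for two hypothetical 1920x1080 cameras.
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])

# 3x4 projection matrices K @ [R | t]; the second camera is offset 5 m along x.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-5.0], [0.0], [0.0]])])

# The same landmark (say, a player's hip center) detected in each image, in pixels.
pts1 = np.array([[960.0], [540.0]])
pts2 = np.array([[710.0], [540.0]])

# OpenCV returns homogeneous 4x1 coordinates; divide through to get 3D position.
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = (X_h[:3] / X_h[3]).ravel()
print("Estimated 3D position (metres):", X)  # roughly (0, 0, 20) for these inputs
```

Production systems repeat this step for many landmarks on every player and the ball, frame after frame, and layer detection, identity tracking and smoothing on top of the geometric core.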

Division I college football players almost uniformly wear GPS devices, but NFL scouts generally don’t have access to this data. Instead, they turn to companies like Sportlogiq and Slants, whose computer vision platforms can ingest old game videos and retroactively calculate comparable data sets.

Likewise, biomechanics was once relegated to the realm of research labs, with test subjects outfitted with dozens of reflective markers to guide the collection of 3D motion capture data. Now, lighter setups are possible, with a number of companies – Mustard and ProPlayAI in baseball, Sportsbox AI in golf, Physimax for movement – offering solutions that require only a single smartphone.

The sports world is not yet entering a full-fledged post-wearable era. On-body devices will remain relevant for the foreseeable future in measuring biometrics and internal load – heart rate, muscle activity, hydration and glucose monitoring, to name a few – as well as in certain training settings where camera coverage from a particular height or perspective is not feasible.

But the industry is going optical in a major way.

Its advantages are numerous but headlined by the most obvious: players do not need to be convinced (or contractually obliged) to wear a device. Data collection is frictionless and non-invasive for the athlete and more easily obtained by teams and broadcasters. The metrics are already intrinsically linked to the video, eliminating the need to synchronize different inputs by timestamp.

AI drives growth

As camera technology continues to improve, what’s really driving this change is the exponential growth of artificial intelligence, especially in computer vision and deep learning techniques.

Phil Cheetham, an Australian Olympic gymnast who competed in the 1976 Games, earned his doctorate in biomechanics in 2014 and has co-developed several motion analysis systems, earning him the nickname “The 3D Guy”. Cheetham was the director of sports technology and innovation for the U.S. Olympic and Paralympic Committee before taking a position as chief scientific officer of Sportsbox AI earlier this year. He had dismissed attempts at single-camera motion capture—“Quite frankly,” he said, “I thought it was impossible”—until he saw what Sportsbox had built. “We’re only talking a few degrees difference and less than an inch,” Cheetham said of his comparison to lab-grade motion capture. “It’s pretty amazing.”

There has been a shift even within the broader field of artificial intelligence. Serial entrepreneur Miky Tamir has founded several sports optical tracking companies, including SportVU, Pixellot and Track160. SportVU, later acquired by the company now called Stats Perform, became the NBA’s first tracking data partner in 2013. Track160 does more complex tracking work, including pose estimation for granular analysis of player movement, all from a single camera mount.

“Now everything is done using deep learning,” Tamir said of the machine learning technique that uses three or more layers of neural networks and attempts to mimic how the human brain works. “For example, Track160 is pure deep learning, whereas SportVU was classic computer vision. And that’s a huge difference. We are now doing things that we could only dream of 10 years ago.”
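Track160’s and Sportsbox AI’s models are proprietary, but the general idea Tamir describes – a deep network estimating body pose directly from ordinary video – can be illustrated with the open-source MediaPipe library. In the sketch below, "frame.jpg" is a placeholder path to a single broadcast or smartphone frame, and MediaPipe stands in for the companies’ own trained models:

```python
# Single-camera, markerless pose estimation sketch using the open-source
# MediaPipe library -- a stand-in for the proprietary deep-learning models
# that commercial tracking companies train themselves.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# Placeholder path to one video frame.
image_bgr = cv2.imread("frame.jpg")
image_rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)

with mp_pose.Pose(static_image_mode=True, model_complexity=1) as pose:
    results = pose.process(image_rgb)

if results.pose_landmarks:
    # 33 body landmarks, each with normalized x/y, relative depth z, and a
    # visibility score -- the raw material for joint angles and stride metrics.
    for idx, lm in enumerate(results.pose_landmarks.landmark):
        print(idx, round(lm.x, 3), round(lm.y, 3), round(lm.z, 3), round(lm.visibility, 2))
```

Per-joint coordinates like these, aggregated over time, are the raw material for the joint-angle and movement metrics that labs once needed dozens of reflective markers to capture.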

The American Sports Medicine Institute launched its first performance development partnership with a purely camera-based motion capture system, DARI Motion, in July 2020, a step ASMI research director Glenn Fleisig called “the next revolution.”

Kyle Boddy, founder of the pioneering Driveline Baseball training facility who also served as a pitching coordinator for the Cincinnati Reds, said “validated, mobile, low-cost biomechanical tracking,” like from a phone, “will change the game forever.” He likened its impact to the voluminous research on pitch tracking spurred by Pitch F/X and TrackMan.

DARI Motion uses markerless motion capture – relying only on data extracted from video – and works with healthcare partners such as the American Sports Medicine Institute and the Hospital for Special Surgery. (Courtesy of DARI Motion)

Automated production

All this increasingly accurate and accessible data will ideally improve player performance, prevent injuries and guide rehabilitation protocols. But it has also been a major boon for broadcasters, sportsbooks and even youth sports.

Automated video production companies like Pixellot can follow the action on a field while zooming, panning and changing angles without human intervention. Pixellot provides live streams for a range of sports organizations – the Jr. NBA, MLB-affiliated development leagues, Scottish professional football and the National Federation of State High School Associations – as well as for a range of broadcasters.

Pixellot streamed over 1 million games across 20,000 field and arena installations in 2021, its growth trajectory accelerated by COVID-19: restrictions on broadcast staff diminished the supply of human-directed productions, while limits on fan attendance increased the demand for remote viewing of events. Even before the pandemic, Pixellot had partnered with ESPN to broadcast a wide range of America East Conference games on ESPN+, and the advent of name, image and likeness marketing deals has further heightened interest from NCAA institutions.

“With NIL, more content is king because if players want to monetize their name, image and likeness, they need content behind it,” said David Shapiro, president of Pixellot’s North American business. “So being able to capture every practice and every game becomes paramount for every athletic department.”

Feeding sports betting

The same is true for sports betting. Betting streams are already mature in Europe – where Pixellot has worked most closely with Genius Sports – and the U.S. is a fertile growth market as adoption increases and more sportsbooks begin streaming games in their apps. “What betting houses have found is that if you can watch the game, you’re more likely to bet on the game,” Shapiro said.

Sportsbooks have shown a healthy appetite for these advanced data feeds, both to inform their odds and to create unique prop bets. In its partnership with MLB, for example, MGM Resorts claimed exclusive access to some of the league’s Hawk-Eye-generated Statcast data. “The data itself is really what the business is around,” said Dorian Pieracci, CEO and co-founder of Movrs, a new entrant in the optical motion capture space, adding that its “most valuable” use case is in the sports betting world.

LVision began using computer vision to extract information from tennis and has since expanded to 20 sports. Its insights, said CEO and founder Ido Lazar, have contributed to a 20% increase in bets placed and a 15% increase in the average bet. “We measure everything,” he said. “Every league, every bet, everything. And the results are really, really breathtaking.”

Training aims

Smartphone-based technologies will democratize advanced coaching methods and provide more instruction and feedback to young players. NBA partner HomeCourt has tracked more than 100 million basketball shots, and SwingVision, whose partners include the tennis governing bodies of England and Australia as well as U.S. universities, is now on track to do the same in tennis. Both also automatically clip highlights.

“It’ll just be the norm that kids today grow up knowing and accepting, ‘Oh, I know I can capture my movement to improve myself as an athlete or whatever,'” Pieracci said. “Computer vision is perhaps the most studied part of AI from an implementation perspective. You see it in cars, you see it on your phone. It’s being talked about here for sports. It’s predominant. It was just a matter of computing power coming in and [collecting] enough training data” to refine the algorithms.

Sig Mejdal, the Baltimore Orioles assistant general manager who oversees the club’s analytics after holding similar positions with the Houston Astros and St. Louis Cardinals, has been a keen observer of emerging tech trends for nearly two decades. Better devices mean better data and better decisions. Pitch-tracking radars proliferated in baseball, but they couldn’t measure, or could only statistically estimate, certain metrics such as ball orientation and spin – information that camera-based systems have no trouble observing, with the potential to collect far more data.

“Computer vision and optical tracking,” Mejdal said, “are only limited by your imagination.”

SportTechie is now part of Leaders Group. To learn more about the intersection of sport and technology, visit sporttechie.com.
