Muse
EyeBall

Ball tracking

Follow the ball.
Understand the play.

EyeBall follows the basketball across game film. A Kalman filter stitches noisy detections into a smooth trajectory; its shape tells you when passes happen. Layer in other signals -- dribbles, player position, clock state -- and you get actions, and from actions, plays: the coaching surface the box score never captures. The same primitive also drives a synthetic broadcast camera for amateur footage, no operator required.
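To make the core idea concrete, here is a minimal constant-velocity Kalman filter over noisy 2-D ball detections. It is an illustrative sketch, not the production tracker: the state model, and the `dt`, `q`, and `r` noise values are assumptions, and missed frames are represented as `None`.

```python
import numpy as np

def kalman_smooth(detections, dt=1.0, q=1e-2, r=4.0):
    """Constant-velocity Kalman filter over 2-D ball detections.

    detections: list of (x, y) points, or None for a missed frame.
    Returns one smoothed (x, y) estimate per frame. dt, q (process
    noise) and r (measurement noise) are illustrative assumptions.
    """
    # State: [x, y, vx, vy]; we measure position only.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)
    Q = q * np.eye(4)        # process noise
    R = r * np.eye(2)        # measurement noise
    x = np.zeros(4)
    P = np.eye(4) * 1e3      # high initial uncertainty
    out = []
    for z in detections:
        # Predict: coast on the motion model (covers occlusions).
        x = F @ x
        P = F @ P @ F.T + Q
        if z is not None:    # Update only when the ball was seen.
            innov = np.asarray(z, float) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ innov
            P = (np.eye(4) - K @ H) @ P
        out.append((x[0], x[1]))
    return out
```

The predict step is what carries the trajectory through occlusions: when the ball vanishes behind a body, the filter coasts on its last velocity estimate until detections resume.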

Case 01 -- Tight tracking

Jordan, drive to the rim.

A classic crossover-and-finish. The ball changes direction and speed twice per second and briefly disappears behind MJ's body at the rim. The system holds the trajectory through all of it -- the kind of possession the box score never explains. That's the starting point for any downstream analytics: if you can't follow the ball, you can't measure the play.

Case 02 -- Possession breakdown

Spurs at Houston, half-court.

One half-court possession. The system watches the ball move between players and logs a timestamp each time the trajectory changes direction sharply enough to register as a pass. It also over-counts here -- classical CV has no way to distinguish a real pass from a crisp dribble reversal. That's the inherent limitation of a ball-only signal.

Passes are a starting point, not the answer. Combine them with dribble detection, player position, and shot-clock state and you get action detection -- pick-and-roll, hand-off, off-ball screen, isolation -- and from there, play detection. That's the coaching surface I'm building toward: the box score never tells you which actions actually got run.

Under the hood, no training data required: HSV + contour detection, Kalman smoothing, outlier rejection, and an angle-based pass detector with a cooldown gate.

Detection -- HSV + contours
Smoothing -- Two-pass Savgol + outlier rejection
Events -- Angle-based pass detection
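The event stage can be sketched as follows: compare consecutive velocity vectors along the smoothed trajectory, flag a pass when the direction swings past a threshold, and gate with a cooldown so one sharp turn spanning several frames counts once. All thresholds below are assumptions for illustration, not the tuned production values.

```python
import math

def detect_passes(traj, fps=30.0, angle_thresh_deg=60.0,
                  cooldown_s=0.8, min_speed=3.0):
    """Flag a pass whenever the trajectory's direction swings sharply.

    traj: list of (x, y) smoothed ball positions, one per frame.
    Returns pass timestamps in seconds. Thresholds are illustrative.
    """
    passes, last = [], float("-inf")
    for i in range(2, len(traj)):
        (x0, y0), (x1, y1), (x2, y2) = traj[i - 2], traj[i - 1], traj[i]
        v1 = (x1 - x0, y1 - y0)
        v2 = (x2 - x1, y2 - y1)
        s1, s2 = math.hypot(*v1), math.hypot(*v2)
        if s1 < min_speed or s2 < min_speed:
            continue  # too slow to be a pass
        cosang = (v1[0] * v2[0] + v1[1] * v2[1]) / (s1 * s2)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
        t = i / fps
        # Cooldown gate: one turn can trip the threshold on several frames.
        if angle >= angle_thresh_deg and t - last >= cooldown_s:
            passes.append(round(t, 2))
            last = t
    return passes
```

The min-speed check is why dribble reversals still sneak through: a hard crossover is both fast and sharp, so a ball-only detector cannot tell it apart from a short pass.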

Output -- Houston half-court run

Passes detected -- 7
Detection rate -- 70% (294 / 420 frames)
Clip duration -- 14.0s
Trajectory -- 4,796px total path length

Pass timeline (seconds): 0.2, 1.7, 6.1, 6.9, 7.9, 9.6, 12.3

Downstream application

Virtual panning.
No operator, still a broadcast.

Almost no amateur game has a camera operator. The system turns a stationary wide panorama into a broadcast-style feed by tracking the action and rendering a synthetic 16:9 pan over it. Motion inside a court ROI drives the pan target; a Savitzky-Golay filter keeps the crop from jittering.
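A minimal sketch of the pan planner, assuming per-frame action x-coordinates are already available from the ROI motion step. The `window` and `poly` Savitzky-Golay settings and the function name are illustrative, not the production values.

```python
import numpy as np
from scipy.signal import savgol_filter

def plan_pan(targets_x, pano_w, crop_w, window=31, poly=2):
    """Turn per-frame action x-coordinates into smooth crop origins.

    targets_x: raw pan targets (e.g. motion centroids inside the court
    ROI), one per frame. Returns the left edge of a crop_w-wide 16:9
    window for each frame. window/poly are assumed settings.
    """
    x = np.asarray(targets_x, float)
    # savgol_filter needs an odd window no longer than the signal.
    window = min(window, len(x) if len(x) % 2 else len(x) - 1)
    smooth = savgol_filter(x, window_length=window, polyorder=poly)
    # Centre the crop on the smoothed target, clamped to the panorama.
    left = np.clip(smooth - crop_w / 2, 0, pano_w - crop_w)
    return left.astype(int)
```

The clamp matters as much as the smoothing: when the action runs to a baseline, the virtual camera should settle against the panorama's edge rather than pan off it.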

Source

Output

Engineering note

No training data? Generate it with NBA 2K.

Good labelled basketball film -- with the ball picked out on every frame -- is rare. Before the classical-CV pipeline was running, I trained the early ball-detection models on synthetic data generated from NBA 2K: realistic geometry, a controllable camera, and a game engine that could produce labelled frames on demand. Not a shortcut to production, but a legitimate proof of concept and a reliable test bed wherever a real labelled set doesn't exist yet.

Under the hood

Stack.