BowlSense Learn

Your iPhone Already Knows Your Ball Layout

We built an AR ball scanning feature that uses iPhone LiDAR to measure bowling ball layouts with 2% accuracy. Model registration beats 3D scanning.

Every layout measurement on your bowling ball — pin to CG, pin to PAP, span, bridge — comes down to distances between points on a sphere. And your iPhone Pro already has a depth sensor that can see that sphere sitting on your kitchen table. So we built a feature that lets you scan your ball with your phone and get accurate layout measurements in about 30 seconds. No pro shop visit, no ball spinner, no tape measure.
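Concretely, once you know the ball's center and radius, any layout measurement reduces to the arc length between two marked surface points. A minimal sketch of that idea (function name and the example values are illustrative, not our production code):

```python
import numpy as np

def surface_distance(p1, p2, center, radius):
    """Over-the-surface (geodesic) distance between two marks on a ball.

    Layout measurements are quoted along the surface, so we want the
    arc length, not the straight-line chord through the ball.
    """
    v1 = (np.asarray(p1, dtype=float) - center) / radius  # unit vector to first mark
    v2 = (np.asarray(p2, dtype=float) - center) / radius  # unit vector to second mark
    # Clip guards against tiny float errors pushing the dot product past +/-1.
    angle = np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0))
    return radius * angle

# Hypothetical example: a 4.25" radius ball with two marks 90 degrees apart.
center = np.zeros(3)
d = surface_distance([4.25, 0, 0], [0, 4.25, 0], center, 4.25)
# quarter circumference = pi * 4.25 / 2, about 6.68"
```

The chord-versus-arc distinction matters: for marks a few inches apart on a ball this size, the straight-line distance understates the surface measurement a pro shop would quote.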

Why This Is Hard (And Why We Almost Got It Wrong)

Our first instinct was to throw LiDAR at the problem: point the depth sensor at the ball, reconstruct the surface as a 3D point cloud, auto-detect the finger holes, find the pin and CG marks. Full automation. It didn't work. Here's why: bowling balls are terrible LiDAR targets. The polished reactive cover reflects infrared unpredictably.

A matte wall returns tens of thousands of clean depth points. A glossy bowling ball? Maybe 500 noisy ones. The finger holes — matte drilled fiberglass — actually gave better returns than the ball surface itself. We tried RANSAC sphere fitting (random sampling to find the best-fit sphere), concavity clustering for hole detection, even RGB computer vision to find the colored pin and CG marks.
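For readers unfamiliar with the technique: RANSAC sphere fitting repeatedly fits a sphere to a few random depth points and keeps whichever fit the most points agree with, which makes it tolerant of exactly the kind of noisy returns described above. Here's a generic NumPy sketch of the idea — not our Swift implementation, and the iteration count and tolerance are illustrative:

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere through an (N, 3) array of points, using the
    linear form x^2 + y^2 + z^2 + D*x + E*y + F*z + G = 0."""
    A = np.c_[points, np.ones(len(points))]
    b = -np.sum(points ** 2, axis=1)
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = -coef[:3] / 2.0
    radius = np.sqrt(center @ center - coef[3])
    return center, radius

def ransac_sphere(points, iters=300, tol=0.002, rng=None):
    """RANSAC: fit spheres to random 4-point samples, keep the fit with
    the most inliers (points within `tol` of the surface), then refit
    on all inliers. `tol` here is 2 mm, assuming meters."""
    rng = np.random.default_rng(rng)
    best_inliers = None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 4, replace=False)]
        center, radius = fit_sphere(sample)
        if not np.isfinite(radius):  # degenerate (near-coplanar) sample
            continue
        residual = np.abs(np.linalg.norm(points - center, axis=1) - radius)
        inliers = residual < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_sphere(points[best_inliers])
```

The appeal is that a handful of outliers — spurious reflections off the glossy cover — can't drag the fit, because only the consensus set votes. The catch, as noted above, is that with only a few hundred noisy points to begin with, even a robust fit has little to work with.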