In this paper, we introduce a target-free calibration method for automotive augmented reality head-up displays (AR-HUDs) that can be performed fully automatically with a smartphone camera. Our method requires no calibration target to be set up in front of the vehicle. Instead, it uses feature points of the surrounding environment, which makes it robust against misplaced targets and allows for easy deployment, e.g., in garages. Under the pinhole model assumption, we decouple the perspective projection matrix into three parts: the intrinsic matrix, the relative pose between the vehicle's 3D sensor and the smartphone camera, and the rotation between the camera space and the HUD field of view (HUD-FOV). We obtain the relative pose from the epipolar constraint, and the intrinsic and rotation matrices are likewise determined without any pre-designed calibration target. The calibration itself takes less than 5 minutes for an eye box with 9 training viewpoints. With our new approach, we achieve a competitive average reprojection error of 6.7 mm at a distance of 7.5 m, comparable to previous work that relies on dedicated targets.
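As an illustrative sketch of this decoupling (the notation below is ours and not taken verbatim from the method), a homogeneous scene point $\tilde{\mathbf{X}}_{\mathrm{sensor}}$ measured by the vehicle's 3D sensor could, under the pinhole model, be mapped into the HUD-FOV as
\[
  \mathbf{x} \;\sim\; \mathbf{K}\,
  \mathbf{R}_{\mathrm{HUD}\leftarrow\mathrm{cam}}\,
  \bigl[\,\mathbf{R}_{\mathrm{cam}\leftarrow\mathrm{sensor}} \mid \mathbf{t}_{\mathrm{cam}\leftarrow\mathrm{sensor}}\,\bigr]\,
  \tilde{\mathbf{X}}_{\mathrm{sensor}},
\]
where $\mathbf{K}$ denotes the intrinsic matrix, $[\mathbf{R}_{\mathrm{cam}\leftarrow\mathrm{sensor}} \mid \mathbf{t}_{\mathrm{cam}\leftarrow\mathrm{sensor}}]$ the relative pose between the 3D sensor and the smartphone camera recovered from the epipolar constraint, and $\mathbf{R}_{\mathrm{HUD}\leftarrow\mathrm{cam}}$ the rotation from camera space into the HUD-FOV.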