A Swift iOS application that visualizes head orientation data from AirPods Pro using CoreMotion's CMHeadphoneMotionManager.
- Real-time head orientation tracking with AirPods Pro
- 3D visual representation of head position
- Displays pitch, roll, and yaw values with intuitive visualizations
- Automatic connection detection for AirPods Pro (see the delegate sketch after this list)
- Works with AirPods Pro connected before or after app launch
- Debug information for troubleshooting connections
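Connection detection is the part of this that CoreMotion exposes through `CMHeadphoneMotionManagerDelegate`. A minimal sketch of how an app might observe connect/disconnect events; the `ConnectionObserver` name is illustrative and not taken from this project:

```swift
import CoreMotion

/// Hypothetical observer showing CMHeadphoneMotionManagerDelegate usage;
/// the real app's types and handling may differ.
final class ConnectionObserver: NSObject, CMHeadphoneMotionManagerDelegate {
    let manager = CMHeadphoneMotionManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Called when AirPods Pro (or other supported headphones) connect.
    func headphoneMotionManagerDidConnect(_ manager: CMHeadphoneMotionManager) {
        print("Headphones connected")
    }

    // Called when they disconnect, e.g. when placed back in the case.
    func headphoneMotionManagerDidDisconnect(_ manager: CMHeadphoneMotionManager) {
        print("Headphones disconnected")
    }
}
```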
- iOS 17.0+
- Xcode 15.0+
- AirPods Pro (1st or 2nd generation)
- iPhone compatible with AirPods Pro
- Clone the repository:

  ```bash
  git clone https://github.com/ctxzz/HeadTrackerApp.git
  ```

- Open the project with Xcode:

  ```bash
  cd HeadTrackerApp
  xcodegen  # Generate the Xcode project from project.yml
  open HeadTrackerApp.xcodeproj
  ```
- Build and run the app on your iOS device
- Connect your AirPods Pro to your iPhone
- Launch the app
- The app will automatically detect your AirPods Pro
- Once connected, you'll see a 3D face that rotates to match your head movements
- The numerical values for pitch, roll, and yaw will be displayed below the visualization
If your AirPods Pro aren't detected automatically, you can tap the "Retry Connection" button.
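A "retry" here most likely just restarts the motion stream. A minimal sketch, assuming a long-lived `motionManager` instance; the function name is illustrative only:

```swift
import CoreMotion

let motionManager = CMHeadphoneMotionManager()

// Hypothetical action behind a "Retry Connection" button:
// restart updates if headphone motion is available at all.
func retryConnection() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.stopDeviceMotionUpdates()
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion else { return }
        // Attitude (pitch/roll/yaw) feeds the visualization and readouts.
        print(motion.attitude)
    }
}
```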
- Pitch: Up and down movements (nodding "yes")
- Roll: Side-to-side tilt (tilting your head toward your shoulders)
- Yaw: Left and right rotation (shaking your head "no")
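All three angles come from `CMAttitude` and are reported in radians. One plausible way to drive the 3D visualization is SwiftUI's `rotation3DEffect`, sketched below; the view and axis mapping are illustrative, not this app's actual view code:

```swift
import SwiftUI

struct HeadView: View {
    // Attitude values in radians, fed from the motion handler.
    var pitch: Double
    var roll: Double
    var yaw: Double

    var body: some View {
        Image(systemName: "face.smiling")
            .font(.system(size: 120))
            // Pitch tilts around the x-axis, yaw turns around y, roll around z.
            .rotation3DEffect(.radians(pitch), axis: (x: 1, y: 0, z: 0))
            .rotation3DEffect(.radians(yaw), axis: (x: 0, y: 1, z: 0))
            .rotation3DEffect(.radians(roll), axis: (x: 0, y: 0, z: 1))
    }
}
```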
If the app doesn't detect your AirPods Pro:
- Make sure your AirPods Pro are connected to your iPhone via Bluetooth
- Check that your AirPods Pro have sufficient battery
- Try tapping the "Retry Connection" button
- Toggle the debug info (tap the info icon in the bottom right) to see connection status
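The debug overlay presumably surfaces state that CoreMotion itself reports. A sketch of the kind of checks it could show (note that headphone motion also requires an `NSMotionUsageDescription` entry in Info.plist); the helper function is hypothetical:

```swift
import CoreMotion

// Hypothetical helper summarizing connection-related state for a debug view.
func debugStatus(for manager: CMHeadphoneMotionManager) -> String {
    let availability = manager.isDeviceMotionAvailable ? "available" : "unavailable"
    let active = manager.isDeviceMotionActive ? "active" : "inactive"

    let authorization: String
    switch CMHeadphoneMotionManager.authorizationStatus() {
    case .authorized:    authorization = "authorized"
    case .denied:        authorization = "denied"
    case .restricted:    authorization = "restricted"
    case .notDetermined: authorization = "not determined"
    @unknown default:    authorization = "unknown"
    }

    return "motion \(availability), updates \(active), permission \(authorization)"
}
```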
- Swift
- SwiftUI
- CoreMotion
- Combine
- Built using CMHeadphoneMotionManager for AirPods Pro motion data
- SwiftUI for the user interface
- Project structure generated using XcodeGen
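Putting the pieces together, the data flow is likely an `ObservableObject` wrapping `CMHeadphoneMotionManager` that publishes the latest angles for a SwiftUI view to observe. A hedged sketch; `HeadTrackerModel` and its properties are assumed names, not the project's actual types:

```swift
import SwiftUI
import CoreMotion

// Hypothetical view model; the real project's class and property names may differ.
final class HeadTrackerModel: ObservableObject {
    @Published var pitch: Double = 0
    @Published var roll: Double = 0
    @Published var yaw: Double = 0

    private let manager = CMHeadphoneMotionManager()

    func start() {
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let attitude = motion?.attitude else { return }
            self?.pitch = attitude.pitch
            self?.roll = attitude.roll
            self?.yaw = attitude.yaw
        }
    }

    func stop() { manager.stopDeviceMotionUpdates() }
}

struct ContentView: View {
    @StateObject private var model = HeadTrackerModel()

    var body: some View {
        Text(String(format: "pitch %.2f  roll %.2f  yaw %.2f",
                    model.pitch, model.roll, model.yaw))
            .onAppear { model.start() }
            .onDisappear { model.stop() }
    }
}
```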