
Real-time human pose tracking for schools — an educational AI demo using Python, MediaPipe, and PyBullet on Jetson AGX Orin.


🚀 Real-Time Pose Tracking – AI Prototype

Prototype developed for the Machine Learning Group at UIT – The Arctic University of Norway. Designed for educational demonstrations at the Nordnorsk Vitensenter.

📜 Project Overview

This prototype showcases real-time human motion tracking using AI. The system captures a live video feed from a camera, detects body movements, and instantly maps them onto a 3D avatar. The goal is to provide a fun and educational interactive experience for children visiting the Nordnorsk Vitensenter.
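To make the detect-and-map step concrete: MediaPipe returns pose landmarks normalized to the image frame (x and y in [0, 1], z on roughly the same scale as x), so they must be rescaled before they can drive a 3D avatar. A minimal, illustrative helper (the function name and the commented camera loop are assumptions, not the repository's actual code):

```python
import numpy as np

def landmarks_to_camera_space(landmarks, frame_w, frame_h):
    """Convert MediaPipe-style normalized landmarks (x, y in [0, 1],
    z on roughly the same scale as x) into pixel-space coordinates.

    `landmarks` is an (N, 3) array-like of normalized (x, y, z) triples.
    """
    pts = np.asarray(landmarks, dtype=float)
    out = pts.copy()
    out[:, 0] *= frame_w    # x: fraction of frame width  -> pixels
    out[:, 1] *= frame_h    # y: fraction of frame height -> pixels
    out[:, 2] *= frame_w    # z: scaled like x, per MediaPipe's convention
    return out

# In the real pipeline this would run once per captured frame, e.g.:
#   results = pose.process(rgb_frame)                  # MediaPipe Pose
#   pts = landmarks_to_camera_space(raw_pts, w, h)     # then feed the avatar
```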

The application was developed as part of a two-month internship and is intended as a proof-of-concept that can be further improved and deployed in interactive exhibitions.

✨ Features

  • 🔍 Pose keypoint detection using MediaPipe (33 body landmarks in 3D)
  • ⚙️ Kalman filtering for smoothing and short-term prediction of marker positions
  • 🌟 3D avatar animation using Panda3D
  • 🎯 Coordinate conversion from Cartesian positions to rotation vectors and Euler angles
  • ⚡ Real-time performance tested on Nvidia Jetson AGX Orin
  • 📄 Open-source: only free and open-source libraries used
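To illustrate the kind of smoothing and short-term prediction the Kalman step performs, here is a generic constant-velocity filter in NumPy. This is a sketch of the technique, not the repository's OpenCV-based implementation; all names are illustrative:

```python
import numpy as np

class ConstantVelocityKalman:
    """1-D constant-velocity Kalman filter: state = [position, velocity]."""

    def __init__(self, q=1e-3, r=1e-2):
        self.x = np.zeros(2)               # state estimate
        self.P = np.eye(2)                 # state covariance
        self.F = np.array([[1.0, 1.0],     # transition: pos += vel (dt = 1 frame)
                           [0.0, 1.0]])
        self.H = np.array([[1.0, 0.0]])    # we only measure position
        self.Q = q * np.eye(2)             # process noise
        self.R = np.array([[r]])           # measurement noise

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with measurement z
        y = z - self.H @ self.x                     # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]                            # smoothed position
```

In the prototype, one such filter would run per landmark coordinate; OpenCV's `cv2.KalmanFilter` packages the same predict/update equations.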

💡 AI Concepts Covered

  • Computer Vision: 3D pose estimation
  • Filtering & Prediction: Kalman filter implementation for motion smoothing
  • 3D Transformation: converting Cartesian coordinates to quaternion-based rotations
  • Embedded Real-Time Processing: performance optimization on Jetson hardware
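The Cartesian-to-rotation step can be sketched as follows: given a limb segment's direction (child joint minus parent joint), find the quaternion that rotates the avatar's rest-pose direction onto it. This is generic 3D math, not the repository's code, and the names are illustrative:

```python
import numpy as np

def quat_between(a, b):
    """Unit quaternion (w, x, y, z) rotating direction a onto direction b.

    Uses the shortest-arc construction q ∝ (1 + a·b, a×b); degenerate
    when a and b are exactly opposite (any perpendicular axis works then).
    """
    a = np.asarray(a, float); a = a / np.linalg.norm(a)
    b = np.asarray(b, float); b = b / np.linalg.norm(b)
    q = np.array([1.0 + float(np.dot(a, b)), *np.cross(a, b)])
    return q / np.linalg.norm(q)

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, u = q[0], q[1:]
    v = np.asarray(v, float)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)
```

Applying `quat_between` per bone yields the rotations that pose the skeleton; Euler angles, when needed, can be extracted from the same quaternions.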

⚙️ Technical Configuration

| Component | Technology | Rationale |
| --- | --- | --- |
| Pose detection | MediaPipe | Lightweight, embedded-friendly 3D keypoint detection |
| Filtering | OpenCV (Kalman) | Predictive smoothing of motion data |
| 3D engine | Panda3D | Open-source, Python-compatible 3D rendering engine |
| Physics simulation | PyBullet | Additional 3D model manipulation capabilities |
| Language | Python 3 | Readable, widely taught, cross-platform |

🛠️ Installation & Execution

Note: This prototype has been tested on the Nvidia Jetson AGX Orin. With all dependencies installed, it should also run on a standard Linux or Windows environment.

```bash
# 1. Clone the repository
git clone <repository_url>
cd real_time_pose_tracking

# 2. Create a virtual environment
python3 -m venv venv
source venv/bin/activate   # On Windows: venv\Scripts\activate

# 3. Install dependencies
pip install -r requirements.txt

# 4. Run the prototype
python3 main.py
```

📃 Documentation

Full installation and technical documentation is available in the repository: CR_Installation_Real_Time_Pose_Tracking_2025_en.pdf

⚠️ Current Limitations (Prototype Status)

  • Filter configuration is empirical; parameters should be tuned methodically.
  • Incomplete animation of some body parts (e.g., head not fully animated).
  • No animation for fingers and toes.
  • Basic 3D model: lacks detailed visuals; can be replaced with a more polished avatar.
  • No graphical interface: currently CLI-based.

🛡️ License

This project uses only free and open-source software. Please refer to the repository's license file for details.

👥 Contributors

  • Benjamin Poireault (PHELMA / E3, Grenoble INP) – AI workflow creation, pose detection pipeline
  • Arthur Deforge (ENSI CAEN) – 3D engine integration, code structure, documentation

Supervisors: Benjamin Ricaud, Samuel Kuttner, Steffen Aagaard Sørensen, Erik Heggeli
