Project Overview
Project AURA (Vision-Controlled IoT-Enabled Robotic Arm) represents a breakthrough in accessible robotics. During the Internet of Things course in Fall 2025 at Thompson Rivers University, our team developed a solution that bridges the gap between industrial-grade automation and human safety.
Video walkthrough: a quick tour of the AURA robotic arm project and its capabilities. Detailed explanations follow.
The Core Mission
In the middle of a bustling factory floor, there is a worker standing in front of a hydraulic press. Their job is simple but terrifying: reach in, place a part, pull back, and repeat. For eight hours a day, they dance with a machine that could change their life in a split second of fatigue.
This is where the inspiration for AURA began. We saw a massive disconnect in the industry: a gap we call "The Complexity Gap." On one side, you have million-dollar industrial robots that are safe but require a PhD to program. On the other, you have manual labor that is "cheap" but puts human lives at risk. Our mission was clear: build a bridge across that gap.
The Complexity Gap Problem
The robotics industry faces a critical accessibility challenge that AURA directly addresses.
Traditional Industrial Robots
• Cost: $100,000+
• Programming: Requires PhD-level expertise
• Accessibility: Only available to major corporations
• Deployment Time: Months
Manual Labor Reality
• Cost: Low per-unit labor
• Safety: High risk of injury
• Sustainability: High employee burnout
• Precision: Variable and inconsistent
AURA Solution
• Cost: <$100 prototype (<$30 at scale)
• Programming: No code needed
• Accessibility: Democratized robotics
• Deployment: Immediate
The "dull, dirty, and dangerous" jobs represent some of the most critical opportunities for automation. Yet the tools to automate these tasks have traditionally been locked behind massive price tags and technical barriers. AURA shatters both limitations.
Innovation Through Collaboration
Team & Leadership
Project AURA is the result of an exceptional collaborative effort. Developed during the Internet of Things (IoT) course in Fall 2025 at Thompson Rivers University, our team was guided by the expertise of Dr. Anthony Aighobahi.
Core Team Members:
- Gursahib Singh
- Deepansh Sharma
- Yassh Singh
- Noori Arora
- Vansh Sethi
Together, we wanted to prove that high-end automation shouldn't be a luxury reserved for the world's richest corporations. Every member brought their unique expertise to create something truly transformative.
Solving the Safety Crisis: The No-Code Paradigm
The revolutionary insight behind AURA is simple yet powerful: workers shouldn't need to be programmers to operate robots. We implemented a breakthrough "No-Code" automation paradigm that fundamentally changes how humans interact with robotics.
How No-Code Operation Works
- 1. Visual Recognition: A worker simply shows their hand movements to the webcam. The robot observes exactly what needs to be done.
- 2. Intent Processing: Using Google MediaPipe for human pose estimation and OpenAI's GPT-4o-mini for semantic understanding, the system processes the human's intent.
- 3. Instant Mirroring: The AURA robotic arm mirrors the human hand movements in real time with precision control.
- 4. Life-Saving Deployment: By replacing a human hand with AURA's 3D-printed grip in high-risk environments like hydraulic presses, we save lives while improving efficiency.
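The mirroring step above can be sketched in a few lines of Python. This is a minimal illustration, assuming MediaPipe-style landmarks with coordinates normalized to [0, 1]; the `0.3` pinch-distance scale and the servo angle ranges are assumptions for the example, not values from the project:

```python
import math

def to_servo_angle(norm: float, lo: int = 0, hi: int = 180) -> int:
    """Clamp a normalized [0, 1] landmark coordinate and map it to a servo angle."""
    norm = max(0.0, min(1.0, norm))
    return round(lo + norm * (hi - lo))

def grip_angle(thumb_tip, index_tip, closed: int = 10, open_: int = 90) -> int:
    """Map the thumb-index pinch distance to a gripper servo angle.

    Landmarks are (x, y) pairs normalized to the frame, so a fully open
    pinch spans roughly 0.3 in normalized units (an assumed scale).
    """
    d = math.dist(thumb_tip, index_tip)
    openness = min(d / 0.3, 1.0)
    return round(closed + openness * (open_ - closed))

# Example: a wrist halfway across the frame steers the base servo to 90 degrees,
# and touching fingertips command a closed grip.
base = to_servo_angle(0.5)
grip = grip_angle((0.30, 0.50), (0.30, 0.50))
```

In the full system these mappings would be fed by the landmark stream MediaPipe produces per frame; the sketch only shows the gesture-to-angle translation at the heart of "show, don't tell."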
The Technology Stack
Computer Vision Pipeline:
Google MediaPipe detects hand keypoints and human pose in real time
AI Understanding:
OpenAI GPT-4o-mini processes visual intent into robotic commands
Microcontroller:
Arduino-based control for servo motors and arm coordination
IoT Connectivity:
Firebase for telemetry, logging, and real-time synchronization
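To make the IoT leg of the stack concrete, here is a hedged sketch of how telemetry records could reach Firebase's Realtime Database over its REST API. The database URL, path, and record fields below are placeholders for illustration, not the team's actual schema:

```python
import json
import time
import urllib.request

DB_URL = "https://example-project.firebaseio.com"  # placeholder project URL

def build_telemetry(joint_angles, gesture):
    """Assemble one telemetry record; the timestamp lets a dashboard order events."""
    return {
        "ts": int(time.time() * 1000),   # epoch milliseconds
        "gesture": gesture,
        "joints": list(joint_angles),
    }

def push_telemetry(record, path="/telemetry"):
    """POST a record to the Realtime Database REST endpoint (<path>.json)."""
    req = urllib.request.Request(
        DB_URL + path + ".json",
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # Firebase replies with {"name": "<generated key>"}

record = build_telemetry([90, 45, 120, 10, 0, 85], "grab")
```

A real deployment would also attach authentication (a database secret or an ID token as a query parameter), which is omitted here.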

The AURA robotic arm in action
High Impact, Low Cost: Democratizing Robotics
Perhaps the most "disruptive" part of Project AURA is the price tag. While traditional industrial arms cost as much as a luxury car, our prototype was built for under $100.
Prototype Cost: <$100
- 3D-printed components (PLA plastic)
- Entry-level servo motors
- Arduino microcontroller
- Standard webcam & connectivity modules
Scale Production: <$30
- Injection molding cost reduction (60-70%)
- Bulk motor procurement (30-40% savings)
- Integrated circuit board manufacturing
- Economies of scale across supply chain
🎯 The Vision
This represents a total democratization of robotics. We are moving toward a future where a small family-owned workshop can have the same safety standards and precision as a massive automotive plant. Safety should never be a luxury; it should be universal.
Academic Research & International Publication
Project AURA is more than just a functional prototype: it is a contribution to the scientific community. Our team has authored a full-length technical research paper detailing:
Paper Topics
- Kinematic modeling and inverse kinematics
- IoT telemetry via Firebase infrastructure
- Neural pipeline for no-code programming
- Computer vision integration methodology
- Accessibility-first robotics design
Publication Status
Our work is currently in the publication process with a reputable international conference.
We believe the data and findings inside this paper will serve as a blueprint for the next generation of accessible robotics and inspire other researchers to pursue democratized automation solutions.
Once the publication process is finalized, the full manuscript and findings will be shared publicly to benefit the research community.
Technical Architecture
AURA's architecture combines multiple sophisticated systems working in harmony:

Complete system architecture from vision input to IoT telemetry
Key Technical Components
MediaPipe Hand Tracking
Detects 21 hand landmarks in 3D space with ~95% accuracy at 30 fps
Servo Motor Array
6-degree-of-freedom arm with synchronized servo control
Inverse Kinematics Engine
Calculates joint angles from target end-effector positions
Firebase Integration
Cloud-based telemetry for monitoring and analytics
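The project's full 6-DOF inverse kinematics are detailed in the research paper. As an illustration of what the IK engine does, here is the textbook closed-form solution for a planar two-link arm (a simplification of the real arm; the link lengths are placeholders):

```python
import math

def two_link_ik(x: float, y: float, l1: float, l2: float):
    """Closed-form IK for a planar two-link arm (elbow-down branch).

    Returns the shoulder and elbow angles (radians) that place the
    end effector at (x, y), or raises if the target is unreachable.
    """
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)                          # elbow angle
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)  # shoulder angle
    return theta1, theta2
```

A quick sanity check is to run forward kinematics on the result: `l1*cos(theta1) + l2*cos(theta1 + theta2)` should recover the target `x`, and the sine terms the target `y`.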
Data Flow
- 1. Capture: Webcam captures user hand gesture
- 2. Process: MediaPipe extracts keypoints (21 landmarks)
- 3. Understand: GPT-4o-mini interprets intent
- 4. Calculate: IK engine computes servo angles
- 5. Execute: Arduino drives motors in sync
- 6. Log: Firebase records telemetry data
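Steps 4-5 hand six joint angles from the IK engine to the Arduino. One common approach is a line-oriented serial protocol; the sketch below is a minimal illustration, and the `S:` framing is an assumption for the example rather than the project's actual wire format:

```python
def encode_command(angles):
    """Serialize six joint angles (degrees) into one serial command line.

    Angles are clamped to the 0-180 range hobby servos accept; the
    Arduino side would split the line on commas and call servo.write()
    for each joint.
    """
    if len(angles) != 6:
        raise ValueError("expected 6 joint angles")
    clamped = [max(0, min(180, round(a))) for a in angles]
    return "S:" + ",".join(map(str, clamped)) + "\n"

# In a deployment this line would go out over a serial link, e.g. with pyserial:
#   port.write(encode_command(angles).encode())
```

Keeping the protocol human-readable makes it easy to debug with a serial monitor while the arm is running.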
Key Features
User-Facing Features
- No-Code Operation: Show, don't tell. No programming knowledge required.
- Real-Time Mirroring: Gestures translate to arm movements instantly
- Safety Override: Emergency stop and real-time control options
- Accessibility First: Designed for operators with varying physical abilities
Technical Capabilities
- Multi-Gesture Recognition: Point, grab, rotate, and custom gestures
- Adaptive Learning: Improves accuracy with each use
- IoT Telemetry: Full performance monitoring and analytics
- Modular Design: Easy to upgrade components and extend functionality
The Future of AURA
Project AURA proved that when you combine the power of AI, the connectivity of IoT, and the passion of a focused team, you can solve problems that were once thought to be too expensive to fix. The journey is just beginning.
Near-Term Goals (6-12 months)
- Complete academic publication & peer review
- Prototype refinement for industrial testing
- Partnership discussions with manufacturing facilities
- Open-source release for educational purposes
Long-Term Vision
- Commercial manufacturing at $30 scale
- Global accessibility in developing countries
- Multi-arm coordination and swarm robotics
- Integration with AR/VR for advanced teleoperation
The Complexity Gap is Closing
We are experiencing a fundamental shift in how automation will be deployed in the next decade. As AI becomes more accessible and manufacturing costs continue to drop, the barriers to industrial-grade robotics will continue to crumble. AURA is not just a product; it's a proof of concept that democratized, accessible automation is not only possible, but inevitable.
We're the ones holding the door open.
Learn More
GitHub Repository: Full source code and documentation coming soon
Research Paper: Complete technical details available upon publication
Contact: Interested in collaboration or implementation? Get in touch!
