
Students Final Project Presentation - Robotics Developer Masterclass

The Construct (ROS) • February 7, 2026

Why It Matters

The demo proves that open‑source robotics tools can deliver real‑time, vision‑guided interaction, accelerating prototyping for education and research in human‑robot collaboration.

Key Takeaways

  • Integrated ROS, OpenCV, and MoveIt to build a tic‑tac‑toe‑playing robot arm.
  • Used Gazebo simulation and RViz for testing before real‑world deployment.
  • Fixed camera and board positions simplified perception, but lighting remained problematic.
  • Implemented separate vertical/horizontal line detection and clustering to isolate the grid.
  • Shape detection relied on contour solidity analysis to differentiate Xs from Os.
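The grid‑isolation step can be illustrated with a small sketch: an edge/line pass typically yields many near‑duplicate candidates per physical grid line, and clustering merges them into single positions. This is an illustrative reconstruction, not the presenter's code; the pixel tolerance and candidate values below are assumptions.

```python
def cluster_line_positions(positions, tol=10):
    """Merge nearby 1-D line candidates (e.g. x-coordinates of vertical
    lines from a Sobel/Hough pass) into averaged cluster centres.
    `tol` is an assumed pixel tolerance, not a value from the talk."""
    clusters = []
    for p in sorted(positions):
        # Start a new cluster when the gap to the previous point exceeds tol.
        if clusters and p - clusters[-1][-1] <= tol:
            clusters[-1].append(p)
        else:
            clusters.append([p])
    return [sum(c) / len(c) for c in clusters]

# Noisy detections around the two vertical grid lines of a 3x3 board.
candidates = [98, 100, 103, 201, 199, 200]
centres = cluster_line_positions(candidates)
```

With the hypothetical input above, the six detections collapse into two averaged grid‑line positions, which is what the perception module needs to map marks to cells.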

Summary

The final project presentation of the Robotics Developer Masterclass showcased Emeric Arhant's "tic‑tac‑toe bot," a robotic arm that plays tic‑tac‑toe against a human opponent using computer vision and motion planning. The system combines the ROS framework, OpenCV for perception, MoveIt for trajectory execution, and a Vue.js web interface for live control, with extensive testing in Gazebo simulation and RViz visualization before moving to a real‑world setup in a Spanish lab.

Arhant outlined a four‑module architecture: an order module that issues commands, a perception module that extracts the board state from camera images, a decision module that computes the optimal move, and a trajectory module that drives the arm. The robot draws the grid, places Xs, and reacts to human‑drawn Os, while the web app streams camera feeds and visualizes detected lines and shapes. Key technical hurdles included variable lighting, overlapping grid lines, and mis‑detections when human marks were near the board edges.
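The decision module's job, computing the optimal move from the detected board state, can be sketched with a plain minimax (negamax) search. The talk does not specify which algorithm the presenter used, so this is a generic illustration over a hypothetical 9‑cell board list.

```python
def best_move(board, player):
    """Negamax over a 9-cell tic-tac-toe board (list of 'X', 'O', ' ').
    Returns (score, index): score is +1/0/-1 from `player`'s point of
    view and index is the chosen cell (None at terminal positions)."""
    opponent = 'O' if player == 'X' else 'X'
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
             (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]
    for a, b, c in lines:
        if board[a] == board[b] == board[c] != ' ':
            return (1 if board[a] == player else -1), None
    if ' ' not in board:
        return 0, None  # draw
    best_score, best_index = -2, None
    for i, cell in enumerate(board):
        if cell == ' ':
            board[i] = player                       # try the move
            score = -best_move(board, opponent)[0]  # opponent replies optimally
            board[i] = ' '                          # undo
            if score > best_score:
                best_score, best_index = score, i
    return best_score, best_index

# X already holds cells 0 and 1; the winning move is cell 2.
score, move = best_move(list('XX OO    '), 'X')
```

Exhaustive search is cheap here because tic‑tac‑toe has at most nine plies, so no pruning or heuristics are needed.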

During the live demo, the robot successfully traced the grid and placed its first X, but struggled to recognize a circle placed close to a grid line, illustrating the limits of the contour‑based detection pipeline. Arhant explained that separating vertical and horizontal line detection, applying Sobel filters, and clustering line candidates reduced noise, while solidity analysis of contours helped distinguish Xs from Os.
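The solidity idea can be sketched without OpenCV: solidity is contour area divided by convex‑hull area, so a convex mark such as an O scores near 1.0 while a cross with deep concavities scores much lower. Everything below, including the 0.8 threshold and the sample shapes, is an assumed illustration rather than the pipeline's actual code.

```python
def _area(pts):
    """Shoelace area of a closed polygon given as (x, y) vertices."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2

def _hull(pts):
    """Convex hull via Andrew's monotone chain."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def half(seq):
        h = []
        for p in seq:
            # Pop while the last two points and p fail to make a left turn.
            while len(h) >= 2 and ((h[-1][0] - h[-2][0]) * (p[1] - h[-2][1]) -
                                   (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]
    return half(pts) + half(reversed(pts))

def classify_mark(contour, threshold=0.8):
    """Label a contour 'O' if its solidity (area / convex-hull area) is
    high, else 'X'; the 0.8 threshold is an illustrative guess."""
    solidity = _area(contour) / _area(_hull(contour))
    return 'O' if solidity >= threshold else 'X'

# A square outline is fully convex (solidity 1.0), like an O blob;
# a four-pointed star has deep concavities, like a drawn X.
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
star = [(0, 5), (4, 6), (5, 10), (6, 6), (10, 5), (6, 4), (5, 0), (4, 4)]
```

This also hints at the failure mode seen in the demo: a circle merged with a nearby grid line produces a contour with concavities, dragging its solidity toward the X range.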

The project demonstrates how a modular ROS stack can integrate perception, planning, and actuation for interactive human‑robot games, offering a template for educational robotics and a testbed for advancing real‑time vision‑driven manipulation. It highlights the importance of robust preprocessing and fixed‑environment assumptions when deploying vision systems in variable lighting conditions.

Original Description

Presentations of the Final Projects of the Robotics Developer Masterclass (https://bit.ly/49OBvyn).
These presentations are open to the public and provide an opportunity to learn about the different research topics developed by the students of this program.
The Masterclass Final Project consists of the student undertaking an autonomous project under the direction of a tutor, the completion of which is intended to help the students implement the robotics development skills that they have gained during their studies.
Agenda:
16:00 CET - Emeric Arhant: TicTacToeBot
Tutor(s): Alberto Ezquerro
--
#ros #robotics #robot #ai #ros2