Tic Tac GrABB
During my summer internship at Tyson Foods (yes, the chicken company), I contributed to the Tic Tac GrABB (TTG) project. This initiative aimed to program a robot capable of interpreting its environment using an overhead camera and subsequently acting upon that information. Specifically, I programmed the robot to engage in games of tic-tac-toe to demonstrate its ability to interact accurately with its surroundings. Ultimately, Tyson envisions leveraging this technology for sorting raw chicken in factory settings.
Accessory Design
(The original design files have been lost; picture a 3D-printed rail holding a Sharpie against a spring.)
To play tic-tac-toe, the robotic arm needed an end effector that could hold a Sharpie marker. The spring-loaded design provided just enough play and downward pressure for the marker to draw consistent lines.
Computer Vision
A ceiling-mounted camera overlooked the robot's work area. Using computer vision techniques, the system detected and isolated a green tic-tac-toe grid in the camera feed.
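As a rough illustration of how that isolation might work, here is a minimal OpenCV sketch that masks green pixels and keeps the largest contour. The HSV bounds and camera index are placeholder assumptions, not values from the project:

```python
import cv2
import numpy as np

# Hypothetical HSV bounds for the green grid; the project's actual
# calibration values were not published.
GREEN_LOW = np.array([40, 60, 60])
GREEN_HIGH = np.array([85, 255, 255])

def find_grid(frame: np.ndarray):
    """Return the bounding box of the largest green contour, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)
    # Remove speckle noise before looking for contours.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    grid = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(grid)  # (x, y, w, h) in pixels

cap = cv2.VideoCapture(0)  # ceiling camera; device index is an assumption
ok, frame = cap.read()
if ok:
    print(find_grid(frame))
cap.release()
```

Once the grid's bounding box is known, splitting it into nine equal cells gives a crop per board square for the classifier described next.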
Machine Learning
Once the tic-tac-toe grid was identified in the image, a classification model determined whether each grid space was occupied by an X or an O. A reinforcement learning model then interpreted the game state and selected the next move.
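The repository details the actual models. As a rough illustration of the reinforcement learning side, a tabular Q-learning agent can select moves on a 3x3 board like this; the state encoding and hyperparameters are assumptions, not the project's own:

```python
import random
from collections import defaultdict

# Board: tuple of 9 cells, each ' ', 'X', or 'O'.
Q = defaultdict(float)   # (state, action) -> learned value
EPSILON = 0.1            # exploration rate during training

def legal_moves(state):
    return [i for i, cell in enumerate(state) if cell == ' ']

def choose_move(state):
    """Epsilon-greedy move selection over the learned Q-values."""
    moves = legal_moves(state)
    if random.random() < EPSILON:
        return random.choice(moves)
    return max(moves, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state, alpha=0.5, gamma=0.9):
    """One Q-learning backup after observing a transition."""
    future = max((Q[(next_state, a)] for a in legal_moves(next_state)),
                 default=0.0)  # terminal states have no legal moves
    Q[(state, action)] += alpha * (reward + gamma * future - Q[(state, action)])

board = ('X', ' ', 'O',
         ' ', ' ', ' ',
         ' ', ' ', 'X')
print(choose_move(board))  # index 0-8 of the chosen cell
```

The chosen cell index then has to be mapped from image space into the robot's coordinate frame, which is covered in the next section.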
Robot Control
Before the robot was instructed to move, camera coordinates were transformed into the robot's coordinate frame. The coordinates of the move selected by the reinforcement learning model were transmitted to the robot over a socket, and MoveIt planned the motion. The robot then executed the move, following a predefined path to draw either an X or an O.
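Since the table surface is flat, the camera-to-robot mapping can be expressed as a planar homography. Below is a minimal sketch of that transform plus a plain TCP send; the calibration points, host, port, and JSON message schema are all illustrative assumptions, not the project's actual interface:

```python
import json
import socket
import cv2
import numpy as np

# Four reference points as seen by the camera (pixels) and their measured
# positions in the robot's base frame (mm). These are placeholders; real
# calibration would be done on the actual workcell.
CAMERA_PTS = np.float32([[100, 80], [540, 85], [535, 420], [105, 415]])
ROBOT_PTS  = np.float32([[250, -150], [250, 150], [550, 150], [550, -150]])

# Planar homography, valid because the drawing surface is flat.
H = cv2.getPerspectiveTransform(CAMERA_PTS, ROBOT_PTS)

def camera_to_robot(u: float, v: float):
    """Map a pixel coordinate into the robot's base frame."""
    pt = cv2.perspectiveTransform(np.float32([[[u, v]]]), H)
    return float(pt[0, 0, 0]), float(pt[0, 0, 1])

def send_move(x: float, y: float, symbol: str,
              host: str = "192.168.0.10", port: int = 9000):
    """Send the target cell and symbol to the robot-side controller.
    Endpoint and schema are hypothetical, for illustration only."""
    msg = json.dumps({"x": x, "y": y, "symbol": symbol}).encode()
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(msg)

x, y = camera_to_robot(320, 240)   # pixel center of the chosen grid cell
# send_move(x, y, "O")  # would attempt the network call on real hardware
```

On the robot side, a MoveIt-driven node would receive the target pose and step through the predefined waypoints for drawing the symbol.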
Results
The robot successfully played tic-tac-toe against people, reaching a performance level comparable to a second-grader after approximately 500 runs. Occasional bugs still occurred, such as socket connection timeouts or poor image processing that led to misread game states. Ideally, further development would let the system decide on its own when it is safe to make a move, without user intervention. Integrating object tracking could also significantly improve accuracy.
Note: For a comprehensive explanation of the project’s workings, please refer to the GitHub repository.
Involvement
I served as the primary contributor to the project, collaborating closely with Jeremy Gerard, a Tyson employee, to discuss logic and prepare for demonstrations.