
Projects
UCSC Mechatronics Competition
Our final project in the ECE-118/L Mechatronics course involved designing and building an autonomous robot capable of collecting and dispensing 30 chrome balls, each 1 inch in diameter. The robot operated on a field divided by a 3-inch-tall wall and navigated around an 11×11×11-inch obstacle.
Weight- and Output-Stationary 2D Systolic Array AI Accelerator
This project presents a weight-stationary 8×8 systolic-array accelerator for quantized VGG16 inference, designed for high efficiency on FPGA hardware. Implemented on the Cyclone IV GX, the system integrates quantization-aware training, pruning, and Huffman encoding to minimize power and memory demands while maintaining over 90% accuracy. The RTL design includes modular MAC arrays, custom scratchpad memories, and a comprehensive testbench that verified the design with zero functional errors. By combining structured and unstructured pruning with adaptive compression, the design demonstrates scalable, low-power deep-learning acceleration for embedded, resource-constrained systems.
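The weight-stationary dataflow can be sketched as a minimal behavioral model: each processing element holds one fixed weight while activations stream through and partial sums accumulate down each column. This is an illustrative Python model of the dataflow, not the RTL; the matrix shapes and int32 datatype are assumptions.

```python
import numpy as np

def weight_stationary_matmul(acts, weights):
    """Behavioral model of an NxN weight-stationary systolic array.

    Weights stay fixed in the PE grid; one activation row enters per
    wavefront, and each PE row adds its products to the partial sums
    flowing down the columns.
    """
    out = np.zeros((acts.shape[0], weights.shape[1]), dtype=np.int32)
    for t in range(acts.shape[0]):            # one wavefront per input row
        psum = np.zeros(weights.shape[1], dtype=np.int32)
        for i in range(weights.shape[0]):     # partial sums move down columns
            psum += acts[t, i] * weights[i, :]  # MAC at array row i
        out[t] = psum
    return out
```

The model produces the same result as a plain matrix multiply, which is exactly the property the testbench would check against a golden reference.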
Law-Abiding Batmobile
The Law-Abiding Batmobile is an autonomous robotic vehicle developed with ROS2 to integrate real-time perception, control, and decision-making. It employs AI-driven computer vision on the OAK-D Lite for stop-sign and color detection, alongside LiDAR-based obstacle avoidance for safe navigation. Built from the ground up, the system fuses multiple ROS2 nodes for constant-speed control, arbitration, and visual inference to demonstrate fully autonomous driving at the edge.
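The arbitration logic can be reduced to a simple priority rule: a detected stop sign overrides everything, a close LiDAR obstacle overrides cruising, and otherwise the constant-speed setpoint wins. A minimal sketch of that rule, stripped of the ROS2 plumbing (the specific speeds and distance threshold are illustrative, not the vehicle's actual tuning):

```python
def arbitrate(stop_sign_seen, obstacle_dist_m,
              cruise_speed=0.5, safe_dist_m=0.4):
    """Return the commanded speed (m/s) by fixed priority:
    stop sign > LiDAR obstacle > constant-speed cruise."""
    if stop_sign_seen:            # vision node flagged a stop sign
        return 0.0
    if obstacle_dist_m < safe_dist_m:  # LiDAR reports an obstacle too close
        return 0.0
    return cruise_speed           # default constant-speed command
```

In the real system each input would arrive on its own ROS2 topic and the arbiter node would publish the winning command to the motor controller.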
Autonomous Computer Vision-Based Navigation Robot
This project features a fully 3D-printed autonomous robot that navigates indoor environments using only visual input from a Pi Camera and a custom-trained CNN model deployed on a Raspberry Pi 4. The system performs real-time image classification and sends motion commands to an Arduino over serial communication, enabling end-to-end autonomy without LiDAR or ultrasonic sensors. Designed, fabricated, and coded entirely from scratch, the robot integrates deep learning–based perception with embedded motor control for smooth, adaptive navigation. Achieving over 91% accuracy in real-world testing, it demonstrates how compact, low-cost systems can achieve robust indoor autonomy through intelligent vision and efficient mechatronic design.
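The perception-to-actuation handoff above amounts to mapping the CNN's output to a one-byte command for the Arduino. A minimal sketch of that mapping, with a hypothetical class-to-command table (the actual labels and serial protocol are assumptions):

```python
# Hypothetical command table: forward / left / right / stop.
COMMANDS = {0: b'F', 1: b'L', 2: b'R', 3: b'S'}

def command_for(logits):
    """Map the CNN's output logits to the motor-command byte that
    would be written to the Arduino over the serial link."""
    cls = max(range(len(logits)), key=lambda i: logits[i])  # argmax class
    return COMMANDS[cls]
```

On the robot, the returned byte would be sent with something like pyserial's `Serial.write`, and the Arduino firmware would decode it into motor PWM values.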