Apples are among the most important agricultural products worldwide. Ensuring apple quality with minimal effort is crucial for both large-scale and local producers. In Mexico, the manual detection of damaged apples has led to inconsistencies in product quality, a problem that can be addressed by integrating vision systems with machine learning algorithms. The YOLO (You Only Look Once) neural network has significantly improved fruit detection through image processing and has automated several related tasks. However, training and deploying YOLO models typically require substantial computational resources, making it essential to develop lightweight, cost-effective detection systems, especially for edge computing platforms. This paper presents a mechatronic system designed to detect apple varieties and potential damage in apples (Malus domestica) within the visible spectrum. The cultivated apple varieties considered were Gala, Golden, Granny Smith, and Red Delicious. Our contribution lies in developing a lightweight neural network architecture optimized specifically for embedded systems. The proposed architecture was compared against YOLOv3-Tiny, YOLOv4-Tiny, and YOLOv5-s. Our optimized model achieved high accuracy and sensitivity (94–99%) and was successfully deployed on a Jetson Xavier NX board, where it reached a processing speed of 37 FPS.