A Chinese research team at Tarim University has developed a lightweight object detection and pose recognition solution for solar panel cleaning robots.
Scientists from Tarim University in China have proposed a way to tackle the challenging problem of pose recognition for photovoltaic panel cleaning robots.
Their new solution is based on a low-power version of the You Only Look Once (YOLO) version 8 model for object detection and computer vision tasks. Other versions of YOLO have been investigated for solar applications, such as defect detection and panel inspection.
The effective use of PV cleaning robots requires not only accurate object detection and pose recognition but also low power consumption, according to the researchers. In this respect, there are machine vision challenges to be tackled: panels have different tilt angles and orientations, there is imaging interference from ambient light, dust and dirt, as well as partial occlusion caused by other panels.
The team proposed a lightweight panel pose recognition model based on the You Only Look Once (YOLO) version 8 nano (YOLOv8n) object detection algorithm. It said that this version represents the “most lightweight variant” within the YOLOv8 machine vision and object detection family, as it prioritizes efficiency and real-time processing to enable deployment on low-power hardware.
The work is detailed in “YOLOv8n-PP: A lightweight pose recognition algorithm for photovoltaic array cleaning robot,” published in the Journal of Real-Time Image Processing.
The researchers used a “diverse and extensive dataset” of photovoltaic panel poses to ensure that their method demonstrates “strong generalization performance” across various environments. The dataset, called P-pose, consisted of PV panel pose images collected from the photovoltaic power plant of Jingke Technology in Alar City, China.
They integrated YOLOv8n with the MobileViT machine vision architecture to create YOLOv8n-Photovoltaic-Pose (YOLOv8n-PP). The scientists said that MobileViT is a lighter version of the self-attention-based Vision Transformer (ViT) designed for mobile applications. ViT was reportedly developed as a transformer-based alternative to conventional convolutional neural networks to achieve faster inference speed.
“This integration helps to reduce the effects of varying target positions from the robot's mobile perspective,” the researchers said. Moreover, they used a bounding box regression loss, known as the MPDIoU loss, to improve the precision and accuracy of PV panel recognition.
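The article does not spell out the loss formulation, but the published MPDIoU (Minimum Point Distance IoU) metric penalizes the squared distances between the top-left and bottom-right corners of the predicted and ground-truth boxes, normalized by the image dimensions. A minimal sketch under that assumption (function names are illustrative, not from the paper):

```python
def mpdiou(box_a, box_b, img_w, img_h):
    """MPDIoU between two boxes given as (x1, y1, x2, y2) corner coordinates."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Standard intersection-over-union
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter)
    # Squared distances between matching top-left and bottom-right corners
    d1 = (bx1 - ax1) ** 2 + (by1 - ay1) ** 2
    d2 = (bx2 - ax2) ** 2 + (by2 - ay2) ** 2
    norm = img_w ** 2 + img_h ** 2  # normalize by squared image diagonal
    return iou - d1 / norm - d2 / norm

def mpdiou_loss(pred, target, img_w, img_h):
    """Loss used for bounding box regression: lower is better, 0 for a perfect match."""
    return 1.0 - mpdiou(pred, target, img_w, img_h)
```

For identical boxes the corner distances vanish and IoU is 1, so the loss is 0; misaligned corners push the score below plain IoU, which is what makes the penalty useful for regression.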
“By combining YOLOv8n, MobileViT and the MPDIoU loss, we propose a method called YOLOv8n-Photovoltaic-Pose (YOLOv8n-PP), which uses the strengths of these components to achieve accurate and efficient pose recognition for photovoltaic panels,” the researchers said.
For training and validation they used a 64-bit Windows 10 computer with an Intel Xeon(R) Silver 4210R CPU and an Nvidia GeForce RTX 3060 Ti GPU. Python 3.8 was the programming language, together with the PyTorch 2.0.0 deep learning framework for network training.
The team carried out a detailed analysis comparing the YOLOv8n-PP method with various other YOLO versions and found that the proposed solution achieved “the best results” across various evaluation metrics. “In particular, the precision and recall of our approach are 3.45% and 5.78% higher compared to the baseline YOLOv8n model,” they said.
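Precision and recall, the two metrics quoted above, have standard definitions in object detection: precision is the fraction of predicted boxes that are correct, recall the fraction of ground-truth objects that were found. A short illustration with made-up detection counts (not figures from the paper):

```python
def precision_recall(tp, fp, fn):
    """Compute precision and recall from true positives, false positives
    and false negatives of a detector on a validation set."""
    precision = tp / (tp + fp)  # correct detections / all detections
    recall = tp / (tp + fn)     # correct detections / all ground-truth objects
    return precision, recall

# Hypothetical run: 90 panels detected correctly, 10 false alarms, 20 panels missed
p, r = precision_recall(90, 10, 20)
print(f"precision={p:.3f} recall={r:.3f}")  # precision=0.900 recall=0.818
```

A detector can trade one metric for the other by changing its confidence threshold, which is why the paper's claimed gains in both metrics at once indicate a genuinely stronger model.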
The method exhibited improvements in both precision and recall, making it an “effective solution for PV pose recognition,” with YOLOv8n-PP not only improving detection accuracy but also stability.
The researchers noted room for improvement in the model's ability to handle extreme occlusion and highly reflective environments.
Future research will focus on implementing YOLOv8n-PP in a PV cleaning robot, conducting field tests, and incorporating additional sensor types, such as infrared imaging, to further improve the model's detection performance.
This content is protected by copyright and may not be reused. If you want to work with us and reuse part of our content, please contact: editors@pv-magazine.com.
