Single tree 3D reconstruction based on point cloud fusion of lidar and Kinect camera
Author: 彭孝东, 何静, 时磊, 赵文锋, 兰玉彬

Affiliation:

1. College of Electronic Engineering (College of Artificial Intelligence), South China Agricultural University, Guangzhou 510642, China; 2. National Center for International Collaboration Research on Precision Agricultural Aviation Pesticides Spraying Technology, Guangzhou 510642, China; 3. South China Smart Agriculture Public Research and Development Center of Ministry of Agriculture and Rural Affairs, Guangzhou 510520, China

CLC Number: TP391.4

Abstract:

    3D reconstruction of trees is of great significance in plant phenotyping, digital orchards, and forestry resource planning. The Kinect, a color-depth camera based on active infrared structured light, and lidar are commonly used 3D reconstruction devices. In order to build a better 3D color model of a single cherry tree and obtain accurate phenotypic parameters, a single-tree detection method based on the fusion of Kinect camera and lidar point cloud information is proposed in this paper. Firstly, a complete environmental point cloud of the area containing the single cherry tree was collected with the lidar to generate a point cloud map. Secondly, multi-view point clouds of the single cherry tree were collected with the Kinect camera to obtain a complete 3D color point cloud. Taking the lidar point cloud as the reference position, the two point clouds were coarsely registered by selecting corresponding points with the same name, so that they had a good initial positional relationship, and were then finely registered with the iterative closest point (ICP) algorithm. Finally, the color point cloud was used to color and fuse the lidar point cloud, realizing the 3D reconstruction of the single cherry tree. Compared with the single-tree phenotypic parameters generated by the Kinect v2 camera alone, the average relative errors of plant height, crown width and diameter at breast height of the fused cherry tree point cloud were reduced by 1.52, 6.46 and 18.17 percentage points, respectively. The experimental results show that the Kinect v2 color-depth camera and lidar complement each other in single-tree 3D reconstruction: the fusion improves point cloud registration accuracy, reduces the influence of light and climatic conditions, increases the measurement distance, and yields more accurate single-tree phenotypic parameters. This complementary fusion method for single-tree 3D reconstruction has good application prospects and can be applied to fruit tree phenotyping, growth monitoring and similar tasks, providing technical support for the development of digital orchards.
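    The coarse-to-fine registration described above (corresponding-point alignment followed by ICP refinement) can be illustrated with a short sketch. The abstract does not name an implementation library; the example below assumes Open3D in Python, with hypothetical file names and placeholder corresponding-point indices, so it is a minimal sketch of the approach rather than the authors' actual code.

```python
# Minimal sketch (assumption: Open3D; file names and picked point pairs are hypothetical).
import numpy as np
import open3d as o3d

# Load the lidar point cloud (reference) and the Kinect color point cloud (to be aligned).
lidar = o3d.io.read_point_cloud("lidar_tree.pcd")     # hypothetical path
kinect = o3d.io.read_point_cloud("kinect_tree.pcd")   # hypothetical path

# Coarse registration: corresponding points "with the same name" picked in both clouds.
# Each row is (index in kinect, index in lidar); the indices here are placeholders.
corres = o3d.utility.Vector2iVector(np.array([[10, 42], [205, 318], [511, 760], [890, 1024]]))
init_T = o3d.pipelines.registration.TransformationEstimationPointToPoint().compute_transformation(
    kinect, lidar, corres)

# Fine registration with ICP, starting from the coarse transform.
result = o3d.pipelines.registration.registration_icp(
    kinect, lidar,
    max_correspondence_distance=0.05,   # metres; tuning value is an assumption
    init=init_T,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)
kinect.transform(result.transformation)  # Kinect colors can then be fused onto the lidar cloud
```

    The abstract reports accuracy as average relative error, with improvements stated in percentage points. As a quick illustration of how such figures are obtained (the reference and estimated values below are made-up numbers, not the paper's data):

```python
# Relative error of an estimated phenotypic parameter against a manual reference measurement.
# All numbers below are illustrative only, not the paper's measurements.
def relative_error(measured: float, reference: float) -> float:
    """Relative error in percent: |measured - reference| / reference * 100."""
    return abs(measured - reference) / reference * 100.0

ref_height = 2.80                                   # manually measured plant height, m (example)
err_kinect = relative_error(2.62, ref_height)       # Kinect-only estimate
err_fused = relative_error(2.75, ref_height)        # fused lidar + Kinect estimate
print(f"Kinect only: {err_kinect:.2f}%  fused: {err_fused:.2f}%  "
      f"reduction: {err_kinect - err_fused:.2f} percentage points")
```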
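    The percentage-point comparison above is the same form in which the paper reports its plant height, crown width and diameter-at-breast-height results (reductions of 1.52, 6.46 and 18.17 percentage points, respectively).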

    Fig.1 Experimental scene
    Fig.2 Point cloud fusion flowchart
    Fig.3 Schematic diagram of lidar scanning
    Fig.4 LeGO-LOAM mapping process
    Fig.5 Schematic diagram of sensor placement
    Fig.6 Data collected by the Kinect camera
    Fig.7 Kinect color point cloud background segmentation result
    Fig.8 Partial effect of point cloud registration from different angles
    Fig.9 Overall fusion effect diagram
    Fig.10 Comparison of the effect before and after point cloud smoothing
    Fig.11 Comparison before and after sampling
    Fig.12 Point cloud to be registered
    Fig.13 Comparison of fusion positions
    Fig.14 Point cloud fusion results
    Fig.15 Comparison of different data precisions
    Fig.16 Point cloud preprocessing flowchart
    Fig.17 Schematic diagram of parameter calculation
    Fig.18 Comparison of single Kinect point cloud data and fusion point cloud data
Get Citation

彭孝东, 何静, 时磊, 赵文锋, 兰玉彬. Single tree 3D reconstruction based on point cloud fusion of lidar and Kinect camera[J]. Journal of Huazhong Agricultural University, 2023, 42(2): 224-232.

History
  • Received: May 27, 2022
  • Online: March 31, 2023