Sensors (Basel). 2018 Mar 25;18(4):969. doi: 10.3390/s18040969.

Green Grape Detection and Picking-Point Calculation in a Night-Time Natural Environment Using a Charge-Coupled Device (CCD) Vision Sensor With Artificial Illumination

Free PMC article

Juntao Xiong et al. Sensors (Basel). .

Abstract

Night-time fruit-picking technology is important for picking robots. This paper proposes a night-time detection and picking-point positioning method for green grape-picking robots, addressing the difficulty of detecting and picking green grapes at night under artificial lighting. Taking a representative green grape cultivar, Centennial Seedless, as the research object, daytime and night-time grape images were captured by a custom-designed visual system. Detection proceeded in the following steps: (1) the RGB (red, green and blue) color model was selected for night-time green grape detection through analysis of the color features of grape images under daytime natural light and night-time artificial lighting; the R component of the RGB color model was rotated and the image resolution was compressed. (2) An improved Chan-Vese (C-V) level-set model and morphological processing were used to remove the image background, leaving only the grape fruit. (3) Based on the vertical suspension of grape clusters, and combining the minimum circumscribed rectangle of the fruit with Hough straight-line detection, a straight line was fitted to the fruit stem, and the picking point was calculated on stems whose fitted line deviated from the vertical by less than 15°. Visual detection experiments showed that the accuracy of grape fruit detection was 91.67% and the average running time of the proposed algorithm was 0.46 s. Picking-point calculation experiments showed that the highest accuracy of picking-point calculation was 92.5% and the lowest was 80%. These results demonstrate that the proposed method can provide technical support for green grape-picking robots at night.
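The stem-angle filter in step (3) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the list of candidate segments stands in for the output of a Hough straight-line detection run on the region above the fruit, and the picking point is taken as the midpoint of the first near-vertical segment.

```python
import math

def angle_to_vertical(x1, y1, x2, y2):
    """Angle in degrees between a line segment and the vertical axis."""
    dx, dy = x2 - x1, y2 - y1
    # atan2 of the horizontal over the vertical component gives the
    # deviation of the segment from a vertical line.
    return abs(math.degrees(math.atan2(abs(dx), abs(dy))))

def picking_point(segments, max_angle_deg=15.0):
    """Among candidate stem segments (x1, y1, x2, y2), keep those within
    max_angle_deg of vertical and return the midpoint of the first
    qualifying segment as the picking point, or None if no stem matches."""
    for (x1, y1, x2, y2) in segments:
        if angle_to_vertical(x1, y1, x2, y2) < max_angle_deg:
            return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    return None
```

For example, a nearly vertical segment such as `(100, 10, 105, 60)` deviates from vertical by about 5.7° and yields a picking point at its midpoint, while a near-horizontal segment is rejected.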

Keywords: green grapes; night-time environment; picking-point calculation; vision-sensor detection.

Conflict of interest statement

The authors declare no conflicts of interest.

Figures

Figure 1
The picking robot and its vision-sensing system. (a) Vision-sensing process of the picking robot; (b) structure of the picking robot; (c) grape-picking experiment of the picking robot.
Figure 2
Daytime and night-time grape pictures. (a) Daytime grape cluster; (b) night-time grape cluster.
Figure 3
Color feature analysis of the acquired images. (a) Samples of image blocks; (b) color distribution of daytime red, green and blue (RGB) image; (c) color distribution of night-time RGB image.
Figure 4
Algorithm flow diagram of the proposed method.
Figure 5
Comparison of R component before and after rotation. (a) The original R component image; (b) the R component image after rotation; (c) the original R component histogram; (d) the R component histogram after rotation.
Figure 6
Segmentation with improved Chan–Vese (C–V) level-set model. (a) Segmentation result; (b) binary image.
Figure 7
Histograms of the R component of the night-time images in different conditions. (a,e) Grape clusters near the light-source center; (b,f) grape clusters near the light-source center and on the edge of the image; (c,g) grape clusters out of the light-source center and on the edge of the image; (d,h) no grape cluster.
Figure 8
Detection results of the improved algorithm. (a) Night-time image; (b) initial iteration result; (c) rough contour; (d) further iteration result; (e) binary image; (f) binary image after retaining the largest region and filling holes; (g) binary image after opening and closing operations; (h) grape fruit.
Figure 9
The calculation of the picking point. (a) Horizontal line and vertical line; (b) detection of line angle range; (c) determination of picking point; (d) fruit area; (e) the region of interest (red box) and the Hough detection results; (f) picking-point calculation.
Figure 10
The successful detection results. (a,d) Fruit area; (b,e) the region of interest (red box) and the Hough detection; (c,f) picking-point calculation.
Figure 11
Erroneous results of grape detection and picking-point calculation. (a,d) Fruit area; (b,e) the region of interest (red box) and the Hough detection; (c,f) picking-point calculation.
