Automatic Fruits Classification System Based on Deep Neural Network

Authors

  • Khadija Munir, Department of Information Technology, Hazara University, Mansehra, KPK, Pakistan
  • Arif Iqbal Umar, Department of Information Technology, Hazara University, Mansehra, KPK, Pakistan
  • Waqas Yousaf, Department of Information Technology, Hazara University, Mansehra, KPK, Pakistan

DOI:

https://doi.org/10.24949/njes.v13i1.501

Keywords:

Deep convolutional neural network, agricultural robotics, fruit recognition

Abstract

Fruit classification plays a vital role in robot-based farming, where fruit picking and packing are now increasingly performed by robots. This is only possible when the robots are trained efficiently using machine learning. Various techniques have been developed for fruit classification, but gaps remain in both efficiency and accuracy. In this work, we target classification accuracy. This paper presents an automatic fruit detection tool, based on deep neural networks, that achieves high precision and recall and can benefit farming, cultivation, and robotic farming. The aim is to build an accurate, fast, and reliable fruit detection system, a vital component of an autonomous agricultural robotic platform and a crucial element for fruit yield estimation and automated harvesting. We use ResNet-50 in a transfer-learning setting, with training split sizes ranging from 10% to 80%. Experimental results show that our method is competitive with prior approaches even when trained on only 10% of the data. The proposed approach achieves state-of-the-art results compared to prior work: the F1 score, which considers both precision and recall, improves from 0.838 to 0.894, with an accuracy of 0.995. In addition to improved accuracy, the approach is also considerably faster than recent approaches.
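
As a concrete illustration of the transfer-learning setup described in the abstract, the following sketch fine-tunes an ImageNet-pretrained ResNet-50 on a fruit image dataset using PyTorch/torchvision. Only the ResNet-50 backbone, the transfer-learning strategy, and the variable training fraction (e.g., 10%) come from the abstract; the framework choice, the dataset folder layout, the class count, and all hyperparameters below are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: ResNet-50 transfer learning for fruit classification.
# Framework, dataset layout, and hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from torch.utils.data import DataLoader, random_split

NUM_CLASSES = 4          # assumed number of fruit classes
TRAIN_FRACTION = 0.10    # e.g. the 10% training split mentioned in the abstract

# Standard ImageNet preprocessing expected by a pretrained ResNet-50.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical image folder layout: fruit_dataset/<class_name>/<image>.jpg
dataset = datasets.ImageFolder("fruit_dataset", transform=preprocess)
n_train = int(TRAIN_FRACTION * len(dataset))
train_set, test_set = random_split(dataset, [n_train, len(dataset) - n_train])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Load ResNet-50 pretrained on ImageNet and replace the final classifier layer.
model = models.resnet50(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Brief fine-tuning loop; the epoch count is an arbitrary choice for illustration.
model.train()
for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

In this kind of setup, precision, recall, F1 score, and accuracy would then be computed on the held-out portion of the data to compare against prior approaches.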

Author Biography

Waqas Yousaf, Department of Information Technology, Hazara University, Mansehra, KPK, Pakistan

Currently working as a Visiting Lecturer at Hazara University, Mansehra.

Published

2020-12-01

Section

Engineering Sciences