Methodological Basis of UAV Use for Weed Detection
Abstract
Purpose. To develop methodological approaches to the use of quadcopters for weed assessment. Methods. Imagery was acquired with a DJI Phantom Vision 2+ and a LadyBug Copper Dot. The LadyBug carried a 12-megapixel S100 NDVI UAV-Kit camera and recorded images in the visible and near-infrared ranges from altitudes of 20 m, 40 m and 60 m. The DJI Phantom Vision 2+ carried a 14-megapixel GoPro camera and recorded images in the visible range from altitudes of 10 m, 15 m, 30 m and 60 m. The photographs were interpreted by supervised classification in the QGIS and TNTmips software. Weeds were counted on 1 m² control plots by the weight method, taking their qualitative (species) composition into account. Results. The best weed recognition during image interpretation was obtained with supervised classification by the maximum likelihood method applied to images taken from altitudes of up to 40 m. To improve weed recognition and separate weeds from cultivated plants in the imagery, it is expedient to use object-oriented analysis. At the sunflower budding stage, about 30% of the weeds were hidden from remote observation, which leads to an automatic underestimation of the number of weeds. Conclusions. Weed infestation of crops can be assessed successfully from UAV data in the visible range of electromagnetic waves acquired at low altitudes (up to 40 m) and interpreted with a supervised classification method. For weed recognition, images in the infrared range offer no advantage over images in the visible range. Additional ground-based weed surveys are needed to estimate the proportion of weeds "hidden" from remote observation.
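As an illustration of the interpretation workflow only (not code from the study, which used the built-in supervised classification tools of QGIS and TNTmips), the sketch below shows how maximum likelihood classification of a multiband UAV orthophoto can be implemented, together with an NDVI calculation for the near-infrared band. The band order, class names and training data are hypothetical placeholders.

```python
# Minimal sketch of supervised maximum-likelihood classification of a UAV image.
# All band orders, class labels and sample data are hypothetical placeholders.
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index from red and near-infrared bands."""
    return (nir - red) / (nir + red + 1e-9)

def fit_gaussian_classes(samples, labels):
    """Estimate a mean vector and covariance matrix for each training class."""
    models = {}
    for c in np.unique(labels):
        x = samples[labels == c]                      # pixels labelled as class c
        mean = x.mean(axis=0)
        cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(x.shape[1])  # regularised
        models[c] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return models

def classify_max_likelihood(image, models):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    h, w, b = image.shape
    pixels = image.reshape(-1, b).astype(float)
    scores = []
    for mean, inv_cov, logdet in models.values():
        d = pixels - mean
        # log-likelihood up to a constant: -0.5 * (log|C| + d^T C^-1 d)
        maha = np.einsum('ij,jk,ik->i', d, inv_cov, d)
        scores.append(-0.5 * (logdet + maha))
    classes = np.array(list(models.keys()))
    best = np.argmax(np.stack(scores, axis=1), axis=1)
    return classes[best].reshape(h, w)

# Hypothetical usage: a 4-band (R, G, B, NIR) orthophoto stand-in and
# hand-digitised training pixels for "crop", "weed" and "soil".
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((100, 100, 4))                 # stand-in for a real orthophoto
    train_pixels = rng.random((300, 4))
    train_labels = np.repeat(np.array(["crop", "weed", "soil"]), 100)
    models = fit_gaussian_classes(train_pixels, train_labels)
    weed_map = classify_max_likelihood(image, models)
    ndvi_map = ndvi(image[..., 0], image[..., 3])     # assumed band order R, G, B, NIR
    print("estimated weed fraction:", float(np.mean(weed_map == "weed")))
    print("mean NDVI:", float(ndvi_map.mean()))
```

The per-class Gaussian model is the same assumption that GIS packages make in their maximum likelihood classifiers; object-oriented analysis, mentioned in the Results, would additionally group pixels into segments before classification.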