CN113468742B - Machine vision-based soil environment accurate monitoring equipment and method - Google Patents

Machine vision-based soil environment accurate monitoring equipment and method

Info

Publication number
CN113468742B
CN113468742B (application CN202110735140.3A)
Authority
CN
China
Prior art keywords
soil
image
value
average
segmentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110735140.3A
Other languages
Chinese (zh)
Other versions
CN113468742A (en)
Inventor
梁忠伟
陈俊武
刘晓初
冯文康
钟剑鸣
龙胜
范立维
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou University
Original Assignee
Guangzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou University filed Critical Guangzhou University
Priority to CN202110735140.3A priority Critical patent/CN113468742B/en
Publication of CN113468742A publication Critical patent/CN113468742A/en
Application granted granted Critical
Publication of CN113468742B publication Critical patent/CN113468742B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F 30/20: Computer-aided design [CAD]; design optimisation, verification or simulation
    • G06T 5/20: Image enhancement or restoration by the use of local operators
    • G06T 5/30: Erosion or dilatation, e.g. thinning
    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/11: Image analysis; region-based segmentation
    • G06T 7/90: Image analysis; determination of colour characteristics
    • G06T 2207/20024: Indexing scheme for image analysis or enhancement; filtering details
    • G06T 2207/30188: Indexing scheme, subject of image; Earth observation; vegetation, agriculture

Abstract

The invention discloses a machine vision-based soil environment accurate monitoring device and method. The monitoring device comprises a soil information acquisition device, a soil information processing device and a communication device. The soil information acquisition device comprises a mobile trolley, and an image acquisition module and an excavating module mounted on the trolley, with the excavating module positioned in front of the image acquisition module along the travelling direction of the trolley. The soil information processing device is used to construct a mathematical model relating soil humidity and soil pH value; after the acquisition device captures a soil image, the processing device processes the image to obtain its average yellowness value and average gray value, which are input into the mathematical model to obtain the soil humidity value and soil pH value corresponding to the image. The method of the invention has high precision and low implementation cost.

Description

Machine vision-based soil environment accurate monitoring equipment and method
Technical Field
The invention relates to soil monitoring equipment, in particular to soil environment accurate monitoring equipment and method based on machine vision.
Background
With the growth of population and the rising standard of living, China places ever higher demands on agricultural production. At the same time, the cultivated area is shrinking, the rural population is ageing, and young people are moving from the countryside to the cities. Against this background, improving cultivation efficiency and realizing automated, unmanned and intelligent farming has become a pressing need. The development of modern agriculture has driven the emergence of many non-traditional agricultural production technologies, such as remote monitoring of ultraviolet radiation, sulfur dioxide and other aspects of the farmland ecological environment using internet technology. Various forms of intelligent agriculture are being developed, traditional agriculture is gradually becoming more intelligent, and new technologies are continuously applied to cultivation, irrigation, fertilization, pest control, harvesting, packaging and other stages of agricultural production, making each link more scientific, efficient and accurate while reducing production cost and the pressure caused by wasted labour. At present, with the rapid development of 5G technology, the ecological construction of the 5G industry is gradually rising and intelligent agriculture is gradually taking shape. In intelligent agricultural production, accurately acquiring the characteristic parameters of the soil environment is an important problem and a precondition for the whole system to make decisions; only when parameters such as soil humidity and pH value are collected accurately can the system carry out irrigation, fertilization and other actions correctly. Existing methods for acquiring soil characteristic parameters include the drilling method, the sensor matrix method and the remote sensing method; however, the drilling method wears the probe severely and is inefficient, the matrix method requires a large number of sensors for large-area cultivation and is costly, and the remote sensing method suffers from poor real-time performance and high cost.
Therefore, it is necessary to design a soil monitoring device or method with low cost, strong real-time performance, high precision and intellectualization.
Disclosure of Invention
The aim of the invention is to overcome the defects of the prior art and provide a machine vision-based soil environment accurate monitoring device, which uses machine vision to realize intelligent soil monitoring and has the advantages of low cost and strong real-time performance.
The second object of the invention is to provide a soil environment accurate monitoring method for the soil environment accurate monitoring equipment based on machine vision.
The technical scheme for solving the technical problems is as follows:
the soil environment accurate monitoring equipment based on machine vision comprises a soil information acquisition device, a soil information processing device and a communication device for uploading data information acquired by the soil information acquisition device to the soil information processing device, wherein,
the soil information acquisition device comprises a mobile trolley, an image acquisition module and an excavating module, wherein the image acquisition module and the excavating module are arranged on the mobile trolley, and the excavating module is positioned in front of the image acquisition module along the advancing direction of the mobile trolley; the image acquisition module comprises a camera, a horizontal rotation mechanism for driving the camera to rotate left and right and a vertical rotation mechanism for driving the camera to rotate up and down; the excavating module comprises an excavating mechanism and an excavating driving mechanism for driving the excavating mechanism to excavate soil;
the soil information processing device processes the acquired image information to obtain the humidity value and the pH value of the soil.
Preferably, the mobile trolley comprises a trolley body, wheels arranged on the trolley body, and a travelling driving mechanism for driving the wheels to rotate, so as to realize the forward movement, backward movement, stopping and steering of the mobile trolley.
A soil environment accurate monitoring method based on machine vision comprises the following steps:
(1) Sampling soil to obtain a plurality of soil samples, and respectively uploading characteristic parameters of the soil samples to a soil information processing device; the uploaded characteristic parameters comprise soil image information, a soil humidity value and a soil pH value;
(2) The soil information processing device processes characteristic parameters in each soil sample and builds a mathematical model of soil humidity and soil pH value, wherein the building steps of the mathematical model comprise:
(2-1) separating soil from a green plant background in the collected soil image to obtain a soil image after green plant removal, and then binarizing the soil image to obtain a soil binary image BW after green plant segmentation;
(2-2) filtering the obtained soil binary image BW;
(2-3) Performing an erosion operation on the filtered soil binary image BW, so that pixel points at the edges of the black regions (green plants and green-plant edges within the soil edge portion) are replaced by black, thereby expanding the removed regions of the binary image;
(2-4) graying the soil image after green planting segmentation, and identifying and removing bad areas in the soil image after graying to obtain a soil binary image BW2 after removing the bad areas;
(2-5) integrating the soil binary image BW after green planting segmentation and the soil binary image BW2 after bad zone segmentation to determine a final segmentation area; obtaining an average gray value in the soil binary image after green planting segmentation and bad region segmentation;
(2-6) calculating average yellowness values in the soil binary images after green planting segmentation and bad region segmentation in a CIELAB color mode;
(2-7) linearly fitting the average gray values in the plurality of soil samples and the measured soil humidity values corresponding to the average gray values by using matlab software to obtain a fitting function between the soil gray values and the soil humidity, and linearly fitting the average yellow values in the plurality of soil samples and the measured soil pH values corresponding to the average yellow values by using the matlab function to obtain a fitting function between the soil yellow values and the soil pH values;
(3) After the mathematical model is built, the mobile trolley drives the image acquisition module and the excavating module to move into the soil environment to be detected, the excavating module excavates the soil, and the image acquisition module acquires the images of the excavated soil;
(4) And uploading the image information acquired by the image acquisition module to a soil information processing device by the communication device, processing the acquired soil image information by the soil information processing device to obtain an average gray value and an average yellow value of the soil image, and substituting the average gray value and the average yellow value into a corresponding fitting function in a mathematical model to obtain a soil humidity value and a soil pH value corresponding to the soil image.
Preferably, in the step (2-1), the collected RGB image is converted into the CIE XYZ image mode, and then converted into the Lab image, and in the CIE Lab mode, the green plant and the soil can be segmented by determining whether the value of the a channel is less than zero.
Preferably, in step (2-2), the soil binary image BW after green plant segmentation is median filtered using a 3×3 convolution kernel to filter out small dots that were binarized but do not correspond to green plants.
Preferably, in step (2-4), the soil image after green plant segmentation is grayed by the average method as follows: the average value of the three channels at each point of the soil image after green plant segmentation is taken as the gray value, i.e.

Gray(x, y) = [R(x, y) + G(x, y) + B(x, y)] / 3

where Gray(x, y) is the gray value at the corresponding point of the image, and R(x, y), G(x, y) and B(x, y) are the values of the three channels at the corresponding point of the original image.
Preferably, in step (2-4), the (x, y) points for which the green-plant-segmented soil binary image has BW(x, y) = 0 are excluded, and only the points with BW(x, y) = 255 are used in the calculation; the average gray value then satisfies

Gray_avg = (1/n) · Σ Gray(x, y)

where the sum is taken over the points with BW(x, y) = 255, Gray(x, y) is the gray value at the corresponding point, and n is the number of points in the binary image whose value equals 255;
the gray value standard deviation satisfies

σ = sqrt( (1/n) · Σ [Gray(x, y) - Gray_avg]² )

where σ is the gray standard deviation, Gray_avg is the average gray value, Gray(x, y) is the gray value at the corresponding point, and n is the number of points in the binary image whose value equals 255;
next, a new mapping image BW2 is created such that

BW2(x, y) = 0,   if |Gray(x, y) - Gray_avg| > K·σ
BW2(x, y) = 255, otherwise

where BW2(x, y) is the value at the corresponding point of the mapping image, σ is the gray standard deviation, Gray_avg is the average gray value, Gray(x, y) is the gray value at the corresponding point, and K is the set threshold;
and finally, carrying out median filtering on the obtained soil binary image BW2 by adopting a convolution kernel of 3 multiplied by 3.
Preferably, in step (2-5), the mapping images BW and BW2 obtained from green plant segmentation and bad region segmentation are combined: a point belongs to the removed region when BW(x, y) = 0 or BW2(x, y) = 0, and to the soil region when BW(x, y) = 255 and BW2(x, y) = 255, which determines the final segmentation; the average gray value after segmentation then satisfies

Gray_avg = (1/n) · Σ Gray(x, y)

where Gray_avg is the average gray value, Gray(x, y) is the gray value at the corresponding point, the sum is taken over the soil region, and n is the number of points satisfying BW(x, y) = 255 and BW2(x, y) = 255.
Preferably, in step (2-6), in the CIE Lab color mode, the average yellowness value of the soil binary image after green plant segmentation and bad region segmentation satisfies

b_avg = (1/n) · Σ b(x, y)

where b_avg is the average yellowness value, b(x, y) is the yellowness (b-channel) value at the corresponding point, the sum is taken over the qualifying points, and n is the number of points satisfying BW(x, y) = 255, BW2(x, y) = 255 and b(x, y) > 0.
Preferably, in step (1), before soil sampling is performed, the soil surface needs to be cleaned to remove branches, leaves, weeds, stones and garbage from the soil surface.
Compared with the prior art, the invention has the following beneficial effects:
1. according to the machine vision-based soil environment accurate monitoring device, the excavating module and the image acquisition module are driven to move through the mobile trolley, in the moving process, the excavating module excavates soil, the image acquisition module acquires images of the excavated soil, acquired image information is uploaded to the soil information processing device (such as a computer) through the communication device, and the soil information processing device analyzes and processes the acquired image information to obtain the humidity value and the pH value of the soil, so that real-time monitoring of the soil environment is realized.
2. The machine vision-based soil environment accurate monitoring equipment provided by the invention realizes intelligent detection on soil monitoring in a machine vision mode, and has the advantages of low cost, strong instantaneity and the like.
3. According to the machine vision-based soil environment accurate monitoring method, a mathematical model is constructed by collecting soil samples, the average gray value and the soil humidity value in a soil image are related through the mathematical model, and the average yellow value and the soil pH value in the soil image are related at the same time. In the actual monitoring, after the image acquisition module acquires the soil image, the soil information processing device only needs to process the soil image to acquire the average gray value and the average yellow value of the soil image, and substitutes the average gray value and the average yellow value into the mathematical model to obtain the humidity value and the pH value corresponding to the soil image. The whole implementation process is simple and convenient, and the finally obtained humidity value and pH value have smaller errors and higher precision.
Drawings
Fig. 1 is a schematic perspective view of a machine vision-based soil environment precision monitoring device.
Fig. 2 is a schematic perspective view of an image acquisition module.
Fig. 3 is a schematic perspective view of an excavating module.
Fig. 4 is a captured image of soil.
Fig. 5 is a binary image of soil after green plant segmentation.
Fig. 6 is a filtered soil binary image.
Fig. 7 is a binary image of the soil after erosion.
Fig. 8 is an image obtained by graying a soil image by an average method.
Fig. 9 is a binary image of the soil after removal of the dead zone.
Fig. 10 is a fitted line graph of gray scale value versus humidity.
FIG. 11 is a fitted line graph of yellowness versus pH.
FIG. 12 shows the format of the function call.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but embodiments of the present invention are not limited thereto.
Example 1
Referring to fig. 1-3, the machine vision-based soil environment precision monitoring apparatus of the present invention includes a soil information collecting device, a soil information processing device, and a communication device for uploading data information collected by the soil information collecting device to the soil information processing device,
referring to fig. 1-3, the soil information acquisition device comprises a mobile trolley 1, an image acquisition module 2 arranged on the mobile trolley 1 and an excavating module 3, wherein the excavating module 3 is positioned in front of the image acquisition module 2 along the travelling direction of the mobile trolley 1; in the motion process, the excavating module 3 excavates soil, the image acquisition module 2 acquires images of the excavated soil, the acquired image information is uploaded to a soil information processing device (such as a computer) through a communication device, and the soil information processing device analyzes and processes the acquired image information to obtain a humidity value and a pH value of the soil, so that real-time monitoring of a soil environment is realized.
Referring to fig. 1 to fig. 3, the mobile trolley 1 comprises a trolley body, wheels arranged on the trolley body, and a travelling driving mechanism for driving the wheels to rotate, so as to realize the forward movement, backward movement, stopping and steering of the mobile trolley 1. The travelling driving mechanism comprises a driving motor and a synchronous belt transmission mechanism; the driving motor transmits power to the wheels through the synchronous belt transmission mechanism, thereby driving the mobile trolley 1 to move forwards, move backwards, accelerate, decelerate and turn left and right. A control module is arranged inside the mobile trolley 1; the control module consists of a single-chip microcomputer and corresponding driver modules, which control the motion state of the mobile trolley 1, the acquisition angle of the image acquisition module 2 and the excavating action of the excavating module 3. In addition, an infrared module is arranged in the mobile trolley 1, so that the mobile trolley 1 can be controlled remotely by infrared equipment.
The specific structure and control manner of the mobile cart 1 in this embodiment may be implemented with reference to an existing intelligent cart, and will not be described in detail herein.
Referring to fig. 1-3, the image acquisition module 2 includes a camera 201, a horizontal rotation mechanism for driving the camera 201 to rotate left and right, and a vertical rotation mechanism for driving the camera 201 to rotate up and down; the vertical rotating mechanism comprises a vertical mounting seat 203 and a vertical motor 202 arranged on the vertical mounting seat 203, wherein the camera 201 is arranged on the vertical mounting seat 203 through a rotating shaft, and a main shaft of the vertical motor 202 is connected with the rotating shaft; the horizontal rotating mechanism comprises a horizontal mounting seat 204 and a horizontal motor 205 arranged on the horizontal mounting seat 204, wherein the horizontal mounting seat 204 is arranged on a vehicle body, and the vertical mounting seat 203 is arranged on a main shaft of the horizontal motor 205 through a rotating shaft 206; the horizontal mount 204 is provided with bearings at positions corresponding to the rotation shafts 206. The horizontal motor 205 drives the vertical mounting seat 203 to horizontally rotate, so that the camera 201 is driven to horizontally rotate, and the vertical motor 202 can drive the camera 201 to vertically rotate, so that the camera 201 can horizontally rotate and vertically rotate, and further shooting of soil is realized at multiple angles.
Referring to fig. 1 to 3, the excavating module 3 includes an excavating mechanism and an excavating driving mechanism for driving the excavating mechanism to excavate soil, wherein the excavating mechanism includes a mechanical arm 302 provided at a lower end of a vehicle body and an excavating shovel 304 provided on the mechanical arm 302; the excavating driving mechanism comprises a first excavating motor 303 for driving the excavating shovel 304 to vertically rotate and a second excavating motor 301 for driving the mechanical arm 302 to rotate, wherein the second excavating motor 301 is installed on a vehicle body, a main shaft of the second excavating motor 301 is connected with one end of the mechanical arm 302, the first excavating motor 303 is installed at the other end of the mechanical arm 302, and the main shaft of the first excavating motor 303 is connected with the excavating shovel 304. The mechanical arm 302 is driven to rotate by the second excavating motor 301, and the excavating shovel 304 is driven to rotate by the first excavating motor 303; the two are matched, so that soil is excavated, and the excavation depth can be controlled.
Referring to fig. 1-3, the soil information processing device is used for processing the data information uploaded by the image acquisition module 2 to obtain the humidity value and the pH value of the soil. The soil information processing device may be a background computer; the details of how the acquired soil image is processed to obtain the pH value and humidity value of the soil are given in the method section below.
Referring to fig. 1-3, the communication device may adopt a wireless communication manner; for example, an infrared module, a 5G module and the like are arranged on the mobile trolley 1, so that data communication is realized between the mobile trolley 1 and a control terminal (such as a background computer or an operator). In this way, the soil image information acquired by the image acquisition module 2 can be uploaded to the background computer, and the background computer or another control terminal can send control instructions to the control module on the mobile trolley 1, thereby controlling the mobile trolley 1, the image acquisition module 2 and the excavating module 3.
Referring to fig. 4-12, the machine vision-based soil environment accurate monitoring method of the invention comprises the following steps:
(1) Sampling soil to obtain a plurality of soil samples, and respectively uploading characteristic parameters of the soil samples to a soil information processing device; the uploaded characteristic parameters comprise soil image information, a soil humidity value and a soil pH value;
(2) The soil information processing device processes characteristic parameters in each soil sample and builds a mathematical model of soil humidity and soil pH value, wherein the building steps of the mathematical model comprise:
(2-1), soil image sample removal plant
In order to avoid interference of green plants with analysis of soil characteristic parameters, it is necessary to separate the soil from the plant background to obtain the desired soil image.
In this embodiment, the CIE Lab removal method is adopted. It identifies green plants well: the central areas of plants are accurately identified, yellowish leaves can also be identified and processed, and the edge regions are identified well. The specific steps are as follows:
the L (luminance) channel in CIE Lab color mode is related to the illumination intensity, while the a (red green) channel is less affected by the illumination intensity. When the a channel is negative, the color is shown to be greenish, so that the part can be judged to be green plant; therefore, the green planting and the soil can be segmented by judging whether the value of the channel a is smaller than zero, namely, the image binarization processing is carried out according to the threshold value K=0. In addition, since the Lab value of the withered and yellow leaf is located between the a-axis and the b+ axis in the a-b coordinate system, that is, yellowish green, segmentation can be also accomplished using this method. When Lab values are located on the a-axis and the b-axis, if the included angle between the connecting line of the point and the origin of coordinates and the a-axis is larger than 45 degrees, that is, the value of the a channel is larger than that of the b channel, the Lab values are displayed as bluish colors instead of colors of general plants, the situation needs to be eliminated, and segmentation is not needed. Since the acquired image is in RGB mode, it is necessary to convert the RGB image into the Lab image first. The RGB image cannot be directly converted into the Lab image, and the RGB image needs to be converted into a CIE XYZ image mode and then converted into the Lab mode;
wherein the X, Y, Z channels and the R, G, B channels satisfy the following relation (standard sRGB/D65 conversion):

X = 0.4124·R + 0.3576·G + 0.1805·B
Y = 0.2126·R + 0.7152·G + 0.0722·B
Z = 0.0193·R + 0.1192·G + 0.9505·B

Converting the XYZ color mode to the Lab mode satisfies the following formulas:

L = 116·f(Y/Yn) - 16
a = 500·[f(X/Xn) - f(Y/Yn)]
b = 200·[f(Y/Yn) - f(Z/Zn)]

with the standard light source (D65) values

Xn = 95.047, Yn = 100.0, Zn = 108.883

and f(x) is a function satisfying

f(x) = x^(1/3),                  if x > (6/29)^3
f(x) = (1/3)·(29/6)^2·x + 4/29,  otherwise

After obtaining the values of L, a and b, binarization is performed according to a:

BW(x, y) = 0,   if a(x, y) < K
BW(x, y) = 255, if a(x, y) ≥ K

where BW(x, y) is the value at the corresponding point of the mapping image, a(x, y) is the a-channel value at the corresponding point of the original image, and K is the threshold with K = 0. The effect of binarization is shown in FIG. 5.
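By way of illustration, this step can be sketched in matlab using the built-in rgb2lab conversion instead of the hand-written formulas; a minimal sketch follows, in which the file name soil_sample.jpg is only an assumed example:

```matlab
% Sketch of step (2-1): green-plant segmentation in CIE Lab (requires the Image Processing Toolbox).
I   = im2double(imread('soil_sample.jpg'));   % RGB image scaled to [0, 1]
lab = rgb2lab(I);                             % built-in RGB -> XYZ -> Lab conversion (D65)
a   = lab(:,:,2);                             % a (red-green) channel
K   = 0;                                      % threshold from the embodiment
BW  = uint8(a >= K) * 255;                    % green plants (a < 0) -> 0, soil -> 255
```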
(2-2) Filtering
After the green plants in the soil image are removed, many small interfering dot regions remain, so the soil image needs to be filtered to reduce the influence of noise while affecting the original image as little as possible.
In this embodiment, a 3×3 convolution kernel is used to apply median filtering to the binary image after green plant removal, filtering out small dots that were binarized but do not correspond to green plants. The obtained image is shown in fig. 6; it can be seen that median filtering removes the dot interference well and prepares the image for the next processing step.
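A minimal matlab sketch of this filtering step, where BW is the 0/255 mask from the step (2-1) sketch:

```matlab
% Sketch of step (2-2): 3x3 median filtering of the green-plant mask to remove isolated dots.
BW = medfilt2(BW, [3 3]);
```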
(2-3), image Corrosion
After the filtered soil binary image is obtained, the green plants in the image and the green-plant pixels at the edges of the soil regions may not be completely removed, so an erosion operation needs to be performed on the image: each pixel at the edge of a black (0) region is replaced by the minimum value in its neighbourhood, i.e. by black (0), thereby expanding the removed regions of the binary image. Although this may also remove some valid soil pixels and reduce the amount of data, the loss is very small, the accuracy of subsequent operations is improved, and the overall effect is better.
Before performing the image erosion operation, a morphological structuring element must be defined. This embodiment uses a "disk" with a radius of 1. Eroding the binary image with this setting gives the image shown in fig. 7; it can be seen that the original green plant regions are expanded slightly, which improves the identification accuracy and avoids edge errors.
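A minimal matlab sketch of the erosion step, using the disk structuring element of radius 1 described above:

```matlab
% Sketch of step (2-3): erosion with a disk of radius 1, expanding the black (removed) regions.
se = strel('disk', 1);      % morphological structuring element from the embodiment
BW = imerode(BW, se);       % shrinks the white (soil) regions, i.e. grows the black regions
```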
(2-4) "bad region" identification and removal
A bad region is a region of the image that seriously interferes with the extraction of the soil characteristic parameters. A real farmland contains many things other than green plants and soil, such as stones, branches and rotted brown leaves, and these "bad regions" need to be separated from the soil image. To do so, the image is converted to a gray image; analysis of the soil gray values shows that their distribution histogram is approximately normal, so most soil pixels can be considered to lie in a certain interval [a, b], while most bad-region pixels lie outside it. Therefore, once the interval [a, b] is determined, pixels whose values lie outside the interval can be separated from those inside it, identifying the bad regions.
Firstly, graying a soil image by an average value method, wherein the concrete process comprises the following steps:
the average method is that the average gray value of three channels is used as the gray value, and the following formula is satisfied:
wherein Gray (x, y) is the Gray value of the corresponding point of the image, and R (x, y), G (x, y) and B (x, y) are the numerical values of three channels of the corresponding point of the original image respectively. The gray value image obtained by this method is as follows in fig. 8:
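A one-line matlab sketch of the average-method graying, where I is the RGB image as a double array in [0, 1] (e.g. from the step (2-1) sketch):

```matlab
% Sketch of step (2-4a): average-method graying, Gray = (R + G + B) / 3, rescaled to 0-255.
Gray = mean(I, 3) * 255;
```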
Next, the bad regions are removed by means of the mean and standard deviation.
Since the image has already been segmented for green plants, these regions (i.e. the regions where green plants exist) must be excluded when calculating the average gray value and the standard deviation; that is, the (x, y) points for which the green-removal binary image has BW(x, y) = 0 are excluded, and only the points with BW(x, y) = 255 are used. The average gray value satisfies

Gray_avg = (1/n) · Σ Gray(x, y)

where Gray_avg is the average gray value, Gray(x, y) is the gray value at the corresponding point, the sum is taken over the points with BW(x, y) = 255, and n is the number of points in the binary image whose value equals 255.

The gray value standard deviation satisfies

σ = sqrt( (1/n) · Σ [Gray(x, y) - Gray_avg]² )

where σ is the gray standard deviation, Gray_avg is the average gray value, Gray(x, y) is the gray value at the corresponding point, and n is the number of points in the binary image whose value equals 255.
For an approximately normal distribution, a threshold K can be set: if the gray value of a pixel falls outside the interval [Gray_avg - K·σ, Gray_avg + K·σ], the corresponding value of the mapping image is 0. A new mapping image BW2 is therefore created, satisfying

BW2(x, y) = 0,   if |Gray(x, y) - Gray_avg| > K·σ
BW2(x, y) = 255, otherwise

where BW2(x, y) is the value at the corresponding point of the mapping image, σ is the gray standard deviation, Gray_avg is the average gray value, Gray(x, y) is the gray value at the corresponding point, and K is a manually set threshold.

A suitable threshold is obtained by adjusting it on different images: K = 3 is selected initially and the threshold is increased or decreased in steps of 0.5; by comparison, the bad-region segmentation effect is best when K = 5. Finally, median filtering with a 3×3 convolution kernel is applied to the obtained binary image, giving the image shown in fig. 9; small stones, dead branches and the like in the image are identified, and the bad-region removal effect is obvious.
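A matlab sketch of the bad-region segmentation under these settings; Gray and BW come from the previous sketches, and K = 5 as selected above:

```matlab
% Sketch of step (2-4b): bad-region removal using the mean and standard deviation of the
% soil gray values (only pixels with BW == 255 are used).
soil  = (BW == 255);                                  % soil mask from step (2-3)
g     = Gray(soil);                                   % gray values of soil pixels only
gAvg  = mean(g);                                      % average gray value
sigma = std(g, 1);                                    % gray standard deviation (1/n form)
K     = 5;                                            % threshold chosen in the embodiment
BW2   = uint8(abs(Gray - gAvg) <= K * sigma) * 255;   % inside the interval -> 255
BW2   = medfilt2(BW2, [3 3]);                         % final 3x3 median filtering
```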
(2-5) calculation of average Gray value
The two mapping images BW and BW2 obtained from green plant segmentation and bad region segmentation are combined: a point belongs to the removed region when BW(x, y) = 0 or BW2(x, y) = 0, and to the soil region when BW(x, y) = 255 and BW2(x, y) = 255; this determines the final segmentation, after which the average gray value of the segmented soil is obtained. The final average gray value satisfies

Gray_avg = (1/n) · Σ Gray(x, y)

where Gray_avg is the average gray value, Gray(x, y) is the gray value at the corresponding point, the sum is taken over the soil region, and n is the number of points satisfying BW(x, y) = 255 and BW2(x, y) = 255.
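A matlab sketch of the mask combination and the final average gray value, continuing from the previous sketches:

```matlab
% Sketch of step (2-5): combine the two masks and compute the average gray value of the soil.
soilFinal = (BW == 255) & (BW2 == 255);   % soil region: kept by both segmentations
grayAvg   = mean(Gray(soilFinal));        % final average gray value
```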
(2-6) calculating the average yellowness value in CIE LAB color mode
In the CIE Lab color mode, the b axis is the yellow-blue axis. Since the soil color is yellowish, its points lie on the positive b side of the a-b coordinate plane, so a relationship between alkalinity and the b value can be established within a certain range. Points on the negative b side (bluish) are regarded as non-alkaline soil, and their pH cannot be calculated from the alkalinity curve. The average of the b values of the points satisfying the condition is then calculated. Since green plants and bad regions have already been removed, the operation only needs to be performed on the previously processed image, i.e. the average yellowness value satisfies

b_avg = (1/n) · Σ b(x, y)

where b_avg is the average yellowness value, b(x, y) is the yellowness (b-channel) value at the corresponding point, the sum is taken over the qualifying points, and n is the number of points satisfying BW(x, y) = 255, BW2(x, y) = 255 and b(x, y) > 0.
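A matlab sketch of the average yellowness calculation; lab and soilFinal come from the previous sketches:

```matlab
% Sketch of step (2-6): average yellowness over the kept soil pixels with b > 0.
b    = lab(:,:,3);              % b (yellow-blue) channel from step (2-1)
keep = soilFinal & (b > 0);     % yellowish soil pixels only
bAvg = mean(b(keep));           % average yellowness value
```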
(2-7) linearly fitting the average gray values in the plurality of soil samples and the measured soil humidity values corresponding to the average gray values by using matlab software to obtain a fitting function between the soil gray values and the soil humidity, and linearly fitting the average yellow values in the plurality of soil samples and the measured soil pH values corresponding to the average yellow values by using the matlab function to obtain a fitting function between the soil yellow values and the soil pH values; thus completing the construction of the mathematical model.
(3) After the mathematical model is built, the mobile trolley 1 drives the image acquisition module 2 and the excavation module 3 to move into the soil environment to be detected, the excavation module 3 excavates the soil, and the image acquisition module 2 acquires the images of the excavated soil.
(4) And uploading the image information acquired by the image acquisition module to a soil information processing device by the communication device, processing the acquired soil image information by the soil information processing device to obtain an average gray value and an average yellow value of the soil image, and substituting the average gray value and the average yellow value into a corresponding fitting function in a mathematical model to obtain a soil humidity value and a soil pH value corresponding to the soil image.
Example 2
The soil environment accurate monitoring method based on machine vision of the invention is discussed by specific cases as follows:
1. First, soil is sampled. The sampled soil comes from the Guangdong region, and the soil environment monitoring equipment used for sampling is a Xiaomi (HHCCJCY01HHCC) flower-and-plant soil detector together with a soil detector. The soil characteristic parameters to be collected are the humidity value and the pH value.
2. Before sampling, the soil surface layer is cleaned and branches, leaves, weeds, stones and garbage are removed from it, which improves the accuracy of soil data acquisition. A rectangular area of about 40 mm × 30 mm is then marked out and labelled. After labelling, a photo of the soil is taken, the soil environment monitoring device is inserted 80 mm into the soil according to the specified procedure, and the readings on the mobile phone APP and on the instrument panel are recorded. The soil surface layer is then excavated, a photo of the deeper soil is taken, and finally the soil environment is restored, completing the sampling process.
3. The soil environment data are the soil humidity value and pH value. The humidity and pH values from the mobile phone APP and the instrument panel are recorded against the sample numbers; part of the soil environment data is listed in Table 1 below.
Table 1 sample data
4. The collected soil image data are transmitted from the STM32 single-chip development board to the computer via serial port communication; the data acquired by the camera are sent to the computer at a baud rate of 9600 bps, so that the collected digital images are obtained on the computer.
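A minimal matlab sketch of receiving such data with the serialport API (R2019b or later); the port name, image size and the simple raw-byte framing are assumptions made for illustration, since the actual STM32 firmware may frame the data differently:

```matlab
% Sketch of the serial transfer described above; framing and sizes are assumed, not specified.
sp  = serialport("COM3", 9600);                            % 9600 bps as in the embodiment
w   = 320; h = 240;                                        % assumed image size
raw = read(sp, w * h * 3, "uint8");                        % read one RGB frame as raw bytes
img = permute(reshape(uint8(raw), 3, w, h), [3 2 1]);      % assumed order: R,G,B per pixel, row by row
imwrite(img, "sample_serial.png");                         % save the received image for processing
```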
5. The collected digital images are respectively calculated to gray values and yellow values of each sample by the machine vision-based soil environment accurate monitoring method, and data are recorded according to sampling numbers. The data recorded in part are as follows in table 2:
table 2 calculation results
Soil numbering Average gray value Average yellowness value b
1 80.809 17.471
2 81.454 18.652
3 81.748 21.156
4 79.865 17.865
5 81.771 19.749
6 80.875 21.383
7 82.376 17.253
6. The gray value of the sample image has a linear relationship with the soil humidity within a certain range. Taking the gray value calculated from the collected discrete data as the independent variable x and the humidity as the dependent variable y, x and y satisfy the following formula:
y=kx+b
A linear fit of the acquired and processed data is carried out in matlab to obtain the fitted line and its slope and intercept, as shown in fig. 10:

This gives slope k = -0.4012 and intercept b = 47.3143, yielding the fitted line

y=-0.4012x+47.3143
7. The yellowness value of the sample image has a linear relationship with the soil pH value within a certain range. Taking the yellowness value calculated from the collected discrete data as the independent variable x and the pH value as the dependent variable y, x and y satisfy the following formula:
y=kx+b
A linear fit of the acquired and processed data is carried out in matlab to obtain the fitted line and its slope and intercept, as shown in fig. 11:

This gives slope k = 0.0885 and intercept b = 5.7461, yielding the fitted line

y=0.0885x+5.7461
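Both fits can be reproduced in matlab with polyfit; in the sketch below, xGray, humidity, xB and pH stand for the columns of Tables 1 and 2 and are only placeholder names:

```matlab
% Sketch of items 6-7: first-order fits; polyfit returns [slope, intercept].
pHum = polyfit(xGray, humidity, 1);   % expected to be close to [-0.4012, 47.3143]
pPH  = polyfit(xB, pH, 1);            % expected to be close to [0.0885, 5.7461]
humidityPred = polyval(pHum, 80.8);   % e.g. predicted humidity for an average gray value of 80.8
```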
8. Function writing
The data processing procedure is written as a function whose input is the acquired sample image and whose outputs are the calculated average gray value, average yellowness value, humidity value and pH value. The format of the function call is shown in fig. 12: a function named "tuxiang" is called and returns the calculated humidity value and pH value.
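A minimal sketch of what such a function could look like, chaining the steps of the method and the fitted lines of figs. 10 and 11; it only illustrates the described procedure under the assumptions above and is not the exact program of the embodiment:

```matlab
function [humidity, pHValue, grayAvg, bAvg] = tuxiang(imageFile)
    % Sketch: soil humidity and pH from one soil image, following steps (2-1) to (2-6).
    I   = im2double(imread(imageFile));
    lab = rgb2lab(I);
    BW  = uint8(lab(:,:,2) >= 0) * 255;                        % green-plant segmentation (2-1)
    BW  = imerode(medfilt2(BW, [3 3]), strel('disk', 1));      % filtering + erosion (2-2, 2-3)
    Gray = mean(I, 3) * 255;                                   % average-method graying (2-4)
    g    = Gray(BW == 255);
    BW2  = uint8(abs(Gray - mean(g)) <= 5 * std(g, 1)) * 255;  % bad-region removal, K = 5
    BW2  = medfilt2(BW2, [3 3]);
    soil = (BW == 255) & (BW2 == 255);                         % combined soil region (2-5)
    grayAvg = mean(Gray(soil));
    b    = lab(:,:,3);
    bAvg = mean(b(soil & (b > 0)));                            % average yellowness (2-6)
    humidity = -0.4012 * grayAvg + 47.3143;                    % fitted line from fig. 10
    pHValue  =  0.0885 * bAvg    + 5.7461;                     % fitted line from fig. 11
end
```

It would be called, for example, as [humidity, pHValue] = tuxiang('soil_sample.jpg').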
9. Reliability test
After the fitted lines are obtained, an image taken by the camera can be input into matlab for processing to obtain the humidity and pH values. The reliability of the fitted lines is verified with the collected test samples. Table 3 below shows the humidity and pH values obtained by processing part of the test samples, the actual humidity and pH values, and their errors:
table 3 relative error calculation
Analysis of 10 sets of validation samples shows that the relative error between the actual and calculated humidity values is 4.5%, and the relative error between the actual and calculated pH values is 1.57%.
The foregoing embodiments are illustrative of the present invention and are not to be construed as limiting it; various modifications, substitutions and variations may be made without departing from the spirit and principles of the invention.

Claims (7)

1. The soil environment accurate monitoring method based on the machine vision is characterized by adopting soil environment accurate monitoring equipment based on the machine vision, wherein the soil environment accurate monitoring equipment comprises a soil information acquisition device, a soil information processing device and a communication device for uploading data information acquired by the soil information acquisition device to the soil information processing device; the soil information acquisition device comprises a mobile trolley, an image acquisition module and an excavating module, wherein the image acquisition module and the excavating module are arranged on the mobile trolley, and the excavating module is positioned in front of the image acquisition module along the advancing direction of the mobile trolley; the image acquisition module comprises a camera, a horizontal rotation mechanism for driving the camera to rotate left and right and a vertical rotation mechanism for driving the camera to rotate up and down; the excavating module comprises an excavating mechanism and an excavating driving mechanism for driving the excavating mechanism to excavate soil; the soil information processing device processes the acquired image information to obtain a humidity value and a pH value of the soil;
the soil environment accurate monitoring method comprises the following steps:
(1) Sampling soil to obtain a plurality of soil samples, and respectively uploading characteristic parameters of the soil samples to a soil information processing device; the uploaded characteristic parameters comprise soil image information, a soil humidity value and a soil pH value;
(2) The soil information processing device processes characteristic parameters in each soil sample and builds a mathematical model of soil humidity and soil pH value, wherein the building steps of the mathematical model comprise:
(2-1) separating soil from a green plant background in the collected soil image to obtain a soil image after green plant removal, and then binarizing the soil image to obtain a soil binary image BW after green plant segmentation;
(2-2) filtering the obtained soil binary image BW;
(2-3) performing an erosion operation on the filtered soil binary image BW, so that pixel points at the edges of the black regions (green plants and green-plant edges within the soil edge portion) are replaced by black, thereby expanding the removed regions of the binary image;
(2-4) graying the soil image after green planting segmentation, and identifying and removing bad areas in the soil image after graying to obtain a soil binary image BW2 after removing the bad areas;
(2-5) integrating the soil binary image BW after green planting segmentation and the soil binary image BW2 after bad zone segmentation to determine a final segmentation area; obtaining an average gray value in the soil binary image after green planting segmentation and bad region segmentation;
(2-6) calculating average yellowness values in the soil binary images after green plant segmentation and bad region segmentation in a CIE LAB color mode;
(2-7) linearly fitting the average gray values in the plurality of soil samples and the measured soil humidity values corresponding to the average gray values by using matlab software to obtain a fitting function between the soil gray values and the soil humidity, and linearly fitting the average yellow values in the plurality of soil samples and the measured soil pH values corresponding to the average yellow values by using the matlab function to obtain a fitting function between the soil yellow values and the soil pH values;
(3) After the mathematical model is built, the mobile trolley drives the image acquisition module and the excavating module to move into the soil environment to be detected, the excavating module excavates the soil, and the image acquisition module acquires the images of the excavated soil;
(4) And uploading the image information acquired by the image acquisition module to a soil information processing device by the communication device, processing the acquired soil image information by the soil information processing device to obtain an average gray value and an average yellow value of the soil image, and substituting the average gray value and the average yellow value into a corresponding fitting function in a mathematical model to obtain a soil humidity value and a soil pH value corresponding to the soil image.
2. The soil environment accurate monitoring method according to claim 1, wherein the mobile trolley comprises a trolley body, wheels arranged on the trolley body, and a travelling driving mechanism for driving the wheels to rotate, so as to realize the forward movement, backward movement, stopping and steering of the mobile trolley.
3. The method for accurately monitoring the soil environment according to claim 2, wherein in the step (2-1), the collected RGB image is converted into a CIE XYZ image mode, and then into a Lab image, and in the CIE Lab mode, the green plant and the soil can be segmented by judging whether the value of the a channel is smaller than zero.
4. The method for accurately monitoring the soil environment according to claim 2, wherein in step (2-2), a 3×3 convolution kernel is adopted to perform median filtering on the soil binary image BW after green plant segmentation, so as to filter out small dots that were binarized but do not correspond to green plants.
5. The method for accurately monitoring the soil environment according to claim 2, wherein in step (2-4), the soil image after green plant segmentation is grayed by the average method as follows: the average value of the three channels at each point of the soil image after green plant segmentation is taken as the gray value, i.e.

Gray(x, y) = [R(x, y) + G(x, y) + B(x, y)] / 3

where Gray(x, y) is the gray value at the corresponding point of the image, and R(x, y), G(x, y) and B(x, y) are the values of the three channels at the corresponding point of the original image.
6. The method of claim 2, wherein in step (2-6), in the CIE Lab color mode, the average yellowness value of the soil binary image after green plant segmentation and bad region segmentation satisfies

b_avg = (1/n) · Σ b(x, y)

where b_avg is the average yellowness value, b(x, y) is the yellowness (b-channel) value at the corresponding point, and n is the number of points satisfying BW(x, y) = 255, BW2(x, y) = 255 and b(x, y) > 0.
7. The method for accurately monitoring the soil environment according to claim 2, wherein in step (1), before soil sampling, the soil surface needs to be cleaned to remove branches, leaves, weeds, stones and garbage from the soil surface.
CN202110735140.3A 2021-06-30 2021-06-30 Machine vision-based soil environment accurate monitoring equipment and method Active CN113468742B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110735140.3A CN113468742B (en) 2021-06-30 2021-06-30 Machine vision-based soil environment accurate monitoring equipment and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110735140.3A CN113468742B (en) 2021-06-30 2021-06-30 Machine vision-based soil environment accurate monitoring equipment and method

Publications (2)

Publication Number Publication Date
CN113468742A CN113468742A (en) 2021-10-01
CN113468742B true CN113468742B (en) 2023-08-29

Family

ID=77874369

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110735140.3A Active CN113468742B (en) 2021-06-30 2021-06-30 Machine vision-based soil environment accurate monitoring equipment and method

Country Status (1)

Country Link
CN (1) CN113468742B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108537851A (en) * 2018-03-29 2018-09-14 湖南农业大学 A kind of detection method of soil moisture and its application
CN109297963A (en) * 2018-10-12 2019-02-01 湖南农业大学 Soil image acquisition equipment, soil water-containing amount detection systems and detection method
CN110264459A (en) * 2019-06-24 2019-09-20 江苏开放大学(江苏城市职业学院) A kind of interstices of soil characteristics information extraction method
CN111580490A (en) * 2020-06-30 2020-08-25 常州市摩锐可能源装备技术开发有限公司 Intelligent self-control agricultural and forestry crop monitoring system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on determining soil water content based on an image gray value model; Luo Dongcheng; Zhang Licheng; Liao Jiancheng; Ma Xiaoyue; Xie Shihao; Hu Deyong; Shandong Agricultural Sciences, No. 07; full text *

Also Published As

Publication number Publication date
CN113468742A (en) 2021-10-01

Similar Documents

Publication Publication Date Title
Hall et al. Optical remote sensing applications in viticulture‐a review
CN102663397B (en) Automatic detection method of wheat seedling emergence
CN110765916B (en) Farmland seedling ridge identification method and system based on semantics and example segmentation
CN107748886B (en) Track type modern standardized orchard information sensing system based on depth camera
KR20150000435A (en) Recongnition of Plant Growth Steps and Environmental Monitoring System and Method thereof
CN103454285A (en) Transmission chain quality detection system based on machine vision
CN111727457B (en) Cotton crop row detection method and device based on computer vision and storage medium
CN102915620B (en) Geologic environment disaster monitoring method
CN104601956A (en) Power transmission line online monitoring system and method based on fixed-wing unmanned aerial vehicle
CN104239899A (en) Electric transmission line spacer identification method for unmanned aerial vehicle inspection
CN114239756B (en) Insect pest detection method and system
CN111898494B (en) Mining disturbed land boundary identification method
CN113468742B (en) Machine vision-based soil environment accurate monitoring equipment and method
CN107977531A (en) A kind of method that ground resistance hard measurement is carried out based on image procossing and field mathematical model
CN116897668B (en) Electric-drive crop sowing and fertilizing control method and system
AU2021101780A4 (en) Aboveground Biomass Estimation and Scale Conversion for Mean Regional Spectral Units
CN113063375B (en) Unmanned aerial vehicle remote sensing extraction method for linear farming ridges
CN102288776B (en) Corn plant growth rate measuring method
CN114120218A (en) River course floater monitoring method based on edge calculation
CN103593840A (en) Method for detecting phenotype of Arabidopsis
CN105427279A (en) Grassland drought status monitoring system based on and machine vision and Internet of things, grassland drought status monitoring method
KR20180096966A (en) Automatic Counting Method of Rice Plant by Centroid of Closed Rice Plant Contour Image
CN115330868A (en) Grape picking method based on deep learning and depth information fusion
CN115372281A (en) Monitoring system and method for physical structure and chemical composition of soil
CN115294472A (en) Fruit yield estimation method, model training method, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant