CN108107887B - Farmland environment sensing method for agricultural machinery navigation - Google Patents


Info

Publication number
CN108107887B
CN108107887B
Authority
CN
China
Prior art keywords
target
radar
camera
distance
agricultural machinery
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711267193.7A
Other languages
Chinese (zh)
Other versions
CN108107887A (en
Inventor
张瑞宏
奚小波
金亦富
张剑峰
单翔
蔡广林
孙福华
叶伟伟
史扬杰
马国梁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Woyang Machinery Technology Co ltd
Original Assignee
Nanjing Woyang Machinery Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Woyang Machinery Technology Co ltd filed Critical Nanjing Woyang Machinery Technology Co ltd
Publication of CN108107887A publication Critical patent/CN108107887A/en
Application granted granted Critical
Publication of CN108107887B publication Critical patent/CN108107887B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00Prospecting or detecting by optical means
    • G01V8/005Prospecting or detecting by optical means operating with millimetre waves, e.g. measuring the black body radiation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00Prospecting or detecting by optical means
    • G01V8/10Detecting, e.g. by using light barriers
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means


Abstract

The invention provides a farmland environment sensing method for agricultural machinery navigation in the technical field of agricultural machinery path control, which specifically comprises the following steps. Step 1: before the agricultural machine works, calibrate the camera and fuse the radar and visual information in space. Step 2: while the agricultural machine works, distance detection device I detects the height change between the radar and the ground, distance detection device II detects the height change between the camera and the ground, and the coordinate conversion between the radar and the camera is adjusted in real time, so that the radar and the camera remain synchronized in space. Step 3: the industrial personal computer resolves the received millimeter wave radar data, determines the effective targets, determines the most dangerous target, and the camera synchronously acquires images. Step 4: the motion state of the most dangerous target is judged from the radar information, the type of the most dangerous target is judged from the radar data and the images of the most dangerous target collected by the camera, and the navigation box controls the agricultural machine to act accordingly. The method has high data-fusion precision and improves the accuracy of obstacle identification.

Description

Farmland environment sensing method for agricultural machinery navigation
Technical Field
The invention relates to automatic path planning and control of agricultural machinery, and in particular to a farmland environment sensing method for agricultural machinery navigation.
Background
Precision agriculture is regarded as the frontier of agricultural science and technology development in the 21st century and is one of the most technology-intensive and most comprehensively integrated modern agricultural production management technologies. Precision agriculture implements a set of modern farming operations and management in a positioned, timed and quantified manner according to spatial variability; it is a new agricultural technology that integrates information technology with agricultural production.
The rapid development and application of precision agriculture can fully tap the maximum production potential of farmland, use water and fertilizer resources rationally, reduce environmental pollution, and greatly improve the yield and quality of agricultural products.
Developing precision agriculture is an effective way to address the problems China faces in moving from traditional to modern agriculture, such as guaranteeing the total output of agricultural products, adjusting the agricultural industrial structure, improving product quality, severely insufficient resources and low resource utilization, and environmental pollution; it is a necessary path for the modernization, transformation and upgrading of Chinese agriculture.
Satellite navigation is one of the basic components of precision agriculture: it enables the agricultural machine to travel automatically, and after parameters are set before operation, the navigation system guides the machine into an automatic operation mode to begin straight-line farming. During automatic navigation, however, the farmland environment is harsh and complex; a large field may contain telegraph poles, ridges, earth mounds, livestock, and workers who may appear at any time, and these factors pose new challenges to unmanned agricultural machinery. In the prior art, satellite navigation allows the agricultural machine to travel automatically in the field, but the machine cannot accurately identify obstacles ahead of it, that is, it cannot sense the farmland environment, let alone automatically stop, wait, or take other actions according to the sensed environment. Therefore a farmland environment sensing method is urgently needed, so that unmanned agricultural machinery can sense its surroundings and, whenever telegraph poles, ridges, mounds, livestock, workers and the like are present in the field, promptly take emergency measures such as stopping and waiting.
Disclosure of Invention
Aiming at the defects in the prior art, the invention solves the technical problem that unmanned agricultural machinery cannot sense the farmland environment, and provides a farmland environment sensing method for agricultural machinery navigation.
The purpose of the invention is realized as follows: a farmland environment sensing method for agricultural machinery navigation specifically comprises the following steps,
step 1: before the agricultural machinery works, calibrate the camera, carry out the spatial coordinate transformation of the camera, and then perform joint radar-vision calibration so as to fuse the radar and vision information in space;
step 2: when the agricultural machinery works, distance detection device I detects in real time the height change Δh_st between the radar and the ground, and distance detection device II detects in real time the height change Δh_ct between the camera and the ground; the industrial personal computer processes the data and adjusts the radar-camera coordinate conversion relationship in real time, so that the radar and the camera remain synchronized in space under operating conditions;
and step 3: the industrial personal computer resolves the received millimeter wave radar data, determines the effective targets within the radar's region of interest in front of the agricultural machine, determines the most dangerous target, and the camera synchronously collects images;
and 4, step 4: judging the motion state of the most dangerous target according to the information of the radar, judging the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera, transmitting an action instruction to a navigation box by an industrial personal computer, and controlling the agricultural machinery to do corresponding action by the navigation box;
wherein, when the agricultural machine works, it travels at a uniform speed;
the first distance detection device is mounted on the front side of the agricultural machine directly below the radar, and the second distance detection device is mounted on the lower side of the agricultural machine directly below the camera. Each distance detection device comprises a guide sleeve and a guide rod: the guide sleeve, which has a containing cavity, is mounted on the agricultural machine; a distance sensor facing the guide rod is connected to the inner wall of the top of the guide sleeve; the guide rod can slide in the guide sleeve; a limiting plate that restricts the guide rod to move within the containing cavity is detachably connected to the upper side of the guide rod; and a universal roller that can roll on the ground is fitted to the bottom of the guide rod;
and the industrial personal computer receives the data signal sent by the distance sensor and performs data processing.
In order to achieve a preliminary synchronization of the radar and the camera in space, the step 1 of converting the vehicle coordinates into image pixel coordinates specifically comprises the steps of,
step 1.1: before the agricultural machine works, the ground is assumed to be level. The millimeter wave radar is fixedly mounted on the front side of the agricultural machine on its longitudinal central axis, with the radar emitting surface facing outward so that the emitting surface is perpendicular to the ground; the camera is mounted with its optical axis parallel to the ground;
step 1.2: establish a radar coordinate system O_0-X_0Y_0Z_0 with the radar center as origin: the plane of the millimeter wave radar is determined by the X_0 and Y_0 axes and is perpendicular to the Z_0 axis, and the Z_0 axis is parallel to the ground and coincides with the central axis of the agricultural machine. Establish a camera coordinate system Oc-XcYcZc with the camera center as origin Oc: the plane XcOcYc is parallel to the imaging plane of the camera, and the Zc axis is the camera's viewing optical axis, perpendicular to the imaging plane. Establish a vehicle coordinate system Ow-XwYwZw, where Ow is the intersection of the center of the agricultural machine's rear axle with the central axis of the vehicle: the Xw axis points horizontally to the right, perpendicular to the longitudinal central axis of the agricultural machine; the Zw axis points horizontally forward, coinciding with the central axis of the agricultural machine; and the Yw axis is perpendicular to the ground and points upward. The X_0O_0Z_0 plane of the radar coordinate system is parallel to the XwOwZw plane of the vehicle coordinate system;
step 1.3: the point where the optical axis intersects the imaging plane is the principal point O′ of the image. For an arbitrary point P with vehicle coordinates (x_w, y_w, z_w, 1)^T, the camera coordinates (x_c, y_c, z_c, 1)^T are obtained by conversion through the rotation matrix R and the translation vector s_c; the specific conversion relationship is as follows,
$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & s_c \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad (1\text{-}1)$$
in the formula (1-1), R is a 3×3 orthogonal rotation matrix; s_c is the 3×1 translation vector from the vehicle coordinate system to the camera coordinate system in the initial condition, whose components are the distance between the central axis of the camera and the central axis of the vehicle, the height of the camera from the ground in the initial condition, and the distance between the camera and the rear axle of the agricultural machine;
step 1.4: convert the camera coordinates (x_c, y_c, z_c, 1)^T to image physical coordinates (x_1, y_1)^T; the specific conversion relationship is as follows,
$$x_1 = \frac{f\,x_c}{z_c}, \qquad y_1 = \frac{f\,y_c}{z_c} \qquad (1\text{-}2)$$
in the formula (1-2), f is the focal length of the camera, and the focal length unit is mm;
step 1.5: convert the image physical coordinates (x_1, y_1)^T to image pixel coordinates (u, v); the specific conversion relation is as follows:
$$u = \frac{x_1}{dx} + u_0, \qquad v = \frac{y_1}{dy} + v_0 \qquad (1\text{-}3)$$
where dx and dy denote the physical size of one pixel along the horizontal and vertical axes, and u_0, v_0 are respectively the horizontal and vertical coordinates, in pixels, of the intersection of the camera optical axis with the imaging plane in the image pixel coordinate system;
step 1.6: the conversion formula from the image pixel coordinate system to the vehicle coordinate system is obtained according to the above formulas (1-1) to (1-3), specifically,
$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f/dx & 0 & u_0 \\ 0 & f/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & s_c \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad (1\text{-}4)$$
step 1.7: in order to spatially fuse the radar and visual information, the coordinate transformation relationship in step 1.6 is updated to,
$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f/dx & 0 & u_0 \\ 0 & f/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & s \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad (1\text{-}5)$$
wherein s = s_c + s_0; s_0 is set as (x_s0, y_s0, z_s0) with x_s0 = 0, y_s0 the height of the radar from the ground in the initial condition, and z_s0 the distance between the radar and the rear axle of the agricultural machine.
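As a concrete illustration of the conversion chain in steps 1.3 to 1.7 (vehicle frame to camera frame, perspective projection, then pixel coordinates), the following minimal Python sketch applies the standard pinhole model. All names here (`vehicle_to_pixel`, `R`, `s`, `f`, `dx`, `dy`, `u0`, `v0`) are illustrative choices, not taken from the patent.

```python
import numpy as np

def vehicle_to_pixel(p_w, R, s, f, dx, dy, u0, v0):
    """Project a point from vehicle coordinates to image pixel coordinates.

    p_w    -- (x_w, y_w, z_w) point in the vehicle frame, metres
    R, s   -- rotation matrix and translation vector into the camera frame
              (s plays the role of s = s_c + s_0 in the patent's notation)
    f      -- focal length in mm; dx, dy -- pixel size in mm per pixel
    u0, v0 -- principal point in pixels
    """
    p_c = R @ np.asarray(p_w, dtype=float) + np.asarray(s, dtype=float)
    x1 = f * p_c[0] / p_c[2]          # image physical coordinates, mm
    y1 = f * p_c[1] / p_c[2]
    u = x1 / dx + u0                  # image pixel coordinates
    v = y1 / dy + v0
    return u, v
```

With R the identity and s zero, a point 10 m ahead and 1 m to the right, imaged through an 8 mm lens onto 0.01 mm pixels, lands 80 pixels to the right of the principal point.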
In order to improve the fusion precision of the radar and the camera during agricultural machinery operation, the data processing and real-time adjustment of the radar-camera coordinate conversion by the industrial personal computer in step 2 specifically adjust the translation vector s in real time according to the actual road conditions of the agricultural machine. Under scanning period t the adjusted translation vector is s_t = s_c + s_0 + Δs_t, which gives the real-time transformation relationship between the vehicle coordinates and the image pixel coordinates, specifically,
$$s_t = s_c + s_0 + \Delta s_t \qquad (2\text{-}1)$$

$$z_c \begin{bmatrix} u_t \\ v_t \\ 1 \end{bmatrix} = \begin{bmatrix} f/dx & 0 & u_0 \\ 0 & f/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & s_t \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad (2\text{-}2)$$
wherein Δh_ct is the change in camera-to-ground height in scanning period t, Δh_st is the change in radar-to-ground height in scanning period t, j is the scanning period number, and (u_t, v_t) are the image pixel coordinates recalculated in real time in scanning period t during agricultural machinery operation.
In order to further improve the accuracy of the radar-to-ground and camera-to-ground height variation values, the steps for obtaining the height variation terms used in the real-time adjusted translation vector in step 2 are as follows,
step 2.1: calculate in real time the height variation between the radar and the ground and the height variation between the camera and the ground, specifically,
suppose the change in radar-to-ground height between sampling times i-1 and i in scanning period t is Δh_sti; the height variation Δh_st of the radar relative to the ground in scanning period t is calculated by averaging,
$$\Delta h_{st} = \frac{1}{k}\sum_{i=1}^{k} \Delta h_{sti} \qquad (2\text{-}3)$$
Suppose the change in camera-to-ground height between sampling times i-1 and i in scanning period t is Δh_cti; the height variation Δh_ct between the camera and the ground in scanning period t is calculated by averaging,
$$\Delta h_{ct} = \frac{1}{k}\sum_{i=1}^{k} \Delta h_{cti} \qquad (2\text{-}4)$$
Step 2.2: calculating a translation vector s after self-adaptive adjustment under a scanning period t in real timetSpecifically, the method comprises the following steps of,
$$s_t = s_c + s_0 + \big(0,\ \Delta h_{ct} + \Delta h_{st},\ 0\big)^T \qquad (2\text{-}5)$$
where k is the total number of sample points in one scan period.
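The per-period averaging of step 2.1 and the translation-vector update of step 2.2 can be sketched as follows. The (x, y, z) vector layout with y as the height axis, and the assumption that each height change adjusts only the y component, are plausible readings of the text rather than details stated in the patent.

```python
def adjusted_translation(s_c, s_0, radar_dh, camera_dh):
    """Average the sampled height changes over one scanning period and fold
    them into the translation vector s_t = s_c + s_0 + delta_s_t.

    s_c, s_0  -- (x, y, z) translation components from the static calibration
    radar_dh  -- per-sample radar-to-ground height changes (the dh_sti values)
    camera_dh -- per-sample camera-to-ground height changes (the dh_cti values)
    """
    k = len(radar_dh)                  # sampling points in one scanning period
    dh_st = sum(radar_dh) / k          # mean radar height change
    dh_ct = sum(camera_dh) / k         # mean camera height change
    s_t = (s_c[0] + s_0[0],
           s_c[1] + s_0[1] + dh_ct + dh_st,   # height axis absorbs both changes
           s_c[2] + s_0[2])
    return dh_st, dh_ct, s_t
```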
In order to further improve the accuracy of the resolved radar data, the determining of the effective target by the resolved radar data in the step 3 specifically includes the following steps,
step 3.1: resolve the data received by the radar according to the millimeter wave radar protocol to obtain the angle α, distance r and relative speed v_rel of each object ahead with respect to the radar, together with its reflection intensity, and allocate a unique ID to each target;
and 3.2, filtering the random noise signal to ensure the continuous validity of the radar data, specifically defining z = [ r, α, v ] toRelative to each other]TZ (k) is a measurement value of the kth output of the millimeter wave radar,
$$d = \sqrt{[z(k)-z(k-1)]^T S^{-1} [z(k)-z(k-1)]} \le r_s \qquad (3\text{-}1)$$
and filter out data signals that do not satisfy formula (3-1); here d is the weighted Euclidean distance between adjacent measurement vectors z(k) and z(k-1), S is the weighting matrix, and r_s is a set threshold;
step 3.3: judge whether the target is in the lane in which the agricultural machine travels: when di ≤ ds, the target is in the travel lane; otherwise it is not. Targets in the travel lane are preliminarily selected as effective targets and are sorted and numbered from near to far; targets outside the travel lane are non-dangerous targets and are excluded. Here ds is the safety distance threshold, ds = L/2 + ks; di is the distance between the target measured at sampling point i and the Z_0 axis; L is the width of the plough hung on the agricultural machine; and ks is a set safety margin;
as illustrated in FIG. 5, the lateral distances of the two obstacles B and C from the center of the agricultural machine exceed ds, so they lie outside the travel lane of the agricultural machine; the lateral distances of the two obstacles A and D from the center of the agricultural machine are less than ds, so they lie in the travel lane, and A and D are preliminarily selected as effective targets;
FIG. 6 shows obstacle E in the travel lane: the distance of obstacle E from the agricultural machine center O is less than L/2 + ks, so obstacle E is in the travel lane of the agricultural machine;
step 3.4: carrying out validity check on the initially selected valid target, and finally determining the valid target;
step 3.5: from the determined effective targets, take the nearest obstacle obtained by the millimeter wave radar as the candidate most dangerous target: if dj ≤ dmin, where dj is the distance between the agricultural machine and the effective target with ID j obtained by the millimeter wave radar, and dmin is the distance to the nearest effective target obtained in one scanning period of the millimeter wave radar, then the effective target with ID j is the most dangerous target;
in the design, random noise signals generated by interference are filtered out, improving the accuracy of the resolved radar data. By judging the travel lane of the agricultural machine, obstacle targets outside the lane are excluded, obstacles in the same lane are preliminarily selected as effective targets, and the preliminary selection is then checked to finally determine the effective targets, improving the accuracy of effective target identification; the most dangerous target is determined by ordering the effective targets from near to far.
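Steps 3.1 to 3.5 amount to a gate on measurement jumps followed by a lane filter and a near-to-far ordering. The sketch below assumes a diagonal weighting matrix and tuple-based targets; all names are illustrative, not from the patent.

```python
import math

def passes_noise_gate(z_k, z_k1, s_inv_diag, r_s):
    """Formula (3-1) with a diagonal weighting matrix: keep a measurement only
    if its weighted Euclidean distance to the previous one is within r_s."""
    d = math.sqrt(sum(w * (a - b) ** 2
                      for a, b, w in zip(z_k, z_k1, s_inv_diag)))
    return d <= r_s

def effective_targets(targets, L, ks):
    """Steps 3.3/3.5: keep targets whose lateral offset from the Z_0 axis is
    within ds = L/2 + ks, order them near to far, and return the nearest one
    as the most dangerous target.  targets: (target_id, lateral, range_r)."""
    ds = L / 2 + ks
    in_lane = sorted((t for t in targets if abs(t[1]) <= ds),
                     key=lambda t: t[2])
    return in_lane, (in_lane[0] if in_lane else None)
```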
In order to further improve the accuracy of the determination of the valid target, the validity check of the initially selected valid target in step 3.4 specifically includes the following steps,
step 3.4.1: predict the state of each preliminarily selected effective target. Take the state vector S_n = [d_n, v_n, a_n]; the state prediction equation of the preliminarily selected effective target is,
$$\hat S_{n+1} = \begin{bmatrix} d_n + v_n t + \tfrac{1}{2} a_n t^2 \\ v_n + a_n t \\ a_n \end{bmatrix} \qquad (3\text{-}2)$$
wherein Ŝ_{n+1} is the state information of the effective obstacle target predicted from the previous scanning period, and d_n, v_n, a_n respectively denote the relative distance, relative speed and relative acceleration of the effective obstacle target measured in the n-th detection period of the millimeter wave radar; t is the scanning period of the millimeter wave radar;
step 3.4.2: compare the predicted state information of the (n+1)-th period effective target with the state information of the (n+1)-th period effective target actually measured by the radar, specifically as follows,
$$|d_{n+1} - \hat d_{n+1}| \le d_0, \qquad |v_{n+1} - \hat v_{n+1}| \le v_0, \qquad |a_{n+1} - \hat a_{n+1}| \le a_0 \qquad (3\text{-}3)$$
wherein d_0, v_0 and a_0 are the set error thresholds between the measured and predicted values of the effective obstacle target;
step 3.4.3: if the effective obstacle target is continuously detected for more than m radar scanning periods, and the effective target satisfying formula (3-3) in step 3.4.2 is consistent with the preliminarily selected effective target, update the target's relative distance, relative speed, relative angle and number information. Otherwise the preliminarily selected effective target is no longer among the targets detected by the millimeter wave radar; track it using the predicted target information, and if it is still not detected in the next radar scanning period, stop using the corresponding preliminary target information, update the effective target information, and return to step 3.4.1 to loop;
in the design, the state information of the effective target predicted from the previous scan is compared with the measured effective target to judge whether the information is consistent, which further eliminates false targets and further guarantees the determination of effective targets.
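The validity check of steps 3.4.1 to 3.4.3 can be sketched with a constant-acceleration prediction; this is one plausible reading of the state prediction equation, since the exact model is not recoverable from the text, so treat the update rule as an assumption.

```python
def predict_state(d_n, v_n, a_n, t):
    """Constant-acceleration prediction of the next-period target state
    (relative distance, speed, acceleration) over scanning period t."""
    return (d_n + v_n * t + 0.5 * a_n * t * t,
            v_n + a_n * t,
            a_n)

def is_same_target(measured, predicted, thresholds):
    """Formula (3-3): the measured state matches the prediction when every
    component differs by no more than its error threshold (d_0, v_0, a_0)."""
    return all(abs(m - p) <= th
               for m, p, th in zip(measured, predicted, thresholds))
```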
As a further improvement of the present invention, the step 4 of determining the motion state of the most dangerous object specifically comprises the following steps,
step 4.1: continuously update the relative speed and relative distance of the most dangerous target determined in step 3.5, and judge whether its distance from the radar is within the parking distance range, namely z_d > z_min (4-1), where z_d is the relative distance from the radar to the most dangerous target detected by the millimeter wave radar and z_min is the set parking distance threshold; while the most dangerous target satisfies formula (4-1), the agricultural machine continues to travel;
step 4.2: determine whether the most dangerous target is dynamic or static according to the relative speed, specifically, v_rel ≠ v_vehicle (4-2).
If formula (4-2) holds throughout a scanning period, the target is judged to be dynamic; the navigation box then gives an audible and visual alarm and performs parking-and-wait processing when z_d = z_min. Otherwise the agricultural machine continues to travel and the procedure returns to step 3.1 to loop; here v_rel is the speed of the radar relative to the target and v_vehicle is the travel speed of the agricultural machine;
in the design, the criterion for judging whether the most dangerous target is dynamic or static is simple, which improves the response speed.
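The decision logic of steps 4.1 and 4.2 reduces to a few comparisons. The sketch below uses a small epsilon for the speed comparison in (4-2), an implementation detail not specified in the patent; the return strings are likewise illustrative.

```python
def radar_action(z_d, z_min, v_rel, v_vehicle, eps=1e-3):
    """Decide the machine's action toward the most dangerous target.

    A target is dynamic when its relative speed differs from the vehicle
    speed (formula 4-2).  While z_d > z_min (formula 4-1) the machine keeps
    driving, with an alarm for dynamic targets; at or inside the parking
    distance it stops and waits.
    """
    dynamic = abs(v_rel - v_vehicle) > eps
    if z_d > z_min:
        return "alarm_and_continue" if dynamic else "continue"
    return "stop_and_wait"
```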
The step 4 of judging the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera specifically comprises the following steps,
step 4.1a: if the most dangerous target remains static throughout, the navigation box controls the agricultural machine to stop and wait; otherwise, the camera identifies the most dangerous target;
step 4.2a: the camera acquires an image of the most dangerous target, matches it against a trained human-body sample training library, and outputs a target recognition result;
step 4.3a: the navigation box controls the agricultural machine according to the output recognition result. If the target is not a human body, the navigation box gives an audible and visual alarm and controls the agricultural machine to stop and wait; if the target is a human body, the navigation box gives an audible and visual alarm and judges whether the person leaves the travel lane of the agricultural machine or moves away from the agricultural machine, using the following formulas,
z_{w(n+1)} > z_{wn} (4-3)
di > ds (4-4)
if the human body target detected by the radar satisfies (4-3) or (4-4), the agricultural machine continues to drive forward; otherwise the navigation box controls the agricultural machine to stop and wait. Here z_{wn} is the distance from the radar to the most dangerous target in the n-th detection scanning period, and z_{w(n+1)} is that distance in the next scanning period;
in the design, the dynamic or static state of the most dangerous target is judged first. If the most dangerous target is always static, it is considered a non-living object such as a telegraph pole or tree; otherwise it is considered a farm worker or livestock. The camera collects an image of the most dangerous target, identifies whether it is a human body, and outputs the recognition result. If it is a human body, the navigation box gives an audible and visual alarm; because workers have a danger-avoidance instinct, on hearing the alarm of the agricultural machine they will leave its travel lane or walk away from its direction of motion. The judgment procedure exploits this habitual reaction and therefore adapts well: it reminds workers in front of the machine to move aside, avoids collisions of the machine with non-human obstacles such as telegraph poles and with livestock, and continues driving or stops to wait according to the worker's behavior.
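After a human body is recognised, formulas (4-3) and (4-4) decide whether to keep driving. A minimal sketch, with argument names and return strings chosen here rather than taken from the patent:

```python
def human_avoidance(z_prev, z_next, lateral, ds):
    """Keep driving only if the person is moving away from the machine
    (z_next > z_prev, formula 4-3) or has left the travel lane
    (|lateral| > ds, formula 4-4); otherwise stop and wait."""
    if z_next > z_prev or abs(lateral) > ds:
        return "continue"
    return "stop_and_wait"
```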
Before the agricultural machinery works, the calibration of the camera and the radar is carried out under the condition of level ground; when the agricultural machine works, the ground of a farmland is uneven, and because the radar and the camera are not arranged at the same position of the agricultural machine, the heights of the radar and the camera relative to the ground are different and change along with the terrain; the working process of the first distance detection device is specifically that the universal idler wheel rolls along the rugged ground, when the ground is protruded, the universal idler wheel is protruded out of the ground to apply upward acting force to the universal idler wheel, the guide rod slides upwards along the inner wall of the guide sleeve, and the distance sensor detects the ascending distance of the guide rod, namely the height variation between the radar and the ground; when the upward convex ground is gradually leveled, the guide rod gradually slides downwards; when the ground surface is recessed downwards, the guide rod slides downwards under the action of self weight until the universal roller is contacted with the ground surface, the distance sensor detects the descending distance of the guide rod, and the distance sensor sends the detected height change value between the radar and the current ground surface to the industrial personal computer in real time; the working principle of the distance detection device II is the same as that of the distance detection device I, and the distance detection device II sends the detected height change value between the camera and the current ground to the industrial personal computer in real time;
Compared with the prior art, the method combines a millimeter-wave radar and a camera to sense the farmland environment. The height changes of the radar and the camera above the ground are detected in real time and added into the translation vector of the radar-camera coordinate conversion, so that the camera and radar remain truly synchronized in space while the agricultural machine operates, improving their fusion precision. Random noise produced by noise and interference signals is filtered out, improving the accuracy of the radar detection signal. The driving lane of the agricultural machine is determined from its set course; obstacle targets within the lane are initially selected as effective targets, and the initial selection is then checked to finally determine the effective targets, improving the effectiveness and accuracy with which the radar senses obstacles in the same lane. A most dangerous target is selected and tracked, and the camera identifies it on the basis of its dynamic or static state: if the most dangerous target is dynamic, only whether it is a human body is identified, without classifying its specific type, which reduces computation and improves response speed. The navigation box controls the machine according to the image recognition result, so that the unmanned machine avoids colliding with obstacles. If the recognition result is a human body, the navigation box gives an audible and visual alarm to remind the worker to avoid the machine; exploiting the worker's habitual reaction, whether the person leaves the driving lane or moves away from the machine is then continuously detected, and the navigation box controls the machine to stop and wait according to the detection result, giving good adaptability.
Drawings
FIG. 1 is a flow chart of a method for sensing farmland environment based on a millimeter wave radar and a camera.
Fig. 2 is a schematic diagram of the relationship between the camera coordinate system and the vehicle coordinate system in the present invention.
FIG. 3 is a schematic diagram of the relationship between the camera coordinate system and the image physical coordinate system according to the present invention.
FIG. 4 is a diagram illustrating a relationship between an image physical coordinate system and an image pixel coordinate system according to the present invention.
FIG. 5 is a schematic view of the environment of the agricultural machinery of the present invention during the driving process.
FIG. 6 is a schematic view of lane identification during the driving of the agricultural machinery of the present invention.
Fig. 7 is a flow chart of the present invention for checking the initially selected valid target to further determine the valid target.
Fig. 8 is a schematic structural diagram of the first distance detection device.
Wherein: 1 guide rod, 2 guide sleeve, 3 distance sensor, 4 limiting plate, 5 accommodating cavity, 6 universal roller, 7 countersunk screw.
Detailed Description
The invention will be further described with reference to the accompanying drawings.
As shown in fig. 1 to 8, a farmland environment sensing method for agricultural machinery navigation specifically includes the following steps:
step 1: before the agricultural machinery works, calibrating a camera, carrying out spatial coordinate transformation on the camera, and then carrying out combined calibration on radar vision so as to fuse radar and vision information on the space;
Step 2: when the agricultural machine works, the first distance detection device detects in real time the height change Δh_st between the radar and the ground, and the second distance detection device detects in real time the height change Δh_ct between the camera and the ground; the industrial personal computer processes the data and adjusts the radar-camera coordinate conversion relationship in real time, so that the radar and the camera remain spatially synchronized under working conditions;
and step 3: the industrial personal computer resolves the received millimeter wave radar data, determines an effective target, selects an area in front of the agricultural machinery operation where the radar is interested, determines the most dangerous target, and synchronously collects images of the cameras;
and 4, step 4: judging the motion state of the most dangerous target according to the information of the radar, judging the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera, transmitting an action instruction to a navigation box by an industrial personal computer, and controlling the agricultural machinery to do corresponding action by the navigation box;
wherein, when the agricultural machine works, the running speed of the agricultural machine is uniform;
the first distance detection device and the second distance detection device are identical in structure, the first distance detection device is installed on the front side of the agricultural machine and is arranged right below the radar, and the second distance detection device is installed on the lower side of the agricultural machine and is arranged right below the camera; as shown in fig. 8, the distance detection device comprises a guide sleeve 2 and a guide rod 1, wherein the guide sleeve 2 is provided with a containing cavity 5, the guide sleeve 2 is installed on the agricultural machine, the inner wall of the top of the guide sleeve 2 is connected with a distance sensor 3 which is opposite to the guide rod 1, the guide rod 1 can slide in the guide sleeve 2, the upper side of the guide rod 1 is detachably connected with a limiting plate 4 which limits the guide rod 1 to move in the containing cavity 5, and the bottom of the guide rod 1 is provided with a universal roller 6 which can roll on the ground; the limiting plate 4 can be connected with the guide rod 1 through a countersunk screw 7.
The industrial personal computer receives the data signal sent by the distance sensor 3 and performs data processing;
In order to achieve preliminary spatial synchronization of the camera and the millimeter-wave radar, as shown in fig. 2 to 4, the conversion of vehicle coordinates into image pixel coordinates in step 1 specifically comprises the following steps,
step 1.1: before the agricultural machine works, the ground is defaulted to be horizontal, the millimeter wave radar is fixedly installed on the front side of the agricultural machine and located on the longitudinal center axis of the agricultural machine, and the radar emitting surface faces outwards, so that the radar emitting surface is perpendicular to the ground; when the camera is installed, the optical axis of the camera is parallel to the ground;
Step 1.2: a radar coordinate system O0-X0Y0Z0 is established with the center of the radar as the origin; the plane of the millimeter-wave radar is defined by the X0 and Y0 axes and is perpendicular to the Z0 axis, and the Z0 axis is parallel to the ground and coincides with the central axis of the agricultural machine. A camera coordinate system Oc-XcYcZc is established with the center of the camera as the origin Oc; the plane XcOcYc is parallel to the imaging plane of the camera, and the Zc axis is the viewing optical axis of the camera, perpendicular to the imaging plane. A vehicle coordinate system Ow-XwYwZw is established, where Ow is the intersection of the center of the rear axle of the agricultural machine with the central axis of the vehicle; the Xw axis points horizontally to the right, perpendicular to the longitudinal central axis of the machine; the Zw axis points horizontally forward, coinciding with the central axis of the machine; the Yw axis is perpendicular to the ground and points upward. The X0O0Z0 plane of the radar coordinate system is parallel to the XwOwZw plane of the vehicle coordinate system;
Step 1.3: the point where the optical axis intersects the imaging plane is the principal point O' of the image. The vehicle coordinates of an arbitrary point P are (x_w, y_w, z_w, 1)^T; applying the rotation matrix R and the translation vector s_c converts them into the camera coordinates (x_c, y_c, z_c, 1)^T, with the specific conversion relationship as follows,
$$\begin{pmatrix}x_c\\ y_c\\ z_c\\ 1\end{pmatrix}=\begin{pmatrix}R & s_c\\ 0 & 1\end{pmatrix}\begin{pmatrix}x_w\\ y_w\\ z_w\\ 1\end{pmatrix}\quad\text{(1-1)}$$
in formula (1-1), R is a 3×3 orthonormal rotation matrix and s_c is the 3×1 translation vector from the vehicle coordinate system to the camera coordinate system in the initial condition, with components x_c, the distance between the central axis of the camera and the central axis of the vehicle; y_c, the height of the camera above the ground in the initial condition; and z_c, the distance between the camera and the rear axle of the agricultural machine;
Step 1.4: the camera coordinates (x_c, y_c, z_c, 1)^T are converted into image physical coordinates (x_1, y_1)^T, with the specific conversion relationship as follows,
$$z_c\begin{pmatrix}x_1\\ y_1\\ 1\end{pmatrix}=\begin{pmatrix}f&0&0&0\\ 0&f&0&0\\ 0&0&1&0\end{pmatrix}\begin{pmatrix}x_c\\ y_c\\ z_c\\ 1\end{pmatrix}\quad\text{(1-2)}$$
in formula (1-2), f is the focal length of the camera, in mm;
Step 1.5: the image physical coordinates (x_1, y_1)^T are converted into image pixel coordinates (u, v), with the specific conversion relationship as follows:
$$\begin{pmatrix}u\\ v\\ 1\end{pmatrix}=\begin{pmatrix}1/dx&0&u_0\\ 0&1/dy&v_0\\ 0&0&1\end{pmatrix}\begin{pmatrix}x_1\\ y_1\\ 1\end{pmatrix}\quad\text{(1-3)}$$
where dx and dy denote the unit size of each pixel along the horizontal and vertical axes, and u_0, v_0 are the horizontal and vertical coordinates, in pixels, of the intersection of the camera optical axis with the imaging plane in the image pixel coordinate system;
Step 1.6: combining formulas (1-1) to (1-3) gives the conversion formula between the image pixel coordinate system and the vehicle coordinate system, specifically,
$$z_c\begin{pmatrix}u\\ v\\ 1\end{pmatrix}=\begin{pmatrix}f/dx&0&u_0&0\\ 0&f/dy&v_0&0\\ 0&0&1&0\end{pmatrix}\begin{pmatrix}R & s_c\\ 0 & 1\end{pmatrix}\begin{pmatrix}x_w\\ y_w\\ z_w\\ 1\end{pmatrix}\quad\text{(1-4)};$$
Step 1.7: in order to fuse the radar and visual information spatially, the coordinate conversion relationship in step 1.6 is updated to,
$$z_c\begin{pmatrix}u\\ v\\ 1\end{pmatrix}=\begin{pmatrix}f/dx&0&u_0&0\\ 0&f/dy&v_0&0\\ 0&0&1&0\end{pmatrix}\begin{pmatrix}R & s\\ 0 & 1\end{pmatrix}\begin{pmatrix}x_w\\ y_w\\ z_w\\ 1\end{pmatrix}\quad\text{(1-5)};$$
where s = s_c + s_0 and s_0 is set as (x_s0, y_s0, z_s0), with x_s0 = 0, y_s0 the height of the radar above the ground in the initial condition, and z_s0 the distance between the radar and the rear axle of the agricultural machine;
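The chain of transformations described in steps 1.3 to 1.7 can be sketched in code. This is a minimal illustration of the pinhole projection from vehicle coordinates to image pixel coordinates; the rotation, translation and intrinsic values below are hypothetical stand-ins, not calibration results from this method.

```python
import numpy as np

def vehicle_to_pixel(p_w, R, s, f, dx, dy, u0, v0):
    """Sketch of formulas (1-1)-(1-5): rotate/translate the point into the
    camera frame, apply the pinhole model, then convert to pixel units."""
    p_c = R @ np.asarray(p_w, dtype=float) + s   # (1-1): vehicle -> camera
    x1 = f * p_c[0] / p_c[2]                     # (1-2): image physical x (mm)
    y1 = f * p_c[1] / p_c[2]                     # (1-2): image physical y (mm)
    u = x1 / dx + u0                             # (1-3): pixel column
    v = y1 / dy + v0                             # (1-3): pixel row
    return u, v

# Hypothetical calibration: identity rotation, camera 1.2 m above and 2.0 m
# ahead of the rear axle, 8 mm lens, 0.005 mm square pixels, 640x480 image.
R = np.eye(3)
s = np.array([0.0, 1.2, 2.0])
u, v = vehicle_to_pixel([0.5, 0.0, 10.0], R, s,
                        f=8.0, dx=0.005, dy=0.005, u0=320.0, v0=240.0)
```

A point 10 m ahead and 0.5 m to the side of the rear axle thus lands right of the principal point, as expected for a camera looking along the machine's axis.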
By means of the shared vehicle coordinate system, radar coordinates are converted into image coordinates and radar target information is matched to the visual information through the inverse three-dimensional coordinate transformation; the relative spatial position of the radar and the camera is obtained via the vehicle coordinate system. In step 2 the conversion relationship between image pixel coordinates and vehicle coordinates is adjusted in real time; specifically, the translation vector s is adjusted in real time according to the actual road conditions of the agricultural machine, the translation vector adjusted in scanning period t being s_t = s_c + s_0 + Δs_t. The real-time transformation relationship between the vehicle coordinates and the image pixel coordinates is, specifically,
$$z_c\begin{pmatrix}u_t\\ v_t\\ 1\end{pmatrix}=\begin{pmatrix}f/dx&0&u_0&0\\ 0&f/dy&v_0&0\\ 0&0&1&0\end{pmatrix}\begin{pmatrix}R & s_t\\ 0 & 1\end{pmatrix}\begin{pmatrix}x_w\\ y_w\\ z_w\\ 1\end{pmatrix}\quad\text{(2-1)}$$
$$\Delta s_t=\bigl(0,\ \Delta h_{ct}+\Delta h_{st},\ 0\bigr)^{T}\quad\text{(2-2)}$$
where Δh_ct is the change in the height of the camera above the ground in scanning period t, Δh_st is the change in the height of the radar above the ground in scanning period t, j is the scanning period number, and (u_t, v_t) are the image pixel coordinates updated by real-time calculation in scanning period t during the operation of the agricultural machine;
The real-time adjusted translation vector s_t in step 2 is obtained through the following steps,
Step 2.1: the height variation between the radar and the ground and between the camera and the ground is calculated in real time. Specifically, suppose the change in the radar's height above the ground between sampling instants i-1 and i within scanning period t is Δh_sti; the height variation Δh_st of the radar relative to the ground over scanning period t is calculated by the averaging method,
$$\Delta h_{st}=\frac{1}{k}\sum_{i=1}^{k}\Delta h_{sti}\quad\text{(2-3)};$$
Suppose the change in the camera's height above the ground between sampling instants i-1 and i within scanning period t is Δh_cti; the height variation Δh_ct of the camera relative to the ground over scanning period t is calculated by averaging,
$$\Delta h_{ct}=\frac{1}{k}\sum_{i=1}^{k}\Delta h_{cti}\quad\text{(2-4)};$$
Step 2.2: the adaptively adjusted translation vector s_t in scanning period t is calculated in real time, specifically,
$$s_t=s_c+s_0+\Delta s_t\quad\text{(2-5)}$$
where k is the total number of sample points in one scan period.
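Steps 2.1 and 2.2 can be sketched as follows. This assumes the per-period height corrections enter the vertical (y) component of the translation vector, which is an interpretation of Δs_t rather than something the text states explicitly; all numeric values are hypothetical.

```python
import numpy as np

def adjusted_translation(s_c, s_0, dh_radar_samples, dh_camera_samples):
    """Sketch of formulas (2-3)-(2-5): average the k per-sample height
    changes of the radar and the camera over one scan period, then apply
    the result to the combined translation vector s = s_c + s_0. Placing
    the correction on the vertical (y) component is an assumption here."""
    dh_st = float(np.mean(dh_radar_samples))   # (2-3): radar height change
    dh_ct = float(np.mean(dh_camera_samples))  # (2-4): camera height change
    ds_t = np.array([0.0, dh_ct + dh_st, 0.0])
    return np.asarray(s_c, dtype=float) + np.asarray(s_0, dtype=float) + ds_t  # (2-5)

# k = 4 samples in this scan period; all values hypothetical, in metres.
s_t = adjusted_translation(s_c=[0.1, 1.2, 2.0], s_0=[0.0, 1.0, 2.5],
                           dh_radar_samples=[0.02, 0.01, 0.03, 0.00],
                           dh_camera_samples=[0.01, 0.01, 0.02, 0.00])
```

The averaging keeps a single bump or dip at one sampling instant from jerking the coordinate conversion within a scan period.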
The resolving of the radar data to determine the valid target in step 3 specifically comprises the following steps,
Step 3.1: the data received by the radar are resolved according to the millimeter-wave radar protocol to obtain the angle α, the distance r, the relative speed v_rel and the reflection intensity of each object in front of the radar, and a unique ID is allocated to each target;
Step 3.2: random noise signals are filtered out to ensure the continuous validity of the radar data. Specifically, define z = [r, α, v_rel]^T, with z(k) the measurement value of the k-th output of the millimeter-wave radar; then
$$d=\sqrt{\bigl(z(k)-z(k-1)\bigr)^{T}S\,\bigl(z(k)-z(k-1)\bigr)}\le r_s\quad\text{(3-1)}$$
and data signals that do not satisfy formula (3-1) are filtered out; here d is the weighted Euclidean distance between adjacent measurement vectors z(k) and z(k-1), S is the weighting matrix, and r_s is a set threshold;
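A minimal sketch of this gating step: a measurement is kept only when its weighted distance to the previously accepted one stays within r_s, and larger jumps are discarded as noise. The weighting matrix, threshold and track values are hypothetical.

```python
import numpy as np

def filter_measurements(zs, S, r_s):
    """Per formula (3-1): keep a measurement z(k) = [r, alpha, v_rel] only
    when its weighted Euclidean distance to the previously kept measurement
    is within the threshold r_s; larger jumps are treated as random noise."""
    kept = [zs[0]]
    for z in zs[1:]:
        diff = np.asarray(z, dtype=float) - np.asarray(kept[-1], dtype=float)
        d = float(np.sqrt(diff @ S @ diff))
        if d <= r_s:
            kept.append(z)
    return kept

# Hypothetical radar track with one noise spike; identity weighting matrix.
zs = [[10.0, 0.1, -1.0], [9.8, 0.1, -1.0], [30.0, 0.9, 5.0], [9.6, 0.1, -1.0]]
kept = filter_measurements(zs, np.eye(3), r_s=1.0)
```

The spike at 30 m is rejected while the smoothly approaching returns pass through unchanged.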
Step 3.3: whether a target is in the lane in which the agricultural machine is driving is judged: when di ≤ ds the target is in the driving lane, otherwise it is not. Targets in the driving lane are initially selected as effective targets and are sorted and numbered from near to far; targets outside the driving lane are non-dangerous targets and are removed. Here ds is the safety distance threshold, ds = L/2 + ks, di is the distance between the target measured at sampling point i and the Z_0 axis, L is the width of the plough hung on the agricultural machine, and ks is a set safety margin;
As an illustration, it can be seen from FIG. 5 that the two obstacles B and C lie at a distance greater than ds from the central axis of the agricultural machine and are outside its driving lane, while the two obstacles A and D lie at a distance less than ds and are within the driving lane, so A and D are initially selected as effective targets. FIG. 6 shows obstacle E in the driving lane: the distance of E from the machine center O is less than L/2 + ks, so E is within the driving lane of the agricultural machine;
step 3.4: carrying out validity check on the initially selected valid target, and finally determining the valid target;
Step 3.5: from the determined effective targets, the nearest obstacle obtained by the millimeter-wave radar is taken as the candidate most dangerous target: if dj ≤ dmin, where dj is the distance between the agricultural machine and the effective target with ID j obtained by the millimeter-wave radar and dmin is the distance to the nearest effective target obtained in one scanning period of the millimeter-wave radar, then the effective target with ID j is the most dangerous target;
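Steps 3.3 and 3.5 together amount to a lane filter followed by a nearest-target pick, sketched below; the target records and dimensions are hypothetical.

```python
def select_most_dangerous(targets, L, ks):
    """Sketch of steps 3.3 and 3.5: targets whose lateral offset from the
    machine's longitudinal axis is within ds = L/2 + ks form the lane set;
    they are sorted near-to-far and the nearest is the most dangerous."""
    ds = L / 2 + ks
    effective = sorted((t for t in targets if abs(t['lateral']) <= ds),
                       key=lambda t: t['range'])
    return effective, (effective[0] if effective else None)

# Hypothetical targets: lateral offset from the Z0 axis and range, in metres.
targets = [
    {'id': 'A', 'lateral': 0.5, 'range': 12.0},
    {'id': 'B', 'lateral': 3.0, 'range': 8.0},   # outside the lane
    {'id': 'D', 'lateral': -0.8, 'range': 6.0},
]
effective, danger = select_most_dangerous(targets, L=3.0, ks=0.3)
```

With a 3 m plough and 0.3 m margin, ds = 1.8 m: B is excluded even though it is the closest return overall, matching the FIG. 5 example.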
the validity check of the initially selected valid target in step 3.4 specifically comprises the following steps,
Step 3.4.1: the initially selected effective target is predicted. Let S_n = [d_n, v_n, a_n]; the state prediction equation of the initially selected effective target is
$$\hat S_{n+1}=\bigl[\,d_n+v_n t+\tfrac{1}{2}a_n t^{2},\ \ v_n+a_n t,\ \ a_n\,\bigr]\quad\text{(3-2)}$$
where $\hat S_{n+1}=[\hat d_{n+1},\hat v_{n+1},\hat a_{n+1}]$ is the state information of the effective obstacle target predicted from the previous scanning period; d_n, v_n and a_n are respectively the relative distance, relative speed and relative acceleration of the effective obstacle target measured in the n-th detection period of the millimeter-wave radar, and t is the scanning period of the millimeter-wave radar;
Step 3.4.2: the predicted state information of the effective target for the (n+1)-th period is compared with the state information actually measured by the radar in the (n+1)-th period, specifically,
$$|d_{n+1}-\hat d_{n+1}|\le d_0,\quad |v_{n+1}-\hat v_{n+1}|\le v_0,\quad |a_{n+1}-\hat a_{n+1}|\le a_0\quad\text{(3-3)}$$
where d_0, v_0 and a_0 are the set error thresholds between the measured and predicted values of the effective obstacle target;
Step 3.4.3: if an effective obstacle target is continuously detected more than m times within the radar scanning period and the effective target satisfying formula (3-3) in step 3.4.2 is consistent with the initially selected effective target, the relative distance, relative speed, relative angle and number information of the target are updated. Otherwise the initially selected effective target is not among the detection targets of the millimeter-wave radar and is tracked using the effective-target prediction information; if it is still not detected in the next radar scanning period, use of the corresponding initially selected effective-target information is stopped, the effective-target information is updated, and execution returns to step 3.4.1;
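The prediction (3-2) and consistency check (3-3) of steps 3.4.1 and 3.4.2 can be sketched as follows; the thresholds, scan period and measurements are hypothetical.

```python
def predict_state(d_n, v_n, a_n, t):
    """Formula (3-2): constant-acceleration prediction of the effective
    target's relative distance, speed and acceleration one period ahead."""
    return (d_n + v_n * t + 0.5 * a_n * t ** 2,  # predicted distance
            v_n + a_n * t,                       # predicted speed
            a_n)                                 # acceleration held constant

def is_same_target(measured, predicted, d0, v0, a0):
    """Formula (3-3): the measurement matches the prediction when every
    component error stays within its threshold."""
    return (abs(measured[0] - predicted[0]) <= d0 and
            abs(measured[1] - predicted[1]) <= v0 and
            abs(measured[2] - predicted[2]) <= a0)

# Hypothetical track closing at 2 m/s with a 0.1 s scan period.
pred = predict_state(d_n=10.0, v_n=-2.0, a_n=0.0, t=0.1)
ok = is_same_target((9.81, -2.0, 0.0), pred, d0=0.2, v0=0.5, a0=0.5)
```

A measurement that drifts outside any threshold would start the coasting-and-drop logic of step 3.4.3 instead of an update.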
the step 4 of judging the dynamic and static states of the most dangerous target specifically comprises the following steps,
Step 4.1: the relative speed and relative distance information of the most dangerous target determined in step 3.5 is continuously updated, and whether the distance between the most dangerous target and the radar is within the parking-distance range is judged, i.e. z_d > z_min (4-1), where z_d is the relative distance between the radar and the most dangerous target detected by the millimeter-wave radar and z_min is the set parking distance threshold; while the most dangerous target satisfies formula (4-1), the agricultural machine continues to run;
Step 4.2: the dynamic or static state of the most dangerous target is judged according to the relative speed, specifically v_rel ≠ v_vehicle (4-2). If formula (4-2) holds throughout a scanning period, the state of the target is judged to be dynamic; the navigation box then gives an audible and visual alarm, and parking-wait processing is performed at z_d = z_min. Otherwise the agricultural machine continues to run and execution returns to step 3.1, where v_rel is the velocity of the radar relative to the target and v_vehicle is the running speed of the agricultural machine;
the step 4 of judging the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera specifically comprises the following steps,
step 4.1 a: if the most dangerous target is static all the time, the navigation box controls the agricultural machinery to stop for waiting treatment; otherwise, the camera identifies the most dangerous target;
step 4.2 a: the camera acquires the image of the most dangerous target, matches and compares the image with a trained human body sample training library, and outputs a target identification result;
Step 4.3a: the navigation box controls the agricultural machine according to the output target recognition result. If the target is not a human body, the navigation box gives an audible and visual alarm and controls the machine to stop and wait; if the target recognition result is a human body, the navigation box gives an audible and visual alarm, and whether the human body leaves the driving lane of the agricultural machine or moves away from the machine is judged by the following formulas,
$$z_{w(n+1)}>z_{wn}\quad\text{(4-3)}$$
$$d_i>d_s\quad\text{(4-4)}$$
If the human-body target detected by the radar satisfies (4-3) or (4-4), the agricultural machine continues to drive forward; otherwise the navigation box controls the machine to stop and wait. Here z_wn is the distance of the radar relative to the most dangerous target in the n-th detection scanning period, and z_w(n+1) is the distance of the radar relative to the most dangerous target in the next scanning period.
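The step-4 branching, from the dynamic/static judgment through the human-reaction check, can be condensed into one decision function; the boolean inputs stand for the radar and camera results described above, and the action names are illustrative.

```python
def navigation_action(is_static, is_human, left_lane, moving_away):
    """Condensed sketch of the step-4 logic: a permanently static most
    dangerous target stops the machine; a dynamic non-human target triggers
    an alarm and a stop; a human target triggers an alarm, and driving
    resumes only if the person leaves the lane (4-4) or moves away (4-3)."""
    if is_static:
        return 'stop_and_wait'            # step 4.1a: static obstacle ahead
    if not is_human:
        return 'alarm_and_stop'           # step 4.3a: dynamic but not human
    # Human target: alarm, then watch the habitual avoidance reaction.
    if left_lane or moving_away:
        return 'alarm_and_continue'       # (4-3) or (4-4) satisfied
    return 'alarm_and_stop'

action = navigation_action(is_static=False, is_human=True,
                           left_lane=False, moving_away=True)
```

The function mirrors the text's reliance on the worker's habitual reaction: the alarm is always raised for a human, and only the observed movement decides whether the machine proceeds.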
Before the agricultural machine works, the camera and the radar are calibrated on level ground. During operation the farmland surface is uneven, and because the radar and the camera are mounted at different positions on the machine, their heights above the ground differ and change with the terrain. The first distance detection device works as follows: the universal roller 6 rolls along the uneven ground; where the ground rises, it applies an upward force to the universal roller 6, the guide rod 1 slides upward along the inner wall of the guide sleeve 2, and the distance sensor 3 detects the rise of the guide rod 1, i.e. the height variation between the radar and the ground. As the raised ground levels out, the guide rod 1 gradually slides back down. Where the ground dips, the guide rod 1 slides down under its own weight until the universal roller 6 touches the ground, and the distance sensor 3 detects the descent of the guide rod 1. The distance sensor 3 sends the detected height change between the radar and the current ground to the industrial personal computer in real time. The second distance detection device works on the same principle as the first and sends the detected height change between the camera and the current ground to the industrial personal computer in real time;
Compared with the prior art, the method combines a millimeter-wave radar and a camera to sense the farmland environment. The height changes of the radar and the camera above the ground are detected in real time and added into the translation vector of the radar-camera coordinate conversion, so that the camera and radar remain truly synchronized in space while the agricultural machine operates, improving their fusion precision. Random noise produced by noise and interference signals is filtered out, improving the accuracy of the radar detection signal. The driving lane of the agricultural machine is determined from its set course; obstacle targets within the lane are initially selected as effective targets, and the initial selection is then checked to finally determine the effective targets, improving the effectiveness and accuracy with which the radar senses obstacles in the same lane. A most dangerous target is selected and tracked, and the camera identifies it on the basis of its dynamic or static state: if the most dangerous target is dynamic, only whether it is a human body is identified, without classifying its specific type, which reduces computation and improves response speed. The navigation box controls the machine according to the image recognition result, so that the unmanned machine avoids colliding with obstacles. If the recognition result is a human body, the navigation box gives an audible and visual alarm to remind the worker to avoid the machine; exploiting the worker's habitual reaction, whether the person leaves the driving lane or moves away from the machine is then continuously detected, and the navigation box controls the machine to stop and wait according to the detection result, giving good adaptability.
The present invention is not limited to the above embodiments, and based on the technical solutions disclosed in the present invention, those skilled in the art can make some substitutions and modifications to some technical features without creative efforts based on the disclosed technical solutions, and these substitutions and modifications are all within the protection scope of the present invention.

Claims (7)

1. A farmland environment sensing method for agricultural machinery navigation is characterized by comprising the following steps,
step 1: before the agricultural machinery works, calibrating a camera, carrying out spatial coordinate transformation on the camera, and then carrying out combined calibration on radar vision so as to fuse radar and vision information on the space;
Step 2: when the agricultural machine works, the first distance detection device detects in real time the height change Δh_st between the radar and the ground, and the second distance detection device detects in real time the height change Δh_ct between the camera and the ground; the industrial personal computer performs data processing and adjusts the coordinate conversion relationship between the radar and the camera in real time, so that the radar and the camera are synchronized in space under the operating conditions;
and step 3: the industrial personal computer resolves the received millimeter wave radar data, determines an effective target, selects an area in front of the agricultural machinery operation where the radar is interested, determines the most dangerous target, and synchronously collects images of the cameras;
and 4, step 4: judging the motion state of the most dangerous target according to the information of the radar, judging the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera, transmitting an action instruction to a navigation box by an industrial personal computer, and controlling the agricultural machinery to do corresponding action by the navigation box;
wherein, in the step (4), the step of judging the motion state of the most dangerous target specifically comprises the following steps,
Step 4.1: the relative speed and relative distance information of the most dangerous target determined in step 3 is continuously updated, and whether the distance between the most dangerous target and the radar is within the parking-distance range is judged, i.e. z_d > z_min (4-1), where z_d is the relative distance between the radar and the most dangerous target detected by the millimeter-wave radar and z_min is the set parking distance threshold; while the most dangerous target satisfies formula (4-1), the agricultural machine continues to run;
Step 4.2: the dynamic or static state of the most dangerous target is determined according to the relative speed, specifically v_rel ≠ v_vehicle (4-2); if formula (4-2) holds throughout a scanning period, the state of the target is judged to be dynamic, the navigation box gives an audible and visual alarm, and parking-wait processing is performed at z_d = z_min; otherwise the agricultural machine continues to run and execution returns to step 3.1, where v_rel is the velocity of the radar relative to the target and v_vehicle is the running speed of the agricultural machine;
wherein, when the agricultural machine works, the running speed of the agricultural machine is uniform; the effective target is a target in a driving lane of the agricultural machinery, and the most dangerous target is a closest distance obstacle obtained by a millimeter wave radar;
the first distance detection device is mounted on the front side of the agricultural machine and arranged right below the radar, and the second distance detection device is mounted on the lower side of the agricultural machine and arranged right below the camera; the first distance detection device comprises a guide sleeve and a guide rod, wherein the guide sleeve is provided with an accommodating cavity and can be opened and closed, the guide sleeve is installed on the agricultural machine, the top of the inner wall of the guide sleeve is connected with a distance sensor which is right opposite to the guide rod, the guide rod can slide in the guide sleeve, the upper side of the guide rod is detachably connected with a limiting plate which limits the guide rod to move in the accommodating cavity, and the bottom of the guide rod is provided with a universal roller which can roll on the ground;
and the industrial personal computer receives the data signal sent by the distance sensor and performs data processing.
2. The agricultural machinery navigation agricultural field environment perception method according to claim 1, wherein the step 1 of converting radar coordinates into image pixel coordinates specifically comprises the steps of,
step 1.1: before the agricultural machine works, the ground is defaulted to be horizontal, the millimeter wave radar is fixedly installed on the front side of the agricultural machine and located on the longitudinal center axis of the agricultural machine, and the radar emitting surface faces outwards, so that the radar emitting surface is perpendicular to the ground; when the camera is installed, the optical axis of the camera is parallel to the ground;
Step 1.2: a radar coordinate system O0-X0Y0Z0 is established with the center of the radar as the origin; the plane of the millimeter-wave radar is defined by the X0 and Y0 axes and is perpendicular to the Z0 axis, and the Z0 axis is parallel to the ground and coincides with the central axis of the agricultural machine. A camera coordinate system Oc-XcYcZc is established with the center of the camera as the origin Oc; the plane XcOcYc is parallel to the imaging plane of the camera, and the Zc axis is the viewing optical axis of the camera, perpendicular to the imaging plane. A vehicle coordinate system Ow-XwYwZw is established, where Ow is the intersection of the center of the rear axle of the agricultural machine with the central axis of the vehicle; the Xw axis points horizontally to the right, perpendicular to the longitudinal central axis of the machine; the Zw axis points horizontally forward, coinciding with the central axis of the machine; the Yw axis is perpendicular to the ground and points upward. The X0O0Z0 plane of the radar coordinate system is parallel to the XwOwZw plane of the vehicle coordinate system;
step 1.3: the point where the optical axis intersects the imaging plane is the principal point O' of the image; the vehicle coordinates are converted into camera coordinates through the rotation matrix R and the translation vector sc: for an arbitrary point P with vehicle coordinates (xw, yw, zw, 1)^T, the camera coordinates (xc, yc, zc, 1)^T are obtained by the following conversion relationship,
(xc, yc, zc, 1)^T = [ R  sc ; 0^T  1 ] (xw, yw, zw, 1)^T    (1-1)
in formula (1-1), R is a 3 × 3 orthogonal rotation matrix, sc is the 3 × 1 translation vector from the vehicle coordinate system to the camera coordinate system in the initial condition, xc is the distance between the central axis of the camera and the central axis of the vehicle, yc is the height of the camera above the ground in the initial condition, and zc is the distance between the camera and the rear axle of the agricultural machine;
step 1.4: converting the camera coordinates (xc, yc, zc, 1)^T into image physical coordinates (x1, y1)^T, with the specific conversion relationship as follows,
x1 = f · xc / zc ,  y1 = f · yc / zc    (1-2)
in the formula (1-2), f is the focal length of the camera, and the focal length unit is mm;
step 1.5: converting the image physical coordinates (x1, y1)^T into image pixel coordinates (u, v), with the specific conversion relationship as follows:
u = x1 / dx + u0 ,  v = y1 / dy + v0    (1-3)
where dx and dy denote the physical size of each pixel along the horizontal and vertical axes, and u0 and v0 are respectively the horizontal and vertical coordinates, in pixels, of the intersection point of the camera optical axis with the imaging plane in the image pixel coordinate system;
step 1.6: the conversion formula from the image pixel coordinate system to the vehicle coordinate system is obtained according to the above formulas (1-1) to (1-3), specifically,
zc · (u, v, 1)^T = K [ R  sc ] (xw, yw, zw, 1)^T ,  where K = [ f/dx  0  u0 ; 0  f/dy  v0 ; 0  0  1 ]    (1-4);
step 1.7: in order to spatially fuse the radar and visual information, the coordinate transformation relationship in step 1.6 is updated to,
zc · (u, v, 1)^T = K [ R  s ] (xw, yw, zw, 1)^T    (1-5);
wherein s = sc + s0; s0 is set as (xs0, ys0, zs0), where xs0 = 0, ys0 is the height of the radar above the ground in the initial condition, and zs0 is the distance between the radar and the rear axle of the agricultural machine.
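As an illustration only, the chain of equations (1-1) to (1-3) in claim 2 can be sketched in Python; the rotation, translation and intrinsic values below are invented for the example, not taken from the patent:

```python
# A minimal sketch of the claim-2 coordinate chain (vehicle -> camera ->
# image-physical -> pixel). All numeric values are illustrative assumptions.

def vehicle_to_pixel(p_w, R, s, f, dx, dy, u0, v0):
    """Map a vehicle-frame point (xw, yw, zw) to pixel coordinates (u, v)."""
    # Eq. (1-1): camera coordinates = R * p_w + s (homogeneous transform spelled out).
    p_c = [sum(R[i][j] * p_w[j] for j in range(3)) + s[i] for i in range(3)]
    # Eq. (1-2): perspective projection onto the image plane (focal length f, mm).
    x1 = f * p_c[0] / p_c[2]
    y1 = f * p_c[1] / p_c[2]
    # Eq. (1-3): physical image coordinates -> pixel coordinates via pixel size dx, dy.
    return x1 / dx + u0, y1 / dy + v0

# Assumed setup: axes aligned (R = identity), camera 1.2 m above the ground,
# 8 mm focal length, 0.01 mm square pixels, principal point at (320, 240).
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
s = [0.0, -1.2, 0.5]                      # translation s = sc + s0 (assumed values)
u, v = vehicle_to_pixel([0.0, 1.2, 10.5], R, s,
                        f=8.0, dx=0.01, dy=0.01, u0=320.0, v0=240.0)
```

A quick sanity check of the chain: a vehicle-frame point lying on the optical axis projects exactly to the principal point (u0, v0).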
3. The farmland environment sensing method for agricultural machinery navigation according to claim 2, wherein the data processing performed by the industrial personal computer in step 2 adjusts the conversion relationship between the radar and camera coordinates in real time; specifically, the translation vector s is adjusted in real time according to the actual road conditions of the agricultural machine, the translation vector adjusted in scanning period t being st = sc + s0 + Δst, and the real-time conversion relationship between the vehicle coordinates and the image pixel coordinates is, specifically,
zc · (ut, vt, 1)^T = K [ R  st ] (xw, yw, zw, 1)^T    (2-1)
st = sc + s0 + Δst ,  Δst = (0, Δhct + Δhst, 0)^T    (2-2)
wherein Δhct is the change in the height of the camera above the ground in scanning period t, Δhst is the change in the height of the radar above the ground in scanning period t, j is the scanning period number, and (ut, vt) are the image pixel coordinates calculated and updated in real time in scanning period t during operation of the agricultural machine.
4. The farmland environment sensing method for agricultural machinery navigation according to claim 3, wherein the steps of obtaining Δht from the translation vector adjusted in real time in step 2 are as follows,
step 2.1: the height variation between the radar and the ground and the height variation between the camera and the ground are calculated in real time, specifically,
suppose the change in the distance between the radar and the ground between sampling time i−1 and sampling time i in scanning period t is Δhsti; the height variation Δhst of the radar relative to the ground in scanning period t is calculated by the averaging method:
Δhst = (1/k) Σ (i = 1..k) Δhsti    (2-3);
suppose the height variation value between the camera and the ground between sampling time i−1 and sampling time i in scanning period t is Δhcti; the height variation Δhct between the camera and the ground in scanning period t is calculated by averaging:
Δhct = (1/k) Σ (i = 1..k) Δhcti    (2-4);
step 2.2: calculating in real time the adaptively adjusted translation vector st in scanning period t, specifically,
st = sc + s0 + (0, Δhct + Δhst, 0)^T    (2-5)
where k is the total number of sampling points in a scanning period, and Δhsti is the height variation value between the radar and the ground at sampling time i in scanning period t.
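The averaging in steps 2.1 and 2.2 can be sketched as follows; the sample values, and the assumption that only the vertical (y) component of the translation vector changes, are ours for illustration, not the patent's:

```python
# Illustrative sketch of eqs. (2-3)-(2-5): average the per-sample height
# changes over one scan period, then fold them into the translation vector.

def mean_height_change(samples):
    """Eqs. (2-3)/(2-4): mean of the k per-sample height-change values."""
    return sum(samples) / len(samples)

dh_st = mean_height_change([0.02, 0.01, 0.03, 0.02])  # radar vs. ground (m)
dh_ct = mean_height_change([0.01, 0.02, 0.01, 0.00])  # camera vs. ground (m)

s_base = [0.0, -1.2, 0.5]                 # s = sc + s0 (assumed values, m)
# Assumed form of eq. (2-5): only the vertical component is compensated.
s_t = [s_base[0], s_base[1] + dh_ct + dh_st, s_base[2]]
```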
5. The farmland environment sensing method for agricultural machinery navigation according to any one of claims 1 to 4, wherein resolving the radar data to determine the effective target in step 3 specifically comprises the following steps,
step 3.1: resolving the data received by the radar according to the millimeter wave radar protocol to obtain the angle α, the distance r, the relative speed vrel and the reflection intensity of the object ahead relative to the radar, and allocating a unique ID to each target;
step 3.2: filtering random noise signals to ensure the continuous validity of the radar data; specifically, defining z = [r, α, vrel]^T, where z(k) is the k-th measurement output by the millimeter wave radar,
d = sqrt( [z(k) − z(k−1)]^T S [z(k) − z(k−1)] )    (3-1)
filtering out data signals that do not satisfy formula (3-1); wherein d is the weighted Euclidean distance between adjacent measurement vectors z(k) and z(k−1), S is the weighting matrix, and rs is a set threshold;
step 3.3: judging whether the target is in the lane in which the agricultural machine is travelling: when di ≤ ds the target is in the lane, otherwise it is not; targets in the lane are initially selected as effective targets and are sorted and numbered from near to far, while targets outside the driving lane are non-dangerous targets and are removed; wherein ds is the safety distance threshold, ds = L/2 + ks, di is the distance between the target measured at sampling point i and the Z0 axis, L is the width of the plough mounted on the agricultural machine, and ks is a set safety margin;
step 3.4: carrying out validity check on the initially selected valid target, and finally determining the valid target;
step 3.5: according to the determined effective targets, the obstacle at the closest distance obtained by the millimeter wave radar is taken as the candidate most dangerous target: if dj ≤ dmin, where dj is the distance between the agricultural machine and the effective target with ID j obtained by the millimeter wave radar and dmin is the distance to the nearest effective target obtained within one scanning period of the millimeter wave radar, then the effective target with ID j is the most dangerous target.
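Steps 3.2, 3.3 and 3.5 above amount to a gate-filter-select pipeline, which can be sketched as follows; the thresholds, diagonal weighting and target measurements are invented for the example:

```python
import math

# Illustrative sketch of claim-5 steps 3.2-3.5: gate noisy radar returns with
# the weighted Euclidean distance of eq. (3-1), keep only in-lane targets, and
# pick the nearest one as the candidate most dangerous target.

def weighted_distance(z1, z0, S_diag):
    """Eq. (3-1) with a diagonal weighting matrix S (given as its diagonal)."""
    return math.sqrt(sum(w * (a - b) ** 2 for w, a, b in zip(S_diag, z1, z0)))

def in_lane(d_i, plough_width, margin):
    """Step 3.3: lateral offset d_i within ds = L/2 + ks."""
    return d_i <= plough_width / 2 + margin

# targets: (ID, lateral offset d_i in m, range r in m)
targets = [(1, 0.4, 12.0), (2, 2.5, 6.0), (3, 0.8, 7.5)]
valid = [t for t in targets if in_lane(t[1], plough_width=3.0, margin=0.3)]
most_dangerous = min(valid, key=lambda t: t[2])   # step 3.5: nearest valid target
```

With these assumed values, target 2 is discarded as out-of-lane even though it is the closest return, and target 3 becomes the candidate most dangerous target.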
6. The farmland environment sensing method for agricultural machinery navigation according to claim 5, wherein the validity check of the initially selected effective target in step 3.4 comprises the following steps,
step 3.4.1: predicting the initially selected effective target; taking Sn = [dn, vn, an], the state prediction equation of the initially selected effective target is,
d(n+1|n) = dn + vn · t + (1/2) · an · t² ,  v(n+1|n) = vn + an · t ,  a(n+1|n) = an    (3-2)
wherein S(n+1|n) = [d(n+1|n), v(n+1|n), a(n+1|n)] is the state information of the effective obstacle target predicted from the previous scanning period, dn, vn and an respectively represent the relative distance, relative speed and relative acceleration of the effective obstacle target measured in the n-th detection period of the millimeter wave radar, and t is the scanning period of the millimeter wave radar;
step 3.4.2: comparing the predicted state information of the effective target for the (n+1)-th period with the state information of the effective target actually measured by the radar in the (n+1)-th period, specifically as follows,
|d(n+1) − d(n+1|n)| ≤ d0 ,  |v(n+1) − v(n+1|n)| ≤ v0 ,  |a(n+1) − a(n+1|n)| ≤ a0    (3-3)
wherein d0, v0 and a0 are the set error thresholds between the measured values and the predicted values of the effective obstacle target;
step 3.4.3: if the effective obstacle target is continuously detected more than m times within the scanning periods of the radar, and the effective target satisfying formula (3-3) in step 3.4.2 is consistent with the initially selected effective target, the relative distance, relative speed, relative angle and number information of the target are updated; otherwise the initially selected effective target is not among the detection targets of the millimeter wave radar and is tracked using the effective target prediction information; if it is still not detected in the next scanning period of the radar, use of the corresponding initially selected effective target information is stopped, the effective target information is updated, and execution returns to step 3.4.1 in a loop.
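The prediction and gating of steps 3.4.1 and 3.4.2 can be sketched as follows; the state values and error thresholds are assumptions for the example:

```python
# Illustrative sketch of eqs. (3-2)/(3-3): constant-acceleration prediction of
# the target state Sn = [dn, vn, an] over one scan period t, then gating the
# next measurement against the prediction.

def predict_state(d, v, a, t):
    """Eq. (3-2): predicted distance, speed and acceleration one period ahead."""
    return d + v * t + 0.5 * a * t * t, v + a * t, a

def consistent(measured, predicted, thresholds):
    """Eq. (3-3): every component within its error threshold (d0, v0, a0)."""
    return all(abs(m - p) <= th for m, p, th in zip(measured, predicted, thresholds))

pred = predict_state(d=10.0, v=-1.0, a=0.0, t=0.1)   # target closing at 1 m/s
ok = consistent((9.91, -1.02, 0.0), pred, thresholds=(0.2, 0.1, 0.5))
```

Here the measurement falls within all three thresholds of the prediction, so the tracked target would be confirmed and its state information updated.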
7. The farmland environment sensing method for agricultural machinery navigation according to any one of claims 1 to 4, wherein judging the type of the most dangerous target in step 4 according to the data of the most dangerous target collected by the radar and the camera specifically comprises the following steps,
step 4.1a: if the most dangerous target is static all the time, the navigation box controls the agricultural machine to stop and wait; otherwise, the camera identifies the most dangerous target;
step 4.2a: the camera acquires an image of the most dangerous target, matches and compares it against a trained human body sample library, and outputs the target recognition result;
step 4.3a: the navigation box controls the agricultural machine according to the output target recognition result: if the target is not a human body, the navigation box issues an audible and visual alarm and controls the agricultural machine to stop and wait; if the target recognition result is a human body, the navigation box issues an audible and visual alarm and judges, using the following formulas, whether the human body is leaving the driving lane of the agricultural machine or moving away from the agricultural machine,
zw(n+1) > zwn    (4-3)
di > ds    (4-4)
if the human body target detected by the radar satisfies (4-3) or (4-4), the agricultural machine continues to drive forward; otherwise, the navigation box controls the agricultural machine to stop and wait; wherein zwn is the distance of the radar relative to the most dangerous target in the n-th detection scanning period, and zw(n+1) is the distance of the radar relative to the most dangerous target in the next scanning period.
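The claim-7 decision rules (4-3)/(4-4) reduce to a simple predicate, sketched below with invented values:

```python
# Illustrative sketch of eqs. (4-3)/(4-4): after a human body is recognized,
# keep driving only if the person is moving away from the agricultural machine
# or has left its driving lane.

def keep_driving(z_prev, z_next, d_i, d_s):
    moving_away = z_next > z_prev   # eq. (4-3): radar range increasing
    out_of_lane = d_i > d_s         # eq. (4-4): lateral offset beyond threshold
    return moving_away or out_of_lane

decision = keep_driving(z_prev=8.0, z_next=7.6, d_i=1.0, d_s=1.8)
# Here the person is closing in and still in-lane, so the machine should stop.
```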
CN201711267193.7A 2017-03-10 2017-12-05 Farmland environment sensing method for agricultural machinery navigation Active CN108107887B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2017101416944 2017-03-10
CN201710141694.4A CN107092039A (en) 2017-03-10 2017-03-10 Farm machinery navigation farm environment cognitive method

Publications (2)

Publication Number Publication Date
CN108107887A CN108107887A (en) 2018-06-01
CN108107887B true CN108107887B (en) 2020-09-29

Family

ID=59648815

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201710141694.4A Pending CN107092039A (en) 2017-03-10 2017-03-10 Farm machinery navigation farm environment cognitive method
CN201711267193.7A Active CN108107887B (en) 2017-03-10 2017-12-05 Farmland environment sensing method for agricultural machinery navigation


Country Status (1)

Country Link
CN (2) CN107092039A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106909148A (en) * 2017-03-10 2017-06-30 南京沃杨机械科技有限公司 Based on the unmanned air navigation aid of agricultural machinery that farm environment is perceived
CN108845574B (en) * 2018-06-26 2021-01-12 北京旷视机器人技术有限公司 Target identification and tracking method, device, equipment and medium
CN111045000A (en) * 2018-10-11 2020-04-21 阿里巴巴集团控股有限公司 Monitoring system and method
CN109375210B (en) * 2018-12-10 2022-10-21 西安电子科技大学 Unmanned device target detection system and method based on radar and panorama combination
TWI691729B (en) * 2019-01-31 2020-04-21 宏碁股份有限公司 Navigation system and method for operating a navigation system
CN111003671B (en) * 2019-12-19 2020-12-15 中联重科股份有限公司 Anti-collision assembly, anti-collision method of lifting platform and lifting platform
CN113093191B (en) * 2021-03-31 2022-07-05 武汉大学 Road vehicle detection system based on millimeter wave radar

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101335431A (en) * 2008-07-27 2008-12-31 广西电力工业勘察设计研究院 Overhead power transmission line optimized line selection method based on airborne laser radar data
WO2016179637A1 (en) * 2015-05-12 2016-11-17 Precision Autonomy Pty Ltd Systems and methods of unmanned vehicle control and monitoring

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101641610A (en) * 2007-02-21 2010-02-03 电子地图北美公司 System and method for vehicle navigation and piloting including absolute and relative coordinates
US8478493B2 (en) * 2008-09-11 2013-07-02 Deere & Company High integrity perception program
US8649930B2 (en) * 2009-09-17 2014-02-11 Agjunction Llc GNSS integrated multi-sensor control system and method
CN101975951B (en) * 2010-06-09 2013-03-20 北京理工大学 Field environment barrier detection method fusing distance and image information
CN103885359A (en) * 2014-02-26 2014-06-25 中山大学 Intelligent vehicle-mounted terminal based on embedded ARM
CN106441319B (en) * 2016-09-23 2019-07-16 中国科学院合肥物质科学研究院 A kind of generation system and method for automatic driving vehicle lane grade navigation map
CN106909148A (en) * 2017-03-10 2017-06-30 南京沃杨机械科技有限公司 Based on the unmanned air navigation aid of agricultural machinery that farm environment is perceived


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application of virtual reality in ecological farmland navigating system;Chia-Chen Kuo等;《Proceedings. Seventh International Conference on High Performance Computing and Grid in Asia Pacific Region, 2004》;20040722;第1-4页 *

Also Published As

Publication number Publication date
CN107092039A (en) 2017-08-25
CN108107887A (en) 2018-06-01

Similar Documents

Publication Publication Date Title
CN108089185B (en) Agricultural machinery unmanned navigation method based on farmland environment perception
CN108154084B (en) Agricultural machinery unmanned multi-sensor fusion farmland environment sensing method
CN108107887B (en) Farmland environment sensing method for agricultural machinery navigation
CN108082181B (en) Agricultural machinery navigation control method based on farmland environment perception
CN106950952B (en) Farmland environment sensing method for unmanned agricultural machinery
CN108169743B (en) Agricultural machinery unmanned farmland environment sensing method
CN111461023B (en) Method for quadruped robot to automatically follow pilot based on three-dimensional laser radar
CN106681353B (en) The unmanned plane barrier-avoiding method and system merged based on binocular vision with light stream
EP3825903A1 (en) Method, apparatus and storage medium for detecting small obstacles
CN112149550B (en) Automatic driving vehicle 3D target detection method based on multi-sensor fusion
CN113850102B (en) Vehicle-mounted vision detection method and system based on millimeter wave radar assistance
US11538241B2 (en) Position estimating device
CN105700525B (en) Method is built based on Kinect sensor depth map robot working environment uncertainty map
CN113378741B (en) Auxiliary sensing method and system for aircraft tractor based on multi-source sensor
CN111077890A (en) Implementation method of agricultural robot based on GPS positioning and automatic obstacle avoidance
CN111123262B (en) Automatic driving 3D modeling method, device and system
CN110824495A (en) Laser radar-based drosophila visual inspired three-dimensional moving target detection method
CN114280611A (en) Road side sensing method integrating millimeter wave radar and camera
CN111126363B (en) Object recognition method and device for automatic driving vehicle
CN103729644B (en) Satellite tracking method, overcoming interference caused when fixed star passes through probe window, of foundation optical measuring equipment
CN116380110B (en) Real-time path planning method based on big data unmanned vehicle
Li et al. Mobile robot map building based on laser ranging and kinect
CN113971699A (en) Target object identification method and device, electronic equipment and storage medium
Hou et al. Research on GDR Obstacle Detection Method Based on Stereo Vision
CN117930224B (en) Vehicle ranging method based on monocular vision depth estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant