CN106950952B - Farmland environment sensing method for unmanned agricultural machinery - Google Patents


Info

Publication number
CN106950952B
CN106950952B (application CN201710142305.XA)
Authority
CN
China
Prior art keywords: target, radar, agricultural machine, camera, dangerous
Prior art date
Legal status: Active
Application number
CN201710142305.XA
Other languages
Chinese (zh)
Other versions
CN106950952A (en)
Inventor
张锋
吕金秋
程方
韦永清
徐涛
王烁
王浩
Current Assignee
Wuxi Kalman Navigation Technology Co ltd
Original Assignee
Wuxi Kalman Navigation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Wuxi Kalman Navigation Technology Co ltd filed Critical Wuxi Kalman Navigation Technology Co ltd
Priority to CN201710142305.XA priority Critical patent/CN106950952B/en
Publication of CN106950952A publication Critical patent/CN106950952A/en
Application granted granted Critical
Publication of CN106950952B publication Critical patent/CN106950952B/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle


Abstract

The invention provides a farmland environment perception method for unmanned driving of agricultural machinery in the technical field of control, which specifically comprises the following steps. Step 1: calibrate the camera to obtain the projection matrix from world coordinates to image pixel coordinates, and convert radar coordinates into image pixel coordinates so that the radar and the camera are synchronized in space. Step 2: the industrial personal computer resolves the received millimeter-wave radar data, determines the valid targets, selects the radar's region of interest in front of the agricultural machine, determines the most dangerous target, and synchronously acquires camera images. Step 3: the motion state of the most dangerous target is judged from the radar information, its type is judged from the radar data and the camera images of that target, the industrial personal computer transmits an action instruction to the navigation box, and the navigation box controls the agricultural machine to act accordingly. While the agricultural machine works, its running speed is uniform. The method identifies obstacles in front of the agricultural machine with high accuracy and improves the reliability of unmanned driving.

Description

Farmland environment sensing method for unmanned agricultural machinery
Technical Field
The invention relates to an environment perception method, in particular to a farmland environment perception method for unmanned driving of agricultural machinery.
Background
Precision agriculture is regarded as the frontier of agricultural science and technology development in the 21st century and is one of the modern agricultural production management technologies with the highest technological content and the strongest integration. Precision agriculture implements a set of modern farming operations and management in a positioned, timed and quantified manner according to spatial variation; it is a new agricultural technology that comprehensively combines information technology with agricultural production.
The rapid development and application of precision agriculture can fully tap the maximum production potential of farmland, use water and fertilizer resources rationally, reduce environmental pollution, and greatly improve the yield and quality of agricultural products.
Developing precision agriculture is an effective way to address, in China's transition from traditional to modern agriculture, the problems of ensuring the total output of agricultural products, adjusting the agricultural industrial structure, improving product quality, coping with severely insufficient resources and low utilization rates, and reducing environmental pollution; it is a necessary path for the modernization, transformation and upgrading of Chinese agriculture.
Satellite navigation is one of the basic components of precision agriculture: it enables the agricultural machine to run automatically, and after the parameters are set before operation, the navigation system guides the machine into an automatic operation mode to begin straight-line farming. During automatic navigation, the farmland environment is harsh and complex; a large field may contain telegraph poles, ridges, earth mounds, livestock, and workers who may appear at any time, and these factors pose new challenges to unmanned agricultural machinery. In the prior art, satellite navigation can make an agricultural machine travel automatically in the field, but the machine cannot accurately identify obstacles in front of it, that is, it cannot perceive the farmland environment, let alone automatically stop and wait or continue driving according to the perceived environment. A driver must therefore assist in controlling the machine during operation, and a moment of inattention can cause the machine to collide with an obstacle ahead. A farmland environment sensing method thus needs to be developed so that an unmanned agricultural machine can perceive its surroundings and, once telegraph poles, ridges, earth mounds, livestock or workers appear in the field, take emergency measures such as stopping and waiting in time.
Disclosure of Invention
In view of the defects in the prior art, the invention aims to solve the technical problem that an unmanned agricultural machine cannot perceive the farmland environment, and provides a farmland environment sensing method for unmanned agricultural machinery.
The purpose of the invention is achieved as follows: a farmland environment perception method for unmanned agricultural machinery specifically comprises the following steps,
step 1: calibrating the camera to obtain the projection matrix from world coordinates to image pixel coordinates, establishing the relationship between the radar coordinate system and the world coordinate system, and converting radar coordinates into image pixel coordinates so that the radar and the camera are synchronized in space;
step 2: the industrial personal computer resolves the received millimeter-wave radar data, determines the valid targets, selects the radar's region of interest in front of the agricultural machine, determines the most dangerous target among the valid targets, and synchronously collects camera images;
step 3: the motion state of the most dangerous target is judged from the radar information, and its type is judged from the radar data and the camera images of that target; the industrial personal computer computes an action instruction for the agricultural machine from the radar data and the camera image data and transmits it to the navigation box, and the navigation box controls the agricultural machine to act accordingly;
wherein, throughout the above steps, the running speed of the agricultural machine is uniform while it works.
In order to achieve spatial synchronization between the camera and the millimeter-wave radar, the conversion of radar coordinates into image pixel coordinates in step 1 specifically comprises the following steps,
step 1.1: before the agricultural machine works, the ground is assumed to be horizontal; the millimeter-wave radar is fixedly installed on the front side of the agricultural machine on the longitudinal central axis of the vehicle, with the radar emission surface facing outwards and perpendicular to the ground. The chessboard used for calibration and the radar reflecting surface lie in the same plane, directly above the radar and perpendicular to the ground plane; the line connecting the upper-left corner of the chessboard and the center point of the radar reflecting surface is perpendicular to the ground, and the height from the upper-left corner of the chessboard to the radar mounting position is set as y0, in mm. When the camera is installed, its optical axis is parallel to the ground;
step 1.2: establish a radar coordinate system O0-X0Y0Z0 with the center of the radar as the origin; the plane of the millimeter-wave radar is determined by the X0 and Y0 axes and is perpendicular to the Z0 axis, and the Z0 axis is parallel to the ground. Establish a camera coordinate system Oc-XcYcZc with the center of the camera as the origin Oc; the plane XcOcYc is parallel to the imaging plane of the camera, and the Zc axis is the viewing optical axis of the camera, perpendicular to the imaging plane. Establish a world coordinate system Ow-XwYwZw, where Ow is the intersection of the vertical through the center of gravity of the agricultural machine with the horizontal plane, the Xw axis points horizontally to the right, perpendicular to the longitudinal central axis of the machine, Yw points horizontally forward along the longitudinal central axis, and Zw points vertically upward; the X0O0Z0 plane of the radar coordinate system is parallel to the XwOwZw plane of the world coordinate system;
step 1.3: for any point P = (x, y, z)^T in space, the relative distance between the point P and the radar is r, in m, and the relative angle is α, in degrees; the XwOwZw plane of the world coordinate system is parallel to the X0O0Z0 plane of the radar coordinate system. The radar coordinates of the obstacle P are converted into three-dimensional world coordinates; the specific conversion relationship is as follows,
    (xw, yw, zw, 1)^T = (r·sin α, r·cos α, h, 1)^T        (1-1)
where h denotes the mounting height of the radar above the ground;
step 1.4: the point where the optical axis intersects the imaging plane is the image principal point O'. The world coordinates are converted by the rotation matrix R and the translation vector s into camera coordinates (xc, yc, zc, 1)^T; with the world coordinates of an arbitrary point P written as (xw, yw, zw, 1)^T, the conversion from world coordinates to camera coordinates is as follows,
    (xc, yc, zc, 1)^T = [ R  s ; 0  1 ] · (xw, yw, zw, 1)^T        (1-2)
in formula (1-2), R is a 3 × 3 orthogonal unit matrix and s is the 3 × 1 translation vector from the world coordinate system to the camera coordinate system;
step 1.5: convert the camera coordinates (xc, yc, zc, 1)^T into image physical coordinates (x1, y1)^T; the specific conversion relationship is as follows,
    x1 = f·xc / zc
    y1 = f·yc / zc        (1-3)
in the formula (1-3), f is the focal length of the camera, and the focal length unit is mm;
step 1.6: convert the image physical coordinates (x1, y1)^T into image pixel coordinates (u, v); the specific conversion relation is as follows:
    u = x1 / dx + u0
    v = y1 / dy + v0        (1-4)
where dx and dy denote the physical size of each pixel along the horizontal and vertical axes, and u0, v0 are respectively the horizontal and vertical coordinates, in the image pixel coordinate system, of the intersection of the camera optical axis with the imaging plane; the coordinate unit is the pixel;
step 1.7: the conversion relation between the world coordinates and the image pixel coordinates is obtained according to the above formulas (1-1) to (1-4), specifically,
    zc·(u, v, 1)^T = M1 · M2 · (xw, yw, zw, 1)^T        (1-5)
where M1 is the intrinsic parameter matrix determined by f, dx, dy, u0 and v0, and M2 is the extrinsic matrix formed from R and s.
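The projection chain of steps 1.3 to 1.6 can be sketched in code. The following is a minimal illustration, not the patent's implementation: the function name, the mounting-height parameter h, and the example calibration values are assumptions made for illustration.

```python
import numpy as np

def radar_to_pixel(r, alpha_deg, h, R, s, f, dx, dy, u0, v0):
    """Project one radar detection (range r in m, azimuth alpha in degrees)
    to image pixel coordinates via world and camera coordinates.
    h is the assumed radar mounting height above the ground (m);
    R (3x3) and s (3,) are the world-to-camera rotation and translation;
    f is the focal length (m); dx, dy are the pixel sizes (m/pixel)."""
    a = np.deg2rad(alpha_deg)
    # (1-1): radar polar measurement -> world coordinates
    pw = np.array([r * np.sin(a), r * np.cos(a), h])
    # (1-2): world -> camera coordinates (rigid transform)
    pc = R @ pw + s
    # (1-3): perspective projection onto the image plane
    x1 = f * pc[0] / pc[2]
    y1 = f * pc[1] / pc[2]
    # (1-4): image physical coordinates -> image pixel coordinates
    return x1 / dx + u0, y1 / dy + v0
```

With a rotation R that maps the world's forward axis Yw onto the camera's optical axis Zc, a target straight ahead of the machine projects to the principal-point column u0, as expected.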
in order to further improve the accuracy of the resolved radar data, the resolved radar data in the step 2 excludes the false target to determine the valid target, specifically comprising the following steps,
the step 2 of determining the effective target by resolving the radar data specifically comprises the following steps,
step 2.1: resolve the data received by the radar according to the millimeter-wave radar protocol to obtain the angle α of each object ahead relative to the radar, its distance r, relative speed v and reflection intensity, and allocate a unique ID to each target;
step 2.2: filter out random noise signals to ensure the continuous validity of the radar data. Specifically, define z = [r, θ, v]^T, where z(k) is the k-th measurement output by the millimeter-wave radar, and require
    d² = (z(k) − z(k−1))^T · S · (z(k) − z(k−1)) < rs²        (2-1)
data signals that do not satisfy formula (2-1) are filtered out, where d is the weighted Euclidean distance between adjacent measurement vectors z(k) and z(k−1), S is the weighting matrix, and rs is a set threshold;
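As a sketch of the continuity test (2-1), assuming a diagonal weighting matrix S and hypothetical threshold values (none of these numbers come from the patent):

```python
import numpy as np

def is_continuous(z_k, z_km1, S, r_s):
    """Keep a measurement z(k) = [r, theta, v] only if its weighted squared
    distance to the previous measurement z(k-1) is below r_s^2 (formula 2-1);
    isolated noise spikes fail the test and are filtered out."""
    dz = np.asarray(z_k, dtype=float) - np.asarray(z_km1, dtype=float)
    d2 = dz @ S @ dz  # weighted squared Euclidean distance d^2
    return d2 < r_s ** 2
```

A measurement close to its predecessor passes; a sudden range jump of tens of metres in one scan period fails and is dropped.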
step 2.3: judge whether the target is in the lane in which the agricultural machine runs: when di ≤ ds the target is in the driving lane, otherwise it is not. Targets in the driving lane are initially selected as valid targets and are sorted and numbered from near to far; targets outside the driving lane are non-dangerous targets and are removed. Here ds is the safety distance threshold, ds = L/2 + ks, di is the distance between the target measured at sampling point i and the Z0 axis, L is the width of the plough hung on the agricultural machine, and ks is a set safety margin;
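The lane test of step 2.3 reduces to a single comparison; a minimal sketch (function and parameter names are illustrative):

```python
def in_driving_lane(d_i, plough_width, k_s):
    """A target lies in the machine's driving lane when its lateral offset
    d_i from the longitudinal (Z0) axis does not exceed ds = L/2 + ks,
    where L is the plough width and ks the safety margin (step 2.3)."""
    ds = plough_width / 2.0 + k_s
    return d_i <= ds
```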
step 2.4: carry out a validity check on the initially selected valid targets and finally determine the valid targets;
step 2.5: among the determined valid targets, take the obstacle nearest to the millimeter-wave radar as the candidate most dangerous target: if dj ≤ dmin, where dj is the distance from the agricultural machine to the valid target with ID j as measured by the millimeter-wave radar, and dmin is the distance to the nearest valid target obtained within one scanning period of the millimeter-wave radar, then the valid target with ID j is the most dangerous target;
In this design, random noise signals generated by interference, as well as noise signals, are filtered out, which improves the accuracy of the radar data resolution. By judging the driving lane of the agricultural machine, obstacle targets outside the lane are excluded and obstacles in the same lane are initially selected as valid targets; the initially selected valid targets are then checked to finally determine the valid targets, improving the accuracy of valid-target identification. The most dangerous target is determined according to the near-to-far ordering of the valid targets.
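Sorting the valid targets from near to far and taking the head (steps 2.3 and 2.5) can be sketched as follows; the dictionary layout is an assumption for illustration:

```python
def rank_targets(valid_targets):
    """Sort valid in-lane targets from near to far and take the nearest
    as the candidate most dangerous target. Each target is a dict with
    an 'id' and a relative distance 'd' in metres."""
    ranked = sorted(valid_targets, key=lambda t: t["d"])
    most_dangerous = ranked[0] if ranked else None
    return ranked, most_dangerous
```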
In order to further improve the accuracy of the determination of the valid target, the validity check of the initially selected valid target in step 2.4 specifically includes the following steps,
step 2.4.1: predict the state of each initially selected valid target. Let Sn = [dn, vn, an]^T; the state prediction equation of the initially selected valid target is,
    d(n+1,n) = dn + vn·t + (1/2)·an·t²
    v(n+1,n) = vn + an·t
    a(n+1,n) = an        (2-2)
where d(n+1,n), v(n+1,n), a(n+1,n) are the state information of the valid obstacle target predicted from the previous scanning period, dn, vn, an are respectively the relative distance, relative speed and relative acceleration of the valid obstacle target measured in the n-th detection period of the millimeter-wave radar, and t is the scanning period of the millimeter-wave radar;
step 2.4.2: compare the predicted state of the valid target in the (n+1)-th period with the state actually measured by the radar in the (n+1)-th period, specifically as follows,
    |d(n+1) − d(n+1,n)| ≤ d0
    |v(n+1) − v(n+1,n)| ≤ v0
    |a(n+1) − a(n+1,n)| ≤ a0        (2-3)
where d0, v0, a0 are the set error thresholds between the measured and predicted values of the valid obstacle target;
step 2.4.3: if a valid obstacle target is detected continuously in more than m scanning periods of the radar, and the valid target satisfying formula (2-3) in step 2.4.2 is consistent with the initially selected valid target, the relative distance, relative speed, relative angle and number information of the target are updated. Otherwise, the initially selected valid target is not among the detection targets of the millimeter-wave radar; it is then tracked using the valid-target prediction information, and if it is still not detected in the next scanning period of the radar, the corresponding initially selected valid-target information is no longer used, the valid-target information is updated, and the method returns to step 2.4.1 to execute in a loop;
In this design, whether the valid-target information is consistent is judged by comparing the state information of the valid target predicted from the previous scan with the measured valid target; this further eliminates false targets and further guarantees the determination of the valid targets.
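The prediction and gating of steps 2.4.1 and 2.4.2 can be sketched as two small functions (a constant-acceleration model over one scan period t; the names are illustrative):

```python
def predict_state(d_n, v_n, a_n, t):
    """Formula (2-2): predict relative distance, speed and acceleration
    one radar scan period t ahead under constant acceleration."""
    return (d_n + v_n * t + 0.5 * a_n * t * t, v_n + a_n * t, a_n)

def is_same_target(measured, predicted, d0, v0, a0):
    """Formula (2-3): the new measurement matches the prediction when each
    component error stays within the set thresholds d0, v0, a0."""
    return (abs(measured[0] - predicted[0]) <= d0
            and abs(measured[1] - predicted[1]) <= v0
            and abs(measured[2] - predicted[2]) <= a0)
```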
As a further improvement of the present invention, the step 3 of determining the dynamic and static states of the most dangerous target specifically comprises the following steps,
step 3.1: according to the most dangerous target determined in step 2.5, continuously update its relative speed and relative distance information, and judge whether the distance between the most dangerous target and the radar is within the parking safety distance range, namely
    zwd > zmin        (3-1)
where zwd is the relative distance from the radar to the most dangerous target detected by the millimeter-wave radar, and zmin is the set parking safety distance threshold; when the most dangerous target satisfies formula (3-1), the agricultural machine continues to run, otherwise the navigation box adjusts the action of the agricultural machine;
step 3.2: judge the dynamic or static state of the most dangerous target according to the relative speed, specifically as follows,
    v ≠ v_vehicle        (3-2)
when formula (3-2) holds continuously for at least two consecutive scanning periods, the target is judged to be dynamic; otherwise, the agricultural machine continues to run and the method returns to step 3.1 to execute in a loop. Here v is the speed of the target relative to the radar and v_vehicle is the running speed of the agricultural machine;
In this design, the principle for judging whether the most dangerous target is dynamic or static is simple, which improves the response speed.
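The dynamic/static test of step 3.2 can be sketched as below; the tolerance eps is an assumption added for noisy measurements and is not part of the patent:

```python
def target_is_dynamic(relative_speeds, v_vehicle, min_periods=2, eps=0.1):
    """With the machine at constant speed, a stationary obstacle shows a
    relative speed equal to v_vehicle; if formula (3-2) holds (relative
    speed differs from v_vehicle) in at least min_periods consecutive
    scan periods, the target is judged dynamic."""
    consecutive = 0
    for v in relative_speeds:
        consecutive = consecutive + 1 if abs(v - v_vehicle) > eps else 0
        if consecutive >= min_periods:
            return True
    return False
```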
As a further improvement of the present invention, the step 3 of determining the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera specifically includes the following steps,
step 3.1a: if the most dangerous target remains static, the navigation box controls the agricultural machine to stop and wait; otherwise, the camera identifies the most dangerous target;
step 3.2a: the camera acquires the image of the most dangerous target, matches and compares it with a trained human-body sample training library, and outputs the target recognition result;
step 3.3: the navigation box controls the agricultural machine according to the output recognition result. If the target is not a human body, the navigation box sounds an audible and visual alarm and controls the agricultural machine to stop and wait; if the target recognition result is a human body, the navigation box sounds an audible and visual alarm and judges whether the human body leaves the driving lane of the agricultural machine or moves away from the agricultural machine, using the following formulas,
    zw(n+1) > zwn        (3-3)
    di > ds        (3-4)
if the human target detected by the radar satisfies (3-3) or (3-4), the agricultural machine continues to drive forward; otherwise, the navigation box controls it to stop and wait. Here zwn is the distance from the radar to the most dangerous target in the n-th detection scanning period, and zw(n+1) is that distance in the next scanning period;
In this design, the dynamic or static state of the most dangerous target is judged first. If the most dangerous target remains static, it is considered a non-living body such as a telegraph pole or a tree; otherwise, it is considered a farm worker or livestock. The camera collects the image of the most dangerous target, identifies whether it is a human body, and outputs the recognition result. If it is a human body, the navigation box sounds an audible and visual alarm; because workers have a hazard-avoidance instinct, on hearing the alarm of the agricultural machine they will leave its driving lane or walk away from its direction of motion. The judgment procedure is built around this habitual reaction of workers, so the adaptability is good: workers in front of the machine are reminded to avoid it, collisions with non-human obstacles such as telegraph poles and with livestock are avoided, and the machine continues driving or stops and waits according to the behavior of the worker.
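The decision flow of step 3 (stop for static targets, alarm and stop for non-human dynamic targets, alarm then re-check (3-3)/(3-4) for humans) can be sketched as a single function; the return labels are illustrative, not the patent's command set:

```python
def navigation_action(is_static, is_human, zw_prev, zw_curr, d_i, ds):
    """Decide the navigation-box action for the most dangerous target:
    static -> stop and wait; dynamic non-human -> alarm and stop;
    dynamic human -> alarm, then continue only if the person is moving
    away (zw_curr > zw_prev, formula 3-3) or has left the driving lane
    (d_i > ds, formula 3-4)."""
    if is_static:
        return "stop"
    if not is_human:
        return "alarm+stop"
    if zw_curr > zw_prev or d_i > ds:
        return "alarm+continue"
    return "alarm+stop"
```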
Compared with the prior art, the method combines the millimeter-wave radar and the camera to perceive the farmland environment. The camera and the millimeter-wave radar are spatially synchronized through the conversion between the world coordinate system and the image pixel coordinate system, and random noise generated by noise and interference signals is filtered out, improving the accuracy of the radar detection signals. The driving lane of the agricultural machine is determined from its set course; obstacle targets in the lane are initially selected as valid targets and then further checked to finally determine the valid targets, improving the validity and accuracy with which the radar perceives obstacle targets in the same lane. The most dangerous target is selected and tracked, and the camera identifies it on the basis of its dynamic or static state: if the most dangerous target is dynamic, only whether the moving object is a human body is identified, without identifying a specific type, which reduces the computation load and improves the response speed; the navigation box controls the action of the agricultural machine according to the image recognition result, so that the machine avoids colliding with obstacles during unmanned driving. If the recognition result is a human body, the navigation box sounds an audible and visual alarm to remind the worker to avoid the machine; exploiting the worker's habitual reaction, it continues to detect whether the human body leaves the driving lane or moves away from the machine, and controls the machine to stop and wait according to the detection result, so the adaptability is good.
Drawings
The invention will be further described with reference to the following description and embodiments in conjunction with the accompanying drawings:
FIG. 1 is a flow chart of a method for sensing farmland environment based on a millimeter wave radar and a camera.
FIG. 2 is a schematic diagram of the relationship between a radar coordinate system and a world coordinate system according to the present invention.
Fig. 3 is a schematic diagram of the relationship between the camera coordinate system and the world coordinate system in the present invention.
FIG. 4 is a diagram illustrating the relationship between the camera coordinate system and the image physical coordinate system.
FIG. 5 is a diagram illustrating a relationship between an image physical coordinate system and an image pixel coordinate system according to the present invention.
FIG. 6 is a schematic view of the agricultural environment during the driving of the agricultural machine according to the present invention.
FIG. 7 is a schematic view of lane identification during the driving of the agricultural machinery of the present invention.
Fig. 8 is a flow chart of the present invention for checking the initially selected valid target to further determine the valid target.
Detailed Description
The invention will be further described with reference to the accompanying drawings.
As shown in fig. 1 to 8, a farmland environment sensing method for unmanned agricultural machinery specifically comprises the following steps,
step 1: calibrating the camera to obtain the projection matrix from world coordinates to image pixel coordinates, establishing the relationship between the radar coordinate system and the world coordinate system, and converting radar coordinates into image pixel coordinates so that the radar and the camera are synchronized in space;
step 2: the industrial personal computer resolves the received millimeter-wave radar data, determines the valid targets, selects the radar's region of interest in front of the agricultural machine, determines the most dangerous target among the valid targets, and synchronously collects camera images;
step 3: the motion state of the most dangerous target is judged from the radar information, and its type is judged from the radar data and the camera images of that target; the industrial personal computer computes an action instruction for the agricultural machine from the radar data and the camera image data and transmits it to the navigation box, and the navigation box controls the agricultural machine to act accordingly;
wherein, throughout the above steps, the running speed of the agricultural machine is uniform while it works.
In order to achieve the spatial synchronization between the camera and the millimeter-wave radar, as shown in fig. 2 to 5, the conversion of radar coordinates into image pixel coordinates in step 1 specifically comprises the following steps,
step 1.1: before the agricultural machine works, the ground is assumed to be horizontal; the millimeter-wave radar is fixedly installed on the front side of the agricultural machine on the longitudinal central axis of the vehicle, with the radar emission surface facing outwards and perpendicular to the ground. The chessboard used for calibration and the radar reflecting surface lie in the same plane, directly above the radar and perpendicular to the ground plane; the line connecting the upper-left corner of the chessboard and the center point of the radar reflecting surface is perpendicular to the ground, and the height from the upper-left corner of the chessboard to the radar mounting position is set as y0, in mm. When the camera is installed, its optical axis is parallel to the ground;
step 1.2: a radar coordinate system O0-X0Y0Z0 is established with the center of the radar as the origin; the plane of the millimeter wave radar is determined by the X0 axis and the Y0 axis and is perpendicular to the Z0 axis, and the Z0 axis is parallel to the ground; a camera coordinate system Oc-XcYcZc is established with the center of the camera as the origin Oc, the plane XcOcYc parallel to the imaging plane of the camera, and the Zc axis being the viewfinder optical axis of the camera, perpendicular to the imaging plane (X1O1Y1); a world coordinate system Ow-XwYwZw is established, where Ow is the intersection point of the center of gravity of the agricultural machine with the horizontal plane, the Xw axis points horizontally rightwards, perpendicular to the longitudinal center axis of the agricultural machine, the Yw axis points horizontally forwards along the longitudinal center axis, and the Zw axis points vertically upwards from the horizontal plane; the X0O0Z0 plane of the radar coordinate system is parallel to the XwOwZw plane of the world coordinate system;
step 1.3: for any point P = (x, y, z)^T in space, the relative distance between the point P and the radar is r = |PO0|, in m, and the relative angle is α, in degrees; the XwOwZw plane of the world coordinate system is parallel to the X0O0Z0 plane of the radar coordinate system; the radar coordinates of the obstacle P are converted into three-dimensional world coordinates (as shown in fig. 2), with the specific conversion relationship

    xw = r·sin α,  yw = r·cos α,  zw = h0    (1-1)

where h0 is the installation height of the radar above the ground plane;
step 1.4: the point where the optical axis intersects the imaging plane is the image principal point O'; the world coordinates are converted by the rotation matrix R and the translation vector s to obtain the camera coordinates (xc, yc, zc, 1)^T; with the world coordinates of an arbitrary point P written (xw, yw, zw, 1)^T, the world coordinates are converted into camera coordinates as follows (as shown in fig. 3),

    [xc, yc, zc, 1]^T = [ R  s ; 0  1 ] · [xw, yw, zw, 1]^T    (1-2)

in the formula (1-2), R is an orthogonal unit matrix of three rows and three columns, and s is the 3 x 1 translation vector from the world coordinate system to the camera coordinate system;
step 1.5: the camera coordinates (xc, yc, zc, 1)^T are converted to image physical coordinates (x1, y1)^T, with the specific conversion relationship

    x1 = f·xc / zc,  y1 = f·yc / zc    (1-3)

as shown in fig. 4, the plane X1O1Y1 is the imaging plane and O'X'Y' is a virtual imaging plane; the two planes are point-symmetric about Oc, and the distance O1Oc is the focal length f of the camera lens, in mm; the point P with camera coordinates (xc, yc, zc) projects onto the imaging plane at P1(x1, y1) in the image physical coordinate system, and the point P' is point-symmetric to P1 about Oc; the conversion formula (1-3) follows from this geometrical relationship;
step 1.6: the image physical coordinates (x1, y1)^T are converted to image pixel coordinates (u, v), with the specific conversion relationship:

    u = x1/dx + u0,  v = y1/dy + v0    (1-4)

the image pixel coordinate system uses the pixel as its unit, which differs from the unit of the image physical coordinate system; as shown in fig. 5, an image pixel coordinate system UOV is established with its origin at the upper left corner of the image, U representing the horizontal axis of the image and V the vertical axis; the point O1(u0, v0) is the intersection of the optical axis Zc of the camera with the imaging plane X1O1Y1, which yields the conversion formula (1-4) between the image physical coordinate system and the image pixel coordinate system;
wherein dx and dy respectively represent the unit size of each pixel on the horizontal and vertical axes, and u0 and v0 are respectively the horizontal and vertical coordinates, in pixels, of the intersection point of the camera optical axis with the imaging plane in the image pixel coordinate system;
step 1.7: the conversion relation between world coordinates and image pixel coordinates is obtained from formulas (1-1) to (1-4), specifically,

    zc·[u, v, 1]^T = [ f/dx  0  u0 ; 0  f/dy  v0 ; 0  0  1 ] · [ R  s ] · [xw, yw, zw, 1]^T    (1-5)
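The chain of formulas (1-1) to (1-4) can be sketched in code. This is a minimal illustration, not the patent's implementation: the focal length, pixel sizes, principal point, radar mounting height and the extrinsics R, s below are all assumed example values, and the rotation simply re-axes the world frame (x right, y forward, z up) into a camera frame (x right, y down, z along the optical axis).

```python
import numpy as np

# Assumed example parameters (the patent gives no numeric values).
F_MM = 8.0              # lens focal length f, in mm
DX = DY = 0.005         # pixel sizes dx, dy, in mm/pixel
U0, V0 = 320.0, 240.0   # principal point (u0, v0), in pixels
H_RADAR = 0.5           # assumed radar mounting height above ground, in m
# Rotation R: world (x right, y forward, z up) -> camera (x right, y down,
# z along the optical axis); translation s assumed zero for simplicity.
R = np.array([[1.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
S = np.zeros(3)

def radar_to_pixel(r, alpha_deg):
    """Project a radar target at range r (m), azimuth alpha (deg) to (u, v)."""
    a = np.radians(alpha_deg)
    pw = np.array([r * np.sin(a), r * np.cos(a), H_RADAR])  # (1-1) world
    pc = R @ pw + S                                         # (1-2) camera
    x1 = F_MM * pc[0] / pc[2]                               # (1-3) image, mm
    y1 = F_MM * pc[1] / pc[2]
    return x1 / DX + U0, y1 / DY + V0                       # (1-4) pixels
```

With these assumed parameters a target 10 m straight ahead projects to roughly (320, 160): laterally on the principal point, and above image center because the radar sits below the camera's optical axis.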
the step 2 of determining the effective target by resolving the radar data specifically comprises the following steps,
step 2.1: the data received from the radar are resolved according to the millimeter wave radar protocol to obtain the angle α of each object ahead relative to the radar, its distance r, its relative speed v and its reflection intensity, and a unique ID is allocated to each target;
step 2.2: random noise signals are filtered out to ensure the continuous validity of the radar data; specifically, define z = [r, θ, v]^T, with z(k) the measurement value of the k-th output of the millimeter wave radar; measurements must satisfy

    d^2 = (z(k) - z(k-1))^T · S · (z(k) - z(k-1)) < rs^2    (2-1)

and data signals which do not conform to formula (2-1) are filtered out; here d is the weighted Euclidean distance between adjacent measurement vectors z(k) and z(k-1), S is the weighting matrix, and rs is a set threshold;
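The consistency gate of formula (2-1) can be sketched as follows. The weighting matrix and threshold are assumed example values; the patent does not specify them.

```python
import numpy as np

# Assumed example weighting matrix S and threshold rs for z = [r, theta, v].
S_W = np.diag([1.0, 4.0, 2.0])
RS = 3.0

def keep_measurement(z_k, z_prev):
    """Keep z(k) = [r, theta, v] only if its weighted squared distance to
    z(k-1), d^2 = dz^T S dz, stays below rs^2 (formula (2-1))."""
    dz = np.asarray(z_k, float) - np.asarray(z_prev, float)
    return float(dz @ S_W @ dz) < RS ** 2
```

A measurement that jumps far from its predecessor (e.g. a range spike from interference) fails the gate and is discarded as noise.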
step 2.3: whether a target is in the lane in which the agricultural machine runs is judged: when di ≤ ds, the target is in the driving lane, otherwise it is not; targets in the driving lane are primarily selected as effective targets, and targets outside the driving lane are false targets; here ds is the safety distance threshold, ds = L/2 + ks, di is the distance between the target measured at sampling point i and the Z0 axis, L is the width of the plough hung on the agricultural machine, and ks is a set safety margin;
for example, as can be seen in fig. 6, the lateral distance of the 2 obstacles B, C from the center of the agricultural machine is greater than ds, so they lie outside the driving lane of the agricultural machine; the lateral distance of the 2 obstacles A, D from the center of the agricultural machine is less than ds, so they lie in the driving lane, and A and D are primarily selected as effective targets;
fig. 7 shows an obstacle E in the driving lane: the distance of E from the center O of the agricultural machine is less than L/2 + ks, so E is in the driving lane of the agricultural machine;
step 2.4: carrying out validity check on the initially selected valid target, and finally determining the valid target;
step 2.5: from the determined effective targets, the obstacle at the closest distance obtained by the millimeter wave radar is determined as the most dangerous target: let dj be the distance from the agricultural machine to the effective target with ID j obtained by the millimeter wave radar, and dmin the distance to the closest effective target obtained in one scanning period of the millimeter wave radar; if dj ≤ dmin, the effective target with ID j is the most dangerous target;
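Steps 2.3 and 2.5 together amount to a lateral gate followed by a nearest-target selection, which can be sketched as below. The plough width L and margin ks are assumed example values.

```python
# Assumed example values for the lane gate d_i <= d_s with d_s = L/2 + k_s.
L_PLOUGH = 3.0                 # plough width L, m (assumed)
KS = 0.5                       # safety margin ks, m (assumed)
DS = L_PLOUGH / 2.0 + KS       # safety distance threshold ds = 2.0 m

def most_dangerous(targets):
    """targets: iterable of (target_id, lateral_m, range_m) tuples.
    Keep only targets within DS of the longitudinal axis (step 2.3),
    then return the in-lane tuple with the smallest range (step 2.5),
    or None if the lane is clear."""
    in_lane = [t for t in targets if abs(t[1]) <= DS]
    return min(in_lane, key=lambda t: t[2], default=None)
```

For instance, an obstacle 4 m ahead but 5 m off-axis is ignored, while one 8 m ahead and 1 m off-axis becomes the most dangerous target.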
in order to further improve the accuracy of the determination of the valid target, the validity check of the initially selected valid target in step 2.4 specifically includes the following steps,
step 2.4.1: the primarily selected effective target is predicted; with the state Sn = [dn, vn, an], the state prediction equation of the primarily selected effective target is

    d(n+1,n) = dn + vn·t + (1/2)·an·t^2
    v(n+1,n) = vn + an·t                        (2-2)
    a(n+1,n) = an

where d(n+1,n), v(n+1,n) and a(n+1,n) are the state information of the effective obstacle target predicted from the previous scanning cycle, dn, vn and an respectively represent the relative distance, relative speed and relative acceleration of the effective obstacle target measured in the n-th detection period of the millimeter wave radar, and t is the scanning period of the millimeter wave radar;
step 2.4.2: the predicted state information of the (n+1)-th cycle effective target is compared with the state information of the (n+1)-th cycle effective target actually measured by the radar, specifically

    |d(n+1) - d(n+1,n)| < d0
    |v(n+1) - v(n+1,n)| < v0                    (2-3)
    |a(n+1) - a(n+1,n)| < a0

where d0, v0 and a0 are the set error thresholds between the effective obstacle target measurement values and the predicted values;
step 2.4.3: if the effective obstacle target is detected continuously for more than m radar scanning periods and the effective target satisfying formula (2-3) in step 2.4.2 is consistent with the primarily selected effective target, the relative distance, relative speed, relative angle and number information of the target are updated; otherwise the primarily selected effective target is no longer among the detection targets of the millimeter wave radar and is tracked using the effective target prediction information; if it is still not detected in the next scanning period of the radar, use of the corresponding primarily selected effective target information is stopped, the effective target information is updated, and execution returns cyclically to step 2.4.1;
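The validity check of steps 2.4.1 and 2.4.2 is a constant-acceleration prediction followed by a per-component gate, which can be sketched as below. The scan period t and the thresholds d0, v0, a0 are assumed example values.

```python
# Assumed example scan period and gate thresholds for formulas (2-2)/(2-3).
T = 0.05                        # scan period t, s (assumed)
D0, V0, A0 = 0.5, 0.5, 1.0      # error thresholds d0, v0, a0 (assumed)

def predict(d, v, a, t=T):
    """(2-2): propagate the state [d, v, a] one scan period ahead under a
    constant-acceleration model."""
    return d + v * t + 0.5 * a * t * t, v + a * t, a

def is_same_target(measured, predicted):
    """(2-3): accept the (n+1)-th measurement as the tracked target only if
    every component error stays inside its threshold."""
    (dm, vm, am), (dp, vp, ap) = measured, predicted
    return abs(dm - dp) < D0 and abs(vm - vp) < V0 and abs(am - ap) < A0
```

A target closing at 2 m/s from 10 m is predicted at 9.9 m after one 50 ms scan; a measurement near that prediction passes the gate, while a 15 m return is rejected as a different target.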
the step 3 of judging the dynamic and static states of the most dangerous target specifically comprises the following steps,
step 3.1: the relative speed and relative distance information of the most dangerous target determined in step 2.5 is continuously updated, and whether the distance between the most dangerous target and the radar is within the parking distance range is judged, i.e.

    zwd > zmin    (3-1)

where zwd is the relative distance from the radar to the most dangerous target detected by the millimeter wave radar and zmin is the set parking distance threshold; while the most dangerous target satisfies formula (3-1), the agricultural machine continues to run;
step 3.2: the motion state of the most dangerous target is determined from the relative speed, specifically

    v ≠ v_vehicle    (3-2)

when formula (3-2) is satisfied throughout a scanning period, the state of the target is judged to be dynamic; otherwise the agricultural machine continues to run and execution returns cyclically to step 3.1; here v is the speed of the target relative to the radar and v_vehicle is the running speed of the agricultural machine;
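Steps 3.1 and 3.2 can be sketched as a small classifier. The parking threshold is an assumed value, and a speed tolerance EPS is added as an assumption: the patent's strict inequality (3-2) would flag almost every target as dynamic under real measurement noise, since a stationary obstacle's relative speed only equals the vehicle speed approximately.

```python
# Assumed example thresholds for the stop-range check (3-1) and the
# dynamic/static test (3-2).
Z_MIN = 5.0    # parking distance threshold z_min, m (assumed)
EPS = 0.2      # speed tolerance, m/s (assumed; the patent uses a strict !=)

def classify(z_wd, v_rel, v_vehicle):
    """Return 'keep_driving' while (3-1) holds; otherwise judge the most
    dangerous target 'dynamic' or 'static' from the relative speed (3-2)."""
    if z_wd > Z_MIN:
        return "keep_driving"              # (3-1): still outside stop range
    if abs(abs(v_rel) - v_vehicle) > EPS:
        return "dynamic"                   # (3-2): relative speed differs
    return "static"
```

An obstacle 3 m ahead whose relative speed magnitude matches the vehicle speed is judged static (the machine is closing on a fixed object); one with zero relative speed is moving with the machine and is judged dynamic.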
the step 3 of judging the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera specifically comprises the following steps,
step 3.1 a: if the most dangerous target is static all the time, the navigation box controls the agricultural machinery to stop for waiting treatment; otherwise, the camera identifies the most dangerous target;
step 3.2 a: the camera acquires the image of the most dangerous target, matches and compares the image with a trained human body sample training library, and outputs a target identification result;
step 3.3 a: the navigation box controls the action of the agricultural machine according to the output target recognition result; if the target is not a human body, the navigation box gives an audible and visual alarm and controls the agricultural machine to stop and wait for processing; if the target recognition result is a human body, the navigation box gives an audible and visual alarm, and whether the human body has left the driving lane of the agricultural machine or is moving away from the agricultural machine is judged with the following formulas,
    zw(n+1) > zwn    (3-3)
    di > ds          (3-4)

if the human body target detected by the radar satisfies (3-3) or (3-4), the agricultural machine continues to drive forwards; otherwise the navigation box controls the agricultural machine to stop and wait for processing; here zwn is the distance from the radar to the most dangerous target in the n-th detection scanning cycle, and zw(n+1) is the distance from the radar to the most dangerous target in the next scanning cycle.
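The human-target decision of step 3.3a reduces to the two tests (3-3) and (3-4), sketched below with an assumed lane threshold.

```python
# Assumed example lane threshold ds for the step 3.3a decision.
DS = 2.0   # safety distance threshold ds, m (assumed)

def action(z_prev, z_next, d_lateral):
    """After a human body is recognized: continue if the range to the person
    is increasing, (3-3), or the person is outside the lane, (3-4);
    otherwise stop and wait.
    z_prev/z_next: ranges in consecutive scans; d_lateral: lateral offset
    d_i from the machine's longitudinal axis."""
    if z_next > z_prev or abs(d_lateral) > DS:
        return "continue"
    return "stop_and_wait"
```

A person walking away (range growing scan to scan) or stepping aside (lateral offset beyond ds) lets the machine proceed; anyone holding position in the lane forces a stop.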
Compared with the prior art, the method combines a millimeter wave radar and a camera to sense the farmland environment. The camera and the millimeter wave radar are spatially synchronized through the conversion between the world coordinate system and the image pixel coordinate system, and random noise signals produced by noise and interference are filtered out, improving the accuracy of the radar detection signals. The driving lane of the agricultural machine is determined from its set course; obstacle targets in the driving lane are primarily selected as effective targets, and the primarily selected effective targets are then checked for validity to finally determine the effective targets, improving the validity and accuracy with which the radar senses obstacle targets in the same lane. The most dangerous target is selected and tracked, and the camera identifies it on the basis of its dynamic or static state; if the most dangerous target is dynamic, only whether the moving object is a human body is identified, without identifying a specific type, which reduces the amount of computation and improves the response speed, and the navigation box controls the action of the agricultural machine according to the image recognition result so that the unmanned agricultural machine avoids colliding with obstacles. If the recognition result is a human body, the navigation box gives an audible and visual alarm to remind the worker to avoid the agricultural machine; exploiting the worker's habitual behaviour, the system continues to detect whether the human body has left the driving lane or is moving away from the agricultural machine, and the navigation box controls the agricultural machine to stop and wait according to the detection result, giving the method good adaptability.
The present invention is not limited to the above embodiments; based on the technical solutions disclosed herein, those skilled in the art can make substitutions and modifications to some technical features without creative effort, and all such substitutions and modifications fall within the protection scope of the present invention.

Claims (4)

1. A farmland environment perception method for unmanned agricultural machinery is characterized by comprising the following steps,
step 1: calibrating a camera to obtain a projection matrix from a world coordinate to a graphic pixel coordinate, establishing a mutual relation between a radar coordinate system and the world coordinate system, and converting the radar coordinate into the graphic pixel coordinate to enable the radar and the camera to be synchronous in space;
step 2: the industrial personal computer resolves the received millimeter wave radar data, determines effective targets, selects the radar's region of interest in front of the agricultural machine, determines the most dangerous target among the effective targets, and synchronously collects camera images;
and step 3: the motion state of the most dangerous target is judged from the radar information, and its type is judged from the image data of the most dangerous target collected by the radar and the camera; the industrial personal computer calculates an action instruction for the agricultural machine from the radar data and the camera image data and transmits it to the navigation box, and the navigation box controls the agricultural machine to perform the corresponding action;
the determination of the dynamic and static states of the most dangerous target specifically comprises the following steps,
step 3.1: the relative speed and relative distance information of the most dangerous target determined in step 2.5 is continuously updated, and whether the distance between the most dangerous target and the radar is within the parking safety distance range is judged, i.e.

    zwd > zmin    (3-1)

where zwd is the relative distance from the radar to the most dangerous target detected by the millimeter wave radar and zmin is the set parking safety distance threshold; while the most dangerous target satisfies formula (3-1), the agricultural machine continues to run, otherwise the navigation box adjusts the action of the agricultural machine;
step 3.2: the dynamic or static state of the most dangerous target is judged from the relative speed, with

    v ≠ v_vehicle    (3-2)

when formula (3-2) is satisfied throughout at least two consecutive scanning periods, the state of the target is determined to be dynamic; otherwise the agricultural machine continues to run and execution returns cyclically to step 3.1; here v is the speed of the target relative to the radar and v_vehicle is the running speed of the agricultural machine,
the method for judging the type of the most dangerous target according to the image data of the most dangerous target collected by the radar and the camera specifically comprises the following steps,
step 3.1 a: if the most dangerous target is static all the time, the navigation box controls the agricultural machinery to stop for waiting treatment; otherwise, the camera identifies the most dangerous target;
step 3.2 a: the camera acquires the image of the most dangerous target, matches and compares the image with a trained human body sample training library, and outputs a target identification result;
step 3.3 a: the navigation box controls the action of the agricultural machine according to the output target recognition result; if the target is not a human body, the navigation box gives an audible and visual alarm and controls the agricultural machine to stop and wait for processing; if the target recognition result is a human body, the navigation box gives an audible and visual alarm, and whether the human body has left the driving lane of the agricultural machine or is moving away from the agricultural machine is judged with the following formulas,
    zw(n+1) > zwn    (3-3)
    di > ds          (3-4)

if the human body target detected by the radar satisfies (3-3) or (3-4), the agricultural machine continues to drive forwards; otherwise the navigation box controls the agricultural machine to stop and wait for processing; zwn is the distance from the radar to the most dangerous target in the n-th detection scanning cycle, and zw(n+1) is the distance from the radar to the most dangerous target in the next scanning period;
wherein, before step 3, it is assumed that the agricultural machine runs at a uniform speed during operation.
2. The agricultural unmanned farm field environment sensing method according to claim 1, wherein the step 1 of converting radar coordinates into graphic pixel coordinates specifically comprises the steps of,
step 1.1: before the agricultural machine works, the ground is assumed to be horizontal; the millimeter wave radar is fixedly installed on the front side of the agricultural machine, on the longitudinal center axis of the vehicle, with its emission surface facing outwards so that the emission surface is perpendicular to the ground; a chessboard used for calibration and the radar reflecting surface lie in the same plane, directly above the radar and perpendicular to the ground plane; the line connecting the upper left corner of the chessboard and the center point of the radar reflecting surface is vertical to the ground, and the height from the upper left corner of the chessboard to the radar installation position is determined as y0, in mm; the camera is installed with its optical axis parallel to the ground;
step 1.2: a radar coordinate system O0-X0Y0Z0 is established with the center of the radar as the origin; the plane of the millimeter wave radar is determined by the X0 axis and the Y0 axis and is perpendicular to the Z0 axis, and the Z0 axis is parallel to the ground; a camera coordinate system Oc-XcYcZc is established with the center of the camera as the origin Oc, the plane XcOcYc parallel to the imaging plane of the camera, and the Zc axis being the viewfinder optical axis of the camera, perpendicular to the imaging plane; a world coordinate system Ow-XwYwZw is established, where Ow is the intersection point of the center of gravity of the agricultural machine with the horizontal plane, the Xw axis points horizontally rightwards, perpendicular to the longitudinal center axis of the agricultural machine, the Yw axis points horizontally forwards along the longitudinal center axis, and the Zw axis points vertically upwards from the horizontal plane; the X0O0Z0 plane of the radar coordinate system is parallel to the XwOwZw plane of the world coordinate system;
step 1.3: for any point P = (x, y, z)^T in space, the relative distance between the point P and the radar is r, in m, and the relative angle is α, in degrees; the XwOwZw plane of the world coordinate system is parallel to the X0O0Z0 plane of the radar coordinate system; the radar coordinates of the obstacle P are converted into three-dimensional world coordinates, with the specific conversion relationship

    xw = r·sin α,  yw = r·cos α,  zw = h0    (1-1)

where h0 is the installation height of the radar above the ground plane;
step 1.4: the point where the optical axis intersects the imaging plane is the image principal point O'; the world coordinates are converted by the rotation matrix R and the translation vector s to obtain the camera coordinates (xc, yc, zc, 1)^T; with the world coordinates of an arbitrary point P written (xw, yw, zw, 1)^T, the world coordinates are converted into camera coordinates as follows,

    [xc, yc, zc, 1]^T = [ R  s ; 0  1 ] · [xw, yw, zw, 1]^T    (1-2)

in the formula (1-2), R is an orthogonal unit matrix of three rows and three columns, and s is the 3 x 1 translation vector from the world coordinate system to the camera coordinate system;
step 1.5: the camera coordinates (xc, yc, zc, 1)^T are converted to image physical coordinates (x1, y1)^T, with the specific conversion relationship

    x1 = f·xc / zc,  y1 = f·yc / zc    (1-3)

in the formula (1-3), f is the focal length of the camera, in mm;
step 1.6: the image physical coordinates (x1, y1)^T are converted to image pixel coordinates (u, v), with the specific conversion relationship:

    u = x1/dx + u0,  v = y1/dy + v0    (1-4)

where dx and dy denote the unit size of each pixel on the horizontal and vertical axes, and u0 and v0 are respectively the horizontal and vertical coordinates, in pixels, of the intersection point of the camera optical axis with the imaging plane in the image pixel coordinate system;
step 1.7: the conversion relation between the world coordinates and the image pixel coordinates is obtained from formulas (1-1) to (1-4), specifically,

    zc·[u, v, 1]^T = [ f/dx  0  u0 ; 0  f/dy  v0 ; 0  0  1 ] · [ R  s ] · [xw, yw, zw, 1]^T    (1-5).
3. the agricultural unmanned farm field environment sensing method according to claim 2, wherein the step 2 of determining the effective target by the resolved radar data comprises the following steps,
step 2.1: the data received from the radar are resolved according to the millimeter wave radar protocol to obtain the angle α of each object ahead relative to the radar, its distance r, its relative speed v and its reflection intensity, and a unique ID is allocated to each target;
step 2.2: random noise signals are filtered out to ensure the continuous validity of the radar data; specifically, define z = [r, θ, v]^T, with z(k) the measurement value of the k-th output of the millimeter wave radar; measurements must satisfy

    d^2 = (z(k) - z(k-1))^T · S · (z(k) - z(k-1)) < rs^2    (2-1)

and data signals which do not conform to formula (2-1) are filtered out; here d is the weighted Euclidean distance between adjacent measurement vectors z(k) and z(k-1), S is the weighting matrix, and rs is a set threshold;
step 2.3: whether a target is in the lane in which the agricultural machine runs is judged: when di ≤ ds, the target is in the driving lane, otherwise it is not; targets in the driving lane are primarily selected as effective targets and are sorted and numbered from near to far; targets outside the driving lane are non-dangerous targets and are removed; here ds is the safety distance threshold, ds = L/2 + ks, di is the distance between the target measured at sampling point i and the Z0 axis, L is the width of the plough hung on the agricultural machine and is larger than the width of the body of the agricultural machine, and ks is a set safety margin;
step 2.4: carrying out validity check on the initially selected valid target, and finally determining the valid target;
step 2.5: from the determined effective targets, the obstacle at the closest distance obtained by the millimeter wave radar is determined as the most dangerous target: let dj be the distance from the agricultural machine to the effective target with ID j obtained by the millimeter wave radar, and dmin the distance to the closest effective target obtained in one scanning period of the millimeter wave radar; if dj ≤ dmin, the effective target with ID j is the most dangerous target.
4. The method as claimed in claim 3, wherein the step 2.4 of checking the validity of the initially selected valid target includes the following steps,
step 2.4.1: the primarily selected effective target is predicted; with the state Sn = [dn, vn, an], the state prediction equation of the primarily selected effective target is

    d(n+1,n) = dn + vn·t + (1/2)·an·t^2
    v(n+1,n) = vn + an·t                        (2-2)
    a(n+1,n) = an

where d(n+1,n), v(n+1,n) and a(n+1,n) are the state information of the effective obstacle target predicted from the previous scanning cycle, and dn, vn and an respectively represent the relative distance, relative speed and relative acceleration of the effective obstacle target measured in the n-th detection period of the millimeter wave radar, with t the scanning period of the millimeter wave radar;
step 2.4.2: the predicted state information of the (n+1)-th cycle effective target is compared with the state information of the (n+1)-th cycle effective target actually measured by the radar, specifically

    |d(n+1) - d(n+1,n)| < d0
    |v(n+1) - v(n+1,n)| < v0                    (2-3)
    |a(n+1) - a(n+1,n)| < a0

where d0, v0 and a0 are the set error thresholds between the effective obstacle target measurement values and the predicted values;
step 2.4.3: if the effective obstacle target is detected continuously for more than m radar scanning periods and the effective target satisfying formula (2-3) in step 2.4.2 is consistent with the primarily selected effective target, the relative distance, relative speed, relative angle and number information of the target are updated; otherwise the primarily selected effective target is no longer among the detection targets of the millimeter wave radar and is tracked using the effective target prediction information; if it is still not detected in the next scanning period of the radar, use of the corresponding primarily selected effective target information is stopped, the effective target information is updated, and execution returns cyclically to step 2.4.1.
CN201710142305.XA 2017-03-10 2017-03-10 Farmland environment sensing method for unmanned agricultural machinery Active CN106950952B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710142305.XA CN106950952B (en) 2017-03-10 2017-03-10 Farmland environment sensing method for unmanned agricultural machinery

Publications (2)

Publication Number Publication Date
CN106950952A CN106950952A (en) 2017-07-14
CN106950952B (en) 2020-04-03

Family

ID=59468178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710142305.XA Active CN106950952B (en) 2017-03-10 2017-03-10 Farmland environment sensing method for unmanned agricultural machinery

Country Status (1)

Country Link
CN (1) CN106950952B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107862287A (en) * 2017-11-08 2018-03-30 吉林大学 A kind of front zonule object identification and vehicle early warning method
CN110077402B (en) * 2019-05-13 2021-09-28 奇瑞汽车股份有限公司 Target object tracking method, target object tracking device and storage medium
CN110597390B (en) * 2019-09-12 2022-05-20 Oppo广东移动通信有限公司 Control method, electronic device, and storage medium
CN110764512A (en) * 2019-11-18 2020-02-07 江苏农林职业技术学院 Agricultural machinery driving control system
CN110908387A (en) * 2019-12-13 2020-03-24 齐鲁工业大学 Method, medium and electronic device for planning paths of unmanned surface vehicle in dynamic environment
CN111157996B (en) * 2020-01-06 2022-06-14 珠海丽亭智能科技有限公司 Parking robot running safety detection method
CN113433566B (en) * 2020-03-04 2023-07-25 宏碁股份有限公司 Map construction system and map construction method
CN111459166B (en) * 2020-04-22 2024-03-29 北京工业大学 Scene map construction method containing trapped person position information in post-disaster rescue environment
CN112363498B (en) * 2020-10-19 2022-09-23 山东交通学院 Underwater robot intelligent motion control method based on laser radar
CN112967501B (en) * 2021-02-23 2022-07-05 长安大学 Early warning system and method for dangerous driving-off behavior of vehicles on ramp
CN115741680A (en) * 2022-11-03 2023-03-07 三峡大学 Multi-degree-of-freedom mechanical arm system based on laser guidance and visual assistance and hole accurate positioning method
CN116501048B (en) * 2023-04-26 2023-09-12 无锡卡尔曼导航技术有限公司南京技术中心 Self-mobile equipment ground penetrating path planning method
CN117516485B (en) * 2024-01-04 2024-03-22 东北大学 Pose vision measurement method for automatic guiding and mounting of aircraft engine

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2273334A1 (en) * 2009-06-22 2011-01-12 BAE Systems PLC Terrain sensing
CN104637059A (en) * 2015-02-09 2015-05-20 吉林大学 Night preceding vehicle detection method based on millimeter-wave radar and machine vision
CN105511468A (en) * 2015-12-15 2016-04-20 中国北方车辆研究所 Light beam double reflection discrimination method of laser radar and line structured light visual system
CN105956268A (en) * 2016-04-29 2016-09-21 百度在线网络技术(北京)有限公司 Construction method and device applied to test scene of pilotless automobile
CN106101590A (en) * 2016-06-23 2016-11-09 上海无线电设备研究所 The detection of radar video complex data and processing system and detection and processing method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6636148B2 (en) * 2000-09-04 2003-10-21 Fujitsu Ten Limited Periphery monitoring system
KR102472494B1 (en) * 2014-03-28 2022-11-29 얀마 파워 테크놀로지 가부시키가이샤 Autonomous travelling service vehicle
CN105151043B (en) * 2015-08-19 2018-07-06 内蒙古麦酷智能车技术有限公司 A kind of method of pilotless automobile Emergency avoidance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Forward vehicle detection system based on fusion of ranging radar and machine vision data; Pang Cheng; China Master's Theses Full-text Database, Information Science and Technology Series; 2016-08-15; Chapters 3-5 *

Also Published As

Publication number Publication date
CN106950952A (en) 2017-07-14

Similar Documents

Publication Publication Date Title
CN106950952B (en) Farmland environment sensing method for unmanned agricultural machinery
CN108154084B (en) Agricultural machinery unmanned multi-sensor fusion farmland environment sensing method
CN108089185B (en) Agricultural machinery unmanned navigation method based on farmland environment perception
CN108107887B (en) Farmland environment sensing method for agricultural machinery navigation
CN108082181B (en) Agricultural machinery navigation control method based on farmland environment perception
EP3540464B1 (en) Ranging method based on laser radar system, device and readable storage medium
JP6825569B2 (en) Signal processor, signal processing method, and program
CN108169743B (en) Agricultural machinery unmanned farmland environment sensing method
TWI710798B (en) Laser scanning system attached on moving object, laser scanning method for laser scanner attached on moving object, and laser scanning program
CN113424079A (en) Obstacle detection method, obstacle detection device, computer device, and storage medium
CN108509972A (en) A kind of barrier feature extracting method based on millimeter wave and laser radar
CN109471096B (en) Multi-sensor target matching method and device and automobile
US10101448B2 (en) On-board radar apparatus and region detection method
CN109212531A (en) The method for determining target vehicle orientation
CN113850102B (en) Vehicle-mounted vision detection method and system based on millimeter wave radar assistance
CN112149550A (en) Automatic driving vehicle 3D target detection method based on multi-sensor fusion
CN113203409B (en) Method for constructing navigation map of mobile robot in complex indoor environment
CN111257892A (en) Obstacle detection method for automatic driving of vehicle
CN111222441B (en) Point cloud target detection and blind area target detection method and system based on vehicle-road cooperation
CN111427355B (en) Obstacle data processing method, device, equipment and storage medium
CN116310679A (en) Multi-sensor fusion target detection method, system, medium, equipment and terminal
CN111913177A (en) Method and device for detecting target object and storage medium
CN117274749B (en) Fused 3D target detection method based on 4D millimeter wave radar and image
CN114280611A (en) Road side sensing method integrating millimeter wave radar and camera
CN111123262A (en) Automatic driving 3D modeling method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Farmland environment sensing method for unmanned agricultural machinery

Effective date of registration: 20220506

Granted publication date: 20200403

Pledgee: Jiangsu SINOSURE technology microfinance Co.,Ltd.

Pledgor: WUXI KALMAN NAVIGATION TECHNOLOGY CO.,LTD.

Registration number: Y2022320000200