CN117893596A - Obstacle ranging method, obstacle ranging device, and computer storage medium - Google Patents

Obstacle ranging method, obstacle ranging device, and computer storage medium Download PDF

Info

Publication number
CN117893596A
CN117893596A CN202311844578.0A CN202311844578A CN117893596A CN 117893596 A CN117893596 A CN 117893596A CN 202311844578 A CN202311844578 A CN 202311844578A CN 117893596 A CN117893596 A CN 117893596A
Authority
CN
China
Prior art keywords
obstacle
angle
detection frame
wide
narrow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311844578.0A
Other languages
Chinese (zh)
Inventor
陈张林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Zero Run Technology Co Ltd
Original Assignee
Zhejiang Zero Run Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Zero Run Technology Co Ltd filed Critical Zhejiang Zero Run Technology Co Ltd
Priority to CN202311844578.0A priority Critical patent/CN117893596A/en
Publication of CN117893596A publication Critical patent/CN117893596A/en
Pending legal-status Critical Current

Links

Abstract

The application provides an obstacle ranging method, an obstacle ranging device and a computer storage medium. The obstacle ranging method includes: acquiring a wide-angle obstacle detection frame from the simultaneously acquired wide-angle images, and acquiring a narrow-angle obstacle detection frame from the narrow-angle images; acquiring a detected obstacle track; matching the wide-angle obstacle detection frame by using an obstacle track; matching the narrow-angle obstacle detection frame by using the obstacle track and the wide-angle obstacle detection frame; fusing the successfully matched narrow-angle obstacle detection frame and/or wide-angle obstacle detection frame to an obstacle track; and obtaining the obstacle distance by using the updated obstacle track. By the obstacle ranging method, the number and the accuracy of detected obstacle targets are improved by adopting a narrow-angle wide-angle double-camera fusion ranging mode.

Description

Obstacle ranging method, obstacle ranging device, and computer storage medium
Technical Field
The present disclosure relates to the field of obstacle detection technology, and in particular, to an obstacle ranging method, an obstacle ranging device, and a computer storage medium.
Background
And in the automatic driving process, ranging is performed through a forward-looking camera, a detection frame of a dynamic obstacle 2d or a pseudo 3d frame is detected and projected to a vehicle body coordinate system to obtain a distance measurement value, and then scene optimization is performed through subsequent logic and strategy.
Then, the existing automatic driving distance measurement mainly comprises single-camera distance measurement or double-camera distance measurement with the same focal length, and can not accurately detect obstacles with different distances.
Disclosure of Invention
In order to solve the technical problems, the application provides an obstacle ranging method, an obstacle ranging device and a computer storage medium.
In order to solve the above technical problems, the present application provides an obstacle ranging method, which includes:
acquiring a wide-angle obstacle detection frame from the simultaneously acquired wide-angle images, and acquiring a narrow-angle obstacle detection frame from the narrow-angle images;
acquiring a detected obstacle track;
matching the wide-angle obstacle detection frame by using the obstacle track;
matching the narrow-angle obstacle detection frame by using the obstacle track and the wide-angle obstacle detection frame;
fusing the successfully matched narrow-angle obstacle detection frame and/or wide-angle obstacle detection frame to an obstacle track;
and obtaining the obstacle distance by using the updated obstacle track.
Wherein, utilize the obstacle track to match the wide angle obstacle detection frame includes:
acquiring a wide-angle obstacle real-time distance based on the wide-angle obstacle detection frame;
acquiring an obstacle track distance based on the obstacle track;
and matching the real-time distance of the wide-angle obstacle with the path distance of the obstacle to obtain a wide-angle obstacle detection frame matched with the path of the obstacle.
Wherein the narrow-angle obstacle detection frame is matched by the obstacle track and the wide-angle obstacle detection frame;
acquiring an intersection ratio with the wide-angle obstacle detection frame based on the narrow-angle obstacle detection frame;
performing first matching on the narrow-angle obstacle detection frame and the wide-angle obstacle detection frame by utilizing the intersection ratio;
acquiring a real-time distance of the narrow-angle obstacle based on the narrow-angle obstacle detection frame;
acquiring an obstacle track distance based on the obstacle track;
performing secondary matching by utilizing the real-time distance of the narrow-angle obstacle and the obstacle track distance to obtain a narrow-angle obstacle detection frame matched with the obstacle track;
and acquiring a narrow-angle obstacle detection frame matched with the obstacle track and/or the wide-angle obstacle detection frame.
Before the narrow-angle obstacle detection frame acquires the intersection ratio with the wide-angle obstacle detection frame, the obstacle ranging method further comprises the following steps:
acquiring homography matrixes of a wide-angle camera and a narrow-angle camera;
and projecting the narrow-angle obstacle detection frame to the wide-angle image based on the homography matrix.
Wherein, after the obstacle distance is obtained by using the updated obstacle track, the obstacle distance measuring method further comprises:
determining an obstacle coordinate according to the coordinate of the latest wide-angle obstacle detection frame in the updated obstacle track in response to the fact that the updated obstacle track is not matched with the narrow-angle obstacle detection frame;
and determining the coordinates of the obstacle according to the coordinates of the latest narrow-angle obstacle detection frame in the updated obstacle track in response to the updated obstacle track not being matched with the upper wide-angle obstacle detection frame.
Wherein, after the obstacle distance is obtained by using the updated obstacle track, the obstacle distance measuring method further comprises:
responding to the updated obstacle track to match an upper wide-angle obstacle detection frame and a narrow-angle obstacle detection frame, and determining a distance proportion according to the latest obstacle distance;
and fusing the coordinates of the latest narrow-angle obstacle detection frame and the coordinates of the latest wide-angle obstacle detection frame according to the distance proportion, and determining the coordinates of the obstacle.
The step of fusing the coordinates of the latest narrow-angle obstacle detection frame and the coordinates of the latest wide-angle obstacle detection frame according to the distance proportion to determine the obstacle coordinates comprises the following steps:
in response to the latest obstacle distance being less than a first distance threshold, taking coordinates of a latest wide-angle obstacle detection frame as the obstacle coordinates;
and taking the coordinates of the latest narrow-angle obstacle detection frame as the obstacle coordinates in response to the latest obstacle distance being greater than a second distance threshold.
The obstacle ranging method further comprises the following steps:
creating a new obstacle course based on the narrow angle obstacle detection boxes and/or the wide angle obstacle detection boxes that are not matched to the obstacle course.
In order to solve the technical problem, the application also provides an obstacle ranging device, which comprises a memory and a processor coupled with the memory; the memory is used for storing program data, and the processor is used for executing the program data to realize the obstacle ranging method.
In order to solve the above technical problem, the present application further proposes a computer storage medium for storing program data, which when executed by a computer, is configured to implement the above obstacle ranging method.
Compared with the prior art, the beneficial effects of this application are: the obstacle ranging device acquires a wide-angle obstacle detection frame from the wide-angle image acquired at the same time, and acquires a narrow-angle obstacle detection frame from the narrow-angle image; acquiring a detected obstacle track; matching the wide-angle obstacle detection frame by using an obstacle track; matching the narrow-angle obstacle detection frame by using the obstacle track and the wide-angle obstacle detection frame; fusing the successfully matched narrow-angle obstacle detection frame and/or wide-angle obstacle detection frame to an obstacle track; and obtaining the obstacle distance by using the updated obstacle track. By the obstacle ranging method, the number and the accuracy of detected obstacle targets are improved by adopting a narrow-angle wide-angle double-camera fusion ranging mode.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Wherein:
FIG. 1 is a flow chart of an embodiment of an obstacle ranging method provided herein;
fig. 2 is a schematic overall flow diagram of a narrow-angle wide-angle camera fusion ranging provided in the present application;
FIG. 3 is a view field schematic of a narrow angle wide angle camera provided herein;
FIG. 4 is a schematic view of an embodiment of an obstacle ranging device provided herein;
FIG. 5 is a schematic view of another embodiment of an obstacle ranging device provided herein;
fig. 6 is a schematic structural diagram of an embodiment of a computer storage medium provided in the present application.
Detailed Description
The following description of the technical solutions in the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented, for example, in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The application designs a fusion ranging method for dynamic obstacle target detection of a front-view long-short focal length dual-camera. Because the wide-angle camera focal segment is relatively narrow, the distance target that can measure is limited, so the application can utilize the narrow-angle camera of long focal segment to detect the target of more remote distance, and with the wide-angle camera common view in the field of vision narrow angle detect the frame bigger more accurate generally, the range finding precision is higher, consequently can adopt narrow-angle wide-angle double camera to fuse the mode of range finding to promote barrier target quantity and the precision of detection. The method can measure the targets at the farther distance section through the binocular long-focus short-focus fusion mode, and improves the distance measurement precision and stability of the targets at the middle and long distance sections. Because the distant target images are larger and clearer in the long-focus narrow angle, the accuracy of the altitude ranging is higher and more stable.
Referring to fig. 1 and fig. 2, fig. 1 is a flow chart of an embodiment of an obstacle ranging method provided in the present application, and fig. 2 is an overall flow chart of a narrow-angle wide-angle camera fusion ranging method provided in the present application.
The obstacle ranging method is applied to an obstacle ranging device, wherein the obstacle ranging device can be a server, terminal equipment or a system formed by mutually matching the server and the terminal equipment. Accordingly, each part, for example, each unit, sub-unit, module, and sub-module, included in the obstacle ranging device may be all disposed in the server, may be all disposed in the terminal device, or may be disposed in the server and the terminal device, respectively.
Further, the server may be hardware or software. When the server is hardware, the server may be implemented as a distributed server cluster formed by a plurality of servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules, for example, software or software modules for providing a distributed server, or may be implemented as a single software or software module, which is not specifically limited herein.
Specifically, the obstacle ranging device of the present application may be an autonomous vehicle, or a processing unit on the autonomous vehicle, or other processing devices hung on the autonomous vehicle.
As shown in fig. 1, the specific steps are as follows:
step S11: and acquiring a wide-angle obstacle detection frame in the wide-angle images acquired simultaneously, and acquiring a narrow-angle obstacle detection frame in the narrow-angle images.
In the embodiment of the application, the obstacle ranging device respectively adopts a wide-angle camera mounted on the same automatic driving vehicle to acquire a wide-angle image and a narrow-angle camera to acquire a narrow-angle image. Referring to fig. 3, fig. 3 is a schematic view of a field of view of the wide-angle camera and the narrow-angle camera provided in the present application.
As shown in fig. 3, the wide-angle camera and the narrow-angle camera are mounted at the same place, specifically, a certain mounting point on an automatic driving vehicle, such as a roof, etc., and the wide-angle camera and the narrow-angle camera are used for detecting and ranging a target obstacle when the automatic driving vehicle is driving. The origin of the camera coordinate system is taken as a starting point, so that the wide-angle camera view is larger than the narrow-angle camera view, and the wide-angle camera view is completely covered; whereas the narrow angle camera field of view is far away from the wide angle camera field of view. Thus, the narrow angle camera sees a far target more clearly, but the imaging interval is smaller, the wide angle camera sees a near target, but the far target is relatively blurred. According to the method and the device, the detection results of the wide-angle camera and the detection results of the narrow-angle camera are fused, the detection advantages of the wide-angle camera and the narrow-angle camera are integrated, and the detection quantity and the detection accuracy of the obstacles can be effectively improved.
After the obstacle ranging device acquires the wide-angle image and the narrow-angle image, the obstacle ranging device detects the images through an obstacle detection network or model to obtain detection values of the wide-angle image and the narrow-angle image respectively. In the present application, the detection values each include at least an obstacle detection frame.
It should be noted that, the detection value in the embodiment of the present application may be a 2d obstacle detection frame, or may be a pseudo 3d obstacle detection frame. The false 3d obstacle detection frame comprises coordinate information of an obstacle and pose information of the obstacle. Because the installation errors of the wide-angle camera and the narrow-angle camera cannot be completely installed to the same position point, the obstacle ranging device can correct the wide-angle camera and the narrow-angle camera through the pseudo 3d obstacle detection frame in the wide-angle image and the narrow-angle image, so that the obstacle detection precision is further improved.
Step S12: and acquiring the detected obstacle track.
In the embodiment of the application, the obstacle ranging device extracts the obstacle tracks which are stored in advance or successfully matched in the previous image detection as the prior information of the detected obstacle tracks.
Step S13: and matching the wide-angle obstacle detection frame by using the obstacle tracks.
In the embodiment of the present application, as shown in fig. 2, the obstacle detection device projects a narrow-angle target calculated mainly according to a homography matrix projection relationship to a wide-angle image, and then performs two-stage hungarian matching on a narrow-angle camera target by using an intersection ratio between the narrow-angle target and the wide-angle target and a 3d distance as a cost function; and carrying out Hungary matching on the wide-angle camera according to the 3d distance as a cost function. And finally, the obstacle detection device respectively performs initialization of the track, distance measurement and output of the track and deletion of the track according to the matching result of the wide-angle target and the narrow-angle target.
Specifically, the obstacle detection device matches the obstacle track of the previous frame image with the wide-angle obstacle detection frame of the current frame image, and the matching strategy adopts a Hungary matching scheme of distance matching. The obstacle detection device adopts Hungary matching to match the wide-angle obstacle detection frame with the obstacle track by setting a cost matrix of a distance cost function solution bipartite graph.
Updating the wide-angle obstacle detection frame successfully matched with the obstacle track into the information of the obstacle track so as to update the obstacle track; and for the wide-angle obstacle detection frames which are not matched with all obstacle tracks, generating a new initialization track and allocating a new obstacle id number.
Step S14: and matching the narrow-angle obstacle detection frame by using the obstacle track and the wide-angle obstacle detection frame.
In the embodiment of the application, the obstacle detection device matches the obstacle track of the previous frame image, the wide-angle obstacle detection frame of the current frame image and the narrow-angle obstacle detection frame of the current frame image, and the matching strategy adopts a two-stage matching Hungary matching scheme.
Specifically, the obstacle detection device adopts Hungary matching to carry out first matching on the wide-angle obstacle detection frame and the narrow-angle obstacle detection frame by setting a cost matrix of an intersection ratio cost function solution bipartite graph; and then, carrying out second matching on the narrow-angle obstacle detection frame and the obstacle track by adopting Hungary matching and solving a cost matrix of the bipartite graph by setting a distance cost function.
Further, the above-mentioned intersection ratio is a ratio of the narrow-angle obstacle detection frame projected onto the wide-angle image and the wide-angle obstacle detection frame. The projection process is as follows:
assuming that the 2d point P1 of the narrow-angle obstacle detection frame on the narrow-angle image is (x 1, y 1), the 2d point P2 on the wide-angle image is (x 2, x 2), and the homography matrix of the narrow-angle image to the wide-angle image is H, the projection relationship can be expressed as:
according to the field of view and the installation position of the narrow-angle camera and the wide-angle camera shown in fig. 3, the homography matrix can be obtained from the internal reference matrix and the rotation matrix of the two cameras:
wherein K is wide Is an internal reference matrix of the wide-angle camera,a rotation matrix for the wide-angle camera; />Is an internal reference matrix of a narrow-angle camera, R narrow Is a rotation matrix of the narrow angle camera.
In the embodiment of the present application, the reason why the narrow-angle projection wide angle is selected is because the objects in the narrow-angle field of view are all in the wide-angle field of view. The rectangular frames projected by the homography matrix are still rectangular frames, the bottom edge of each rectangular frame is parallel to the edge of the image, and the intersection of the two rectangular frames is rectangular, so that the intersection ratio of the two rectangular frames can be directly calculated:
IoU=Area_I/(Area_A+Area_B-Area_I)
wherein, area_i is the intersection of two detection frames, area_a is the narrow angle obstacle detection frame, and area_b is the wide angle obstacle detection frame.
Step S15: and fusing the successfully matched narrow-angle obstacle detection frame and/or wide-angle obstacle detection frame to the obstacle track.
In the embodiment of the present application, through the matching process in the above step S14, after traversing all the obstacle detection frames and the obstacle tracks, the following matching results are generated: 1. track on the unmatched narrow angle detection value or wide angle detection value; 2. a narrow angle detection value which is not matched with the track; 3. wide angle detection values not matched with the track; 4. only matching the track with the narrow angle detection value; 5. only matching the track with the wide-angle detection value; 6. and simultaneously matching the track with the narrow angle detection value and the wide angle detection value.
Step S16: and obtaining the obstacle distance by using the updated obstacle track.
In this embodiment of the present application, the obstacle detection device respectively performs corresponding processing according to the matching result and the track management logic shown in fig. 2: and (3) carrying out frame supplementing on the track in the case 1, reserving two frames, and if the two frames are not matched after the two frames, taking the track as the track to be deleted. And generating a new initialization track for the detection values of the case 1 and the case 2, and allocating a new id number. For cases 4, 5 and 6, the narrow angle detection and/or wide angle detection values on the match are fused into the track. Specifically, for case 4, the obstacle distance is updated by the narrow angle detection value; for case 5, the obstacle distance is updated by the wide-angle detection value; for case 6, the obstacle distance is updated by proportional summation of the narrow angle detection value and the wide angle detection value.
Further, for case 6, the obstacle detecting apparatus also needs to consider the actual distance of the obstacle. For example, for a distance segment of 30m to 70m, the obstacle detection device determines a distance ratio according to the obstacle distance in the track of the previous frame:
alpha=(x1_traj-30)/(70-30),beta=1-alpha
the obstacle detection device fuses the coordinates of the narrow-angle obstacle detection frame and the coordinates of the wide-angle obstacle detection frame according to the distance proportion, and determines the latest obstacle coordinates:
x_traj=alpha*x_narrow+beta*x_wide
y_traj=alpha*y_narrow+beta*y_wide
for distance segments of 70m or more, which are not in the wide-angle field of view, the obstacle detection device uses the center coordinates of the narrow-angle obstacle detection frame as the latest obstacle coordinates:
x_traj=x_narrow,y_traj=y_narrow
for distance segments of 30m or more, which are not within the narrow angle field of view, the obstacle detection device uses the center coordinates of the wide angle obstacle detection frame as the latest obstacle coordinates:
x_traj=x_wide,y_traj=y_wide
in the embodiment of the application, the obstacle ranging device acquires a wide-angle obstacle detection frame in a wide-angle image acquired at the same time, and acquires a narrow-angle obstacle detection frame in a narrow-angle image; acquiring a detected obstacle track; matching the wide-angle obstacle detection frame by using an obstacle track; matching the narrow-angle obstacle detection frame by using the obstacle track and the wide-angle obstacle detection frame; fusing the successfully matched narrow-angle obstacle detection frame and/or wide-angle obstacle detection frame to an obstacle track; and obtaining the obstacle distance by using the updated obstacle track. By the obstacle ranging method, the number and the accuracy of detected obstacle targets are improved by adopting a narrow-angle wide-angle double-camera fusion ranging mode.
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
In order to implement the above-mentioned obstacle ranging method, the present application further provides an obstacle ranging device, refer to fig. 4 specifically, and fig. 4 is a schematic structural diagram of an embodiment of the obstacle ranging device provided in the present application.
The obstacle ranging device 300 of the present embodiment includes an acquisition module 31, a matching module 32, a fusion module 33, and a ranging module 34.
The acquisition module 31 is configured to acquire a wide-angle obstacle detection frame from a wide-angle image acquired simultaneously, and acquire a narrow-angle obstacle detection frame from a narrow-angle image; and acquiring the detected obstacle track.
A matching module 32, configured to match the wide-angle obstacle detection frame with the obstacle track; and matching the narrow-angle obstacle detection frame by using the obstacle track and the wide-angle obstacle detection frame.
And a fusion module 33, configured to fuse the successfully matched narrow-angle obstacle detection frame and/or wide-angle obstacle detection frame to the obstacle track.
The ranging module 34 is configured to acquire the obstacle distance using the updated obstacle track.
In order to implement the above-mentioned obstacle ranging method, another obstacle ranging device is further provided, refer to fig. 5 specifically, and fig. 5 is a schematic structural diagram of another embodiment of the obstacle ranging device provided in the present application.
The obstacle ranging device 400 of the present embodiment includes a processor 41, a memory 42, an input-output device 43, and a bus 44.
The processor 41, the memory 42, and the input/output device 43 are respectively connected to the bus 44, and the memory 42 stores program data, and the processor 41 is configured to execute the program data to implement the obstacle ranging method according to the above embodiment.
In the present embodiment, the processor 41 may also be referred to as a CPU (Central Processing Unit ). The processor 41 may be an integrated circuit chip with signal processing capabilities. The processor 41 may also be a general purpose processor, a digital signal processor (DSP, digital Signal Process), an application specific integrated circuit (ASIC, application Specific Integrated Circuit), a field programmable gate array (FPGA, field Programmable Gate Array) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components. The general purpose processor may be a microprocessor or the processor 41 may be any conventional processor or the like.
Still further, referring to fig. 6, fig. 6 is a schematic structural diagram of an embodiment of the computer storage medium provided in the present application, in which the computer program 61 is stored in the computer storage medium 600, and the computer program 61 is configured to implement the obstacle ranging method of the above embodiment when executed by the processor.
Embodiments of the present application are implemented in the form of software functional units and sold or used as a stand-alone product, which may be stored on a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution, in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the patent application, and all equivalent structures or equivalent processes using the descriptions and the contents of the present application or other related technical fields are included in the scope of the patent application.

Claims (10)

1. An obstacle ranging method, comprising:
acquiring a wide-angle obstacle detection frame from the simultaneously acquired wide-angle images, and acquiring a narrow-angle obstacle detection frame from the narrow-angle images;
acquiring a detected obstacle track;
matching the wide-angle obstacle detection frame by using the obstacle track;
matching the narrow-angle obstacle detection frame by using the obstacle track and the wide-angle obstacle detection frame;
fusing the successfully matched narrow-angle obstacle detection frame and/or wide-angle obstacle detection frame to an obstacle track;
and obtaining the obstacle distance by using the updated obstacle track.
2. The obstacle ranging method as claimed in claim 1, wherein,
the matching of the wide-angle obstacle detection frame by using the obstacle track comprises the following steps:
acquiring a wide-angle obstacle real-time distance based on the wide-angle obstacle detection frame;
acquiring an obstacle track distance based on the obstacle track;
and matching the real-time distance of the wide-angle obstacle with the path distance of the obstacle to obtain a wide-angle obstacle detection frame matched with the path of the obstacle.
3. The obstacle ranging method as claimed in claim 1 or 2, wherein,
the narrow-angle obstacle detection frame is matched by utilizing the obstacle track and the wide-angle obstacle detection frame;
acquiring an intersection ratio with the wide-angle obstacle detection frame based on the narrow-angle obstacle detection frame;
performing first matching on the narrow-angle obstacle detection frame and the wide-angle obstacle detection frame by utilizing the intersection ratio;
acquiring a real-time distance of the narrow-angle obstacle based on the narrow-angle obstacle detection frame;
acquiring an obstacle track distance based on the obstacle track;
performing secondary matching by utilizing the real-time distance of the narrow-angle obstacle and the obstacle track distance to obtain a narrow-angle obstacle detection frame matched with the obstacle track;
and acquiring a narrow-angle obstacle detection frame matched with the obstacle track and/or the wide-angle obstacle detection frame.
4. The obstacle ranging method as claimed in claim 3, wherein,
before the narrow-angle obstacle detection frame acquires the intersection ratio with the wide-angle obstacle detection frame, the obstacle ranging method further comprises:
acquiring homography matrixes of a wide-angle camera and a narrow-angle camera;
and projecting the narrow-angle obstacle detection frame to the wide-angle image based on the homography matrix.
5. The obstacle ranging method as claimed in claim 1, wherein,
after the obstacle distance is obtained by using the updated obstacle track, the obstacle distance measuring method further comprises the following steps:
determining an obstacle coordinate according to the coordinate of the latest wide-angle obstacle detection frame in the updated obstacle track in response to the fact that the updated obstacle track is not matched with the narrow-angle obstacle detection frame;
and determining the coordinates of the obstacle according to the coordinates of the latest narrow-angle obstacle detection frame in the updated obstacle track in response to the updated obstacle track not being matched with the upper wide-angle obstacle detection frame.
6. The obstacle ranging method as claimed in claim 1 or 5, wherein,
after the obstacle distance is obtained by using the updated obstacle track, the obstacle distance measuring method further comprises the following steps:
responding to the updated obstacle track to match an upper wide-angle obstacle detection frame and a narrow-angle obstacle detection frame, and determining a distance proportion according to the latest obstacle distance;
and fusing the coordinates of the latest narrow-angle obstacle detection frame and the coordinates of the latest wide-angle obstacle detection frame according to the distance proportion, and determining the coordinates of the obstacle.
7. The obstacle ranging method as claimed in claim 6, wherein,
fusing the coordinates of the latest narrow-angle obstacle detection frame and the coordinates of the latest wide-angle obstacle detection frame according to the distance proportion, and determining the coordinates of the obstacle, wherein the method comprises the following steps:
in response to the latest obstacle distance being less than a first distance threshold, taking coordinates of a latest wide-angle obstacle detection frame as the obstacle coordinates;
and taking the coordinates of the latest narrow-angle obstacle detection frame as the obstacle coordinates in response to the latest obstacle distance being greater than a second distance threshold.
8. The obstacle ranging method as claimed in claim 1, wherein,
the obstacle ranging method further comprises the following steps:
creating a new obstacle course based on the narrow angle obstacle detection boxes and/or the wide angle obstacle detection boxes that are not matched to the obstacle course.
9. An obstacle ranging device, wherein the terminal device comprises a memory and a processor coupled to the memory;
wherein the memory is for storing program data and the processor is for executing the program data to implement the obstacle ranging method of any one of claims 1 to 8.
10. A computer storage medium for storing program data which, when executed by a computer, is adapted to carry out the obstacle ranging method as claimed in any one of claims 1 to 8.
CN202311844578.0A 2023-12-28 2023-12-28 Obstacle ranging method, obstacle ranging device, and computer storage medium Pending CN117893596A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311844578.0A CN117893596A (en) 2023-12-28 2023-12-28 Obstacle ranging method, obstacle ranging device, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311844578.0A CN117893596A (en) 2023-12-28 2023-12-28 Obstacle ranging method, obstacle ranging device, and computer storage medium

Publications (1)

Publication Number Publication Date
CN117893596A true CN117893596A (en) 2024-04-16

Family

ID=90646627

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311844578.0A Pending CN117893596A (en) 2023-12-28 2023-12-28 Obstacle ranging method, obstacle ranging device, and computer storage medium

Country Status (1)

Country Link
CN (1) CN117893596A (en)

Similar Documents

Publication Publication Date Title
US10260862B2 (en) Pose estimation using sensors
WO2021072696A1 (en) Target detection and tracking method and system, and movable platform, camera and medium
US10909395B2 (en) Object detection apparatus
LU502288B1 (en) Method and system for detecting position relation between vehicle and lane line, and storage medium
JP7209115B2 (en) Detection, 3D reconstruction and tracking of multiple rigid objects moving in relatively close proximity
EP3623993A1 (en) Method and device of multi-focal sensing of an obstacle and non-volatile computer-readable storage medium
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN112200851B (en) Point cloud-based target detection method and device and electronic equipment thereof
CN115164918B (en) Semantic point cloud map construction method and device and electronic equipment
CN112666550B (en) Moving object detection method and device, fusion processing unit and medium
CN111753901A (en) Data fusion method, device and system and computer equipment
CN117893596A (en) Obstacle ranging method, obstacle ranging device, and computer storage medium
CN115661014A (en) Point cloud data processing method and device, electronic equipment and storage medium
CN111986248B (en) Multi-vision sensing method and device and automatic driving automobile
CN113126117B (en) Method for determining absolute scale of SFM map and electronic equipment
US20180060671A1 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program
CN113066100A (en) Target tracking method, device, equipment and storage medium
CN112262411A (en) Image association method, system and device
JPWO2020244717A5 (en)
JPH11223516A (en) Three dimensional image pickup device
CN116228820B (en) Obstacle detection method and device, electronic equipment and storage medium
CN115205828B (en) Vehicle positioning method and device, vehicle control unit and readable storage medium
CN116721162A (en) External parameter calibration method for radar and camera, electronic equipment and storage medium
Pereira Automated calibration of multiple LIDARs and cameras using a moving sphere
CN118037802A (en) Target depth estimation method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination