WO2020098297A1 - Method and system for measuring distance to a preceding vehicle - Google Patents
Method and system for measuring distance to a preceding vehicle
- Publication number
- WO2020098297A1 (PCT application PCT/CN2019/095980)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- target
- distance
- rgb image
- frame
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/08—Systems for measuring distance only
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
Definitions
- the present disclosure relates to the technical field of distance detection, and in particular to a method and system for distance detection of a preceding vehicle.
- distance detection and early warning technology came into being to address this risk.
- the principle of detection and early warning is that, when the distance to the vehicle ahead becomes too short and reaches a threshold, the driver is warned of a possible collision or braking measures are taken automatically, thereby reducing the occurrence of rear-end collisions.
- the first is a visual solution based on an ordinary camera, which first performs target recognition and then obtains the distance to the vehicle ahead and issues a warning according to a monocular or binocular ranging algorithm;
- the second is a millimeter wave radar solution: the millimeter wave radar sends electromagnetic waves to the area ahead and receives the echoes to measure the distance, speed, and angle of objects in front, thereby obtaining the distance to the vehicle ahead and giving an early warning.
- These two early warning technologies have their own advantages and disadvantages.
- the solution based on the ordinary camera has lower cost, can accurately identify the position of the preceding vehicle in the field of view, and provides more semantic information, but the ranging range and ranging accuracy of the visual solution are far inferior to those of millimeter wave radar.
- millimeter-wave radar has high ranging accuracy, but its field of view is relatively narrow, it cannot return any semantic information, and it is difficult for it to accurately identify the position of the preceding vehicle in two-dimensional space.
- the purpose of the present disclosure is to provide a method and system for detecting the distance to a preceding vehicle that can ensure both ranging accuracy and positioning accuracy.
- an aspect of the present disclosure provides a method for detecting the distance to a preceding vehicle, including:
- determining a preceding target vehicle from the plurality of vehicles according to the overlap rate and the normalized distance corresponding to each of the plurality of vehicles, and obtaining the distance between the preceding target vehicle and the own vehicle according to the depth image.
- the acquiring RGB images and depth images of the front angle of the vehicle includes:
- the RGB image is collected using a 2D camera installed in the vehicle, and the depth image is collected using a millimeter wave radar / distance sensor installed in the vehicle.
- the determining the constraint frame of the vehicle in the RGB image according to the size data of the vehicle includes:
- the calculating the overlapping rate of the constraint frame of the vehicle and the target area frame includes:
- Car represents the bounding box area of the vehicle
- ROI represents the area of the target area box
- the calculating of the normalized distance between the vehicle and the target area frame, based on the size data of the vehicle, the position of the constraint frame of the vehicle in the RGB image, and the position of the target area frame in the RGB image, includes:
- Car.x represents the horizontal coordinate value of the center point of the constraint frame
- Car.y represents the vertical coordinate value of the center point of the constraint frame
- Target.x represents the abscissa value of a predetermined target point of the target area frame, and Target.y represents the ordinate value of the target point
- Car.width represents the width data of the vehicle
- Car.height represents the height data of the vehicle.
- the target area frame is a trapezoidal area frame directly in front of the vehicle selected from the RGB image, and the target point is a center point of the target area frame.
- the determining of the preceding target vehicle from the plurality of vehicles according to the overlap rate and normalized distance corresponding to each vehicle of the plurality of vehicles, and obtaining the distance between the preceding target vehicle and the own vehicle, includes:
- the preceding vehicle distance detection method may further include:
- the abnormal window detection process detects an abnormal window within the time window of predetermined length, and replaces the vehicle distance values in the abnormal window with a vehicle distance fitting result calculated from the vehicle distance values in the time windows before and after the abnormal window.
- a preceding vehicle distance detection system, including:
- the image acquisition unit is configured to acquire RGB images and depth images of the front angle of the vehicle;
- a target area frame setting unit configured to preset a target area frame in the RGB image
- a size data extraction unit configured to extract size data corresponding to multiple vehicles in the RGB image
- a constraint frame determination unit configured to determine the constraint frame of the vehicle in the RGB image according to the size data of the vehicle for each of the plurality of vehicles
- An overlapping rate calculation unit configured to calculate the overlapping rate of the constraint frame of the vehicle and the target area frame for each of the plurality of vehicles;
- the distance calculation unit is configured to, for each vehicle of the plurality of vehicles, calculate the normalized distance between the vehicle and the target area frame based on the size data of the vehicle, the position of the constraint frame of the vehicle in the RGB image, and the position of the target area frame in the RGB image;
- the preceding target vehicle determining unit is configured to determine the preceding target vehicle from the plurality of vehicles according to the overlap rate and the normalized distance corresponding to each vehicle of the plurality of vehicles, and to obtain the distance between the preceding target vehicle and the own vehicle according to the depth image.
- the image acquisition unit is configured to:
- the RGB image is collected using a 2D camera installed in the vehicle, and the depth image is collected using a millimeter wave radar / distance sensor installed in the vehicle.
- the constraint box determination unit is configured to:
- the overlapping rate calculation unit is configured to:
- Car represents the bounding box area of the vehicle
- ROI represents the area of the target area box
- the distance calculation unit is configured to:
- Car.x represents the horizontal coordinate value of the center point of the constraint frame
- Car.y represents the vertical coordinate value of the center point of the constraint frame
- Target.x represents the abscissa value of a predetermined target point of the target area frame, and Target.y represents the ordinate value of the target point
- Car.width represents the width data of the vehicle
- Car.height represents the height data of the vehicle.
- the target area frame is a trapezoidal area frame directly in front of the vehicle selected from the RGB image, and the target point is a center point of the target area frame.
- the preceding vehicle target vehicle determination unit is configured to:
- the preceding vehicle distance detection system may further include:
- a motion filtering unit configured to sequentially perform median filtering, abnormal window detection processing, and Kalman filtering on the distance between the preceding target vehicle and the own vehicle within a time window of predetermined length centered on the current time, to obtain an optimized distance between the preceding target vehicle and the own vehicle,
- wherein the abnormal window detection process detects an abnormal window within the time window of predetermined length, and replaces the vehicle distance values in the abnormal window with a vehicle distance fitting result calculated from the vehicle distance values in the time windows before and after the abnormal window.
- Yet another aspect of the present disclosure provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the processor is caused to perform the following processing:
- determining a preceding target vehicle from the plurality of vehicles according to the overlap rate and the normalized distance corresponding to each of the plurality of vehicles, and obtaining the distance between the preceding target vehicle and the own vehicle according to the depth image.
- a preceding vehicle distance detection method and system capable of simultaneously ensuring ranging accuracy and positioning accuracy can be realized.
- FIG. 1 is a schematic flowchart of a method for detecting a distance ahead of a vehicle according to an embodiment of the present disclosure
- FIG. 2 is an exemplary structural block diagram of a preceding vehicle distance detection system according to an embodiment of the present disclosure
- FIG. 3 is a block diagram of an exemplary configuration of a computing device that can implement embodiments of the present disclosure.
- FIG. 1 is a schematic flowchart of a method for detecting a distance ahead of a vehicle according to an embodiment of the present disclosure.
- in step S101, an RGB image and a depth image of the front view angle of the host vehicle are acquired.
- the RGB image and the depth image of the front view of the vehicle can be collected in real time by an image acquisition unit installed in the vehicle (for example, the front of the vehicle).
- the RGB image may include all vehicles in the current perspective.
- the distance between each vehicle in the RGB image and the own vehicle can be extracted from the depth image.
- a depth camera installed in the vehicle may be used to simultaneously acquire RGB images and depth images of the front view angle.
- a 2D camera mounted on the vehicle can be used to collect RGB images in the front view
- a millimeter wave radar / distance sensor mounted on the front of the vehicle can be used to collect depth images in the front view.
- the depth camera may be an Intel RealSense active infrared stereo depth camera D435.
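- as an illustration of this acquisition step, the following minimal Python sketch (using the pyrealsense2 SDK) shows how aligned RGB and depth frames could be read from such a depth camera; the chosen resolutions and frame rate are assumptions of this sketch, not values specified by the disclosure, and the disclosure equally covers a 2D camera combined with a millimeter wave radar / distance sensor.

```python
import numpy as np
import pyrealsense2 as rs

# Sketch only: stream resolutions and frame rate are assumptions.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
align = rs.align(rs.stream.color)  # align the depth frame to the color frame

try:
    frames = pipeline.wait_for_frames()
    frames = align.process(frames)
    rgb_image = np.asanyarray(frames.get_color_frame().get_data())
    depth_image = np.asanyarray(frames.get_depth_frame().get_data())  # per-pixel depth (sensor units)
finally:
    pipeline.stop()
```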
- in step S102, a target area frame is set in the RGB image.
- the target area box indicates the area where the vehicle in front may appear.
- the target area frame may be set directly in front of the vehicle.
- the target area frame may be a trapezoidal area frame.
- a fixed point may be selected as a target point in the target area frame, and the target point may represent a desired position where the preceding vehicle appears.
- the target point may be the center point of the target area frame.
- the trapezoidal area directly in front of the depth camera or 2D camera can be selected from the RGB image as the target area frame; the center point of the selected target area frame is defined as the target point.
- the settings of the target area frame and the target point can also be fine-tuned according to the experience of the engineer. For example, when the depth camera is installed on the front left of the vehicle, the target area frame and target point can be set to the left of the center of the RGB image.
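- purely by way of illustration, the sketch below defines one possible trapezoidal target area frame and uses its center as the target point; the vertex fractions are assumptions chosen for this sketch and would in practice be tuned to the camera mounting, as described above.

```python
import numpy as np

def make_target_area(img_w, img_h):
    """Define a trapezoidal target area frame directly in front of the camera.

    The vertex fractions below are illustrative assumptions only.
    """
    roi = np.array([
        (0.35 * img_w, 0.55 * img_h),  # top-left
        (0.65 * img_w, 0.55 * img_h),  # top-right
        (0.90 * img_w, 0.95 * img_h),  # bottom-right
        (0.10 * img_w, 0.95 * img_h),  # bottom-left
    ], dtype=np.float32)
    target_point = roi.mean(axis=0)    # vertex centroid used as the target point
    return roi, target_point
```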
- in step S103, size data corresponding to a plurality of vehicles in the RGB image are extracted.
- a vehicle detection algorithm may be used to identify all vehicles in the RGB image, and correspondingly extract the size data of each vehicle.
- the size data of the vehicle may include width data, height data, and the like of the vehicle.
- a pre-trained vehicle detection model may be used to obtain all vehicles in the RGB image and identify the size data of each vehicle in the RGB image.
- the vehicle detection model can be obtained, for example, by training a target detection algorithm (such as Faster R-CNN, SSD, or YOLO) on the COCO data set.
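- as a hedged illustration of such a pre-trained detection model, the sketch below uses a COCO-pretrained Faster R-CNN from torchvision to return the position and size data of each detected vehicle; the score threshold and the choice of torchvision are assumptions of this sketch rather than requirements of the disclosure.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

VEHICLE_CLASS_IDS = {3, 6, 8}  # COCO labels: car, bus, truck

def detect_vehicles(rgb_image, score_thresh=0.5):
    """Return (top_left_x, top_left_y, width, height) for each detected vehicle."""
    with torch.no_grad():
        pred = model([to_tensor(rgb_image)])[0]
    vehicles = []
    for box, label, score in zip(pred["boxes"], pred["labels"], pred["scores"]):
        if int(label) in VEHICLE_CLASS_IDS and float(score) >= score_thresh:
            x1, y1, x2, y2 = box.tolist()
            vehicles.append((x1, y1, x2 - x1, y2 - y1))  # size data: width and height
    return vehicles
```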
- in step S104, for each of the plurality of vehicles, the constraint frame of the vehicle is determined in the RGB image based on the size data of the vehicle.
- a virtual coordinate system can be constructed in the RGB image.
- a virtual coordinate system can be constructed in the RGB image with the depth camera or the 2D camera as the origin. Then, the coordinates of the upper left corner of each vehicle in the RGB image are extracted based on the virtual coordinate system, and a rectangular constraint frame enclosing the vehicle is drawn in the RGB image according to the size data obtained for the corresponding vehicle.
- with the width of the vehicle in the RGB image as the width of the rectangular constraint frame and the height of the vehicle in the image as the height of the rectangular constraint frame, the rectangular constraint frame corresponding to the vehicle can be drawn quickly in the RGB image.
- the coordinates of other points of the vehicle in the RGB image may also be extracted to draw the constraint frame.
- the shape of the constraint frame is not limited to a rectangle, but may be any shape designed according to actual needs.
- in step S105, for each of the plurality of vehicles, the overlap rate of the constraint frame of the vehicle and the target area frame is calculated.
- the intersection-over-union of the constraint frame of the vehicle and the target area frame may be calculated from the area of the constraint frame and the area of the target area frame, and used as the overlap rate.
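- a minimal sketch of this overlap rate (intersection over union) between the vehicle constraint frame and the target area frame is given below; for simplicity the trapezoidal target area frame is approximated by its axis-aligned bounding rectangle, which is an assumption of the sketch.

```python
def overlap_rate(car_box, roi_box):
    """Intersection over union of two (x, y, width, height) boxes,
    where (x, y) is the top-left corner."""
    cx1, cy1, cw, ch = car_box
    rx1, ry1, rw, rh = roi_box
    cx2, cy2 = cx1 + cw, cy1 + ch
    rx2, ry2 = rx1 + rw, ry1 + rh

    inter_w = max(0.0, min(cx2, rx2) - max(cx1, rx1))
    inter_h = max(0.0, min(cy2, ry2) - max(cy1, ry1))
    inter = inter_w * inter_h
    union = cw * ch + rw * rh - inter
    return inter / union if union > 0 else 0.0
```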
- in step S106, for each of the plurality of vehicles, the normalized distance between the vehicle and the target area frame is calculated based on the size data of the vehicle, the position of the constraint frame of the vehicle in the RGB image, and the position of the target area frame in the RGB image.
- a normalized distance formula may be used
- Car.x represents the abscissa value of the center point of the constraint frame
- Car.y represents the ordinate value of the center point of the constraint frame
- Target.x Represents the abscissa value of the target
- Target.y represents the ordinate value of the target
- Car.width represents the width data of the vehicle
- Car.height represents the height data of the vehicle.
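- the normalized distance formula itself is not reproduced in this text; the sketch below therefore shows only one plausible interpretation consistent with the variables listed above (the offsets of the constraint frame center from the target point, scaled by the vehicle width and height), and should not be taken as the exact formula of the disclosure.

```python
import math

def normalized_distance(car_box, target_point):
    """One plausible normalized distance; car_box = (x, y, width, height)
    with (x, y) the top-left corner, target_point = (Target.x, Target.y)."""
    x, y, w, h = car_box
    car_x = x + w / 2.0                  # Car.x: abscissa of the constraint frame center
    car_y = y + h / 2.0                  # Car.y: ordinate of the constraint frame center
    dx = (car_x - target_point[0]) / w   # horizontal offset scaled by Car.width
    dy = (car_y - target_point[1]) / h   # vertical offset scaled by Car.height
    return math.sqrt(dx * dx + dy * dy)
```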
- in step S107, according to the overlap rate and the normalized distance corresponding to each of the plurality of vehicles, a preceding target vehicle is determined from the plurality of vehicles, and the distance between the preceding target vehicle and the own vehicle is obtained according to the depth image.
- an unsupervised front vehicle screening algorithm may be used to lock the target vehicle in front of the vehicle based on the overlap rate and normalized distance corresponding to each vehicle.
- the unsupervised preceding vehicle screening algorithm is as follows:
- the vehicle distance corresponding to the preceding vehicle target vehicle is extracted from the depth image as the vehicle distance between the preceding vehicle target vehicle and the own vehicle.
- the first overlap rate threshold, the second overlap rate threshold, and the distance threshold can be arbitrarily selected according to actual experience, and the disclosure does not limit the selection of the threshold.
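- the screening rule is not spelled out in this text; the sketch below shows one plausible threshold-based selection consistent with the first and second overlap rate thresholds and the distance threshold mentioned above, with illustrative threshold values, and is an assumption rather than the disclosed algorithm itself.

```python
def select_front_vehicle(candidates, iou_first=0.5, iou_second=0.2, dist_thresh=1.0):
    """candidates: list of dicts with keys 'iou' (overlap rate) and 'ndist'
    (normalized distance). Threshold values are illustrative only."""
    plausible = [c for c in candidates
                 if c["iou"] >= iou_first
                 or (c["iou"] >= iou_second and c["ndist"] <= dist_thresh)]
    if not plausible:
        return None                      # no preceding target vehicle in view
    # among plausible vehicles, keep the one closest to the target point
    return min(plausible, key=lambda c: c["ndist"])
```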
- the foregoing embodiment may also use a neural network front-vehicle screening algorithm to locate the preceding vehicle target vehicle.
- the specific method is:
- xi represents the features of a vehicle, a three-dimensional vector whose components include the overlap rate (intersection over union, IoU) and the normalized distance
- yi represents the screening result of the target vehicle in front
- the classifier automatically outputs the recognition result yi based on the three-dimensional vector xi. For example, when the output yi is 1, it indicates that the vehicle is the preceding target vehicle, and when the output yi is 0, it indicates that the vehicle is not the preceding target vehicle.
- the classifier can select different neural network frameworks to train the sample set, and the neural network frameworks can be AlexNet, VGG, etc.
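- as a hedged sketch of such a learned classifier, the example below trains a small multilayer perceptron on three-dimensional feature vectors xi with labels yi; the feature values, labels, and network size are illustrative assumptions, and a compact MLP is substituted here for image networks such as AlexNet or VGG because the input is a three-dimensional vector.

```python
from sklearn.neural_network import MLPClassifier

# Illustrative training samples only: each xi is a three-dimensional feature
# vector for one candidate vehicle, each yi is 1 (preceding target vehicle) or 0.
X_train = [[0.62, 0.15, 0.9], [0.05, 1.30, 0.1], [0.48, 0.22, 0.8], [0.10, 0.90, 0.2]]
y_train = [1, 0, 1, 0]

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

yi = clf.predict([[0.55, 0.20, 0.85]])[0]  # 1 -> preceding target vehicle, 0 -> not
```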
- a combination of the RGB image and the depth image is adopted, which can accurately locate the position of each vehicle in the RGB image and its distance from the own vehicle, and can simultaneously ensure ranging accuracy and positioning accuracy.
- an unsupervised front vehicle screening algorithm is used to lock onto the preceding target vehicle, and the distance to the preceding target vehicle is then obtained in combination with the depth image, achieving accurate and fast detection of the distance to the vehicle ahead.
- the distance between the target vehicle in front of the vehicle and the own vehicle can also be optimized.
- median filtering, abnormal window detection processing, and Kalman filtering may be performed sequentially on the distance between the preceding target vehicle and the own vehicle within a time window of predetermined length centered on the current time, to obtain the optimized distance between the preceding target vehicle and the own vehicle.
- the median filter is a commonly used time-series filtering algorithm.
- the isolated noise caused by the false detection of the target detection algorithm is similar to salt and pepper noise, showing the characteristics of pulses.
- the median filter can be used to remove it: within a time window of length Tn centered on the current time, the distances to the preceding target vehicle are sorted and the median is selected as the filtered distance at the current time. For example, setting Tn to 5 generally works well.
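- a minimal sketch of this sliding median filter over the per-frame distance values is given below, with the window length Tn defaulting to 5 as suggested above.

```python
import numpy as np

def median_filter_distance(distances, tn=5):
    """Apply a sliding median filter of window length tn to a list of
    per-frame distances to the preceding target vehicle."""
    half = tn // 2
    filtered = []
    for i in range(len(distances)):
        window = distances[max(0, i - half): i + half + 1]  # window centered on frame i
        filtered.append(float(np.median(window)))
    return filtered
```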
- Kalman filtering is an optimized autoregressive data processing algorithm. In a dynamic system whose state is approximately linear and whose measurements are disturbed by Gaussian noise, when both the state transition equation and the measurement variance are known, the measured values can be filtered. It can be applied in fields such as robot navigation, control, sensor data fusion, radar and missile tracking, and computer graphics processing.
- the state transition equation of Kalman filter is as follows:
- X(k) = A·X(k-1) + B·U(k) + W(k)
- X(k) represents the state variable of the system at time k, which in this embodiment is the distance to the preceding target vehicle and the speed of the target vehicle;
- A is the state transition matrix, which can be obtained using a first-order constant velocity model;
- B·U(k) represents the external control term;
- W(k) is the state change caused by other unknown disturbances; in the absence of further information, it can be replaced by Gaussian noise with a known variance. The larger this variance is set, the less confidence is placed in the state equation, that is, the higher the assumed randomness of the motion of the preceding distance.
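- the sketch below implements a minimal constant-velocity Kalman filter over the measured distances, with state X = [distance, relative speed]; the process and measurement noise variances are illustrative assumptions, not values from the disclosure, and the external control term B·U(k) is omitted.

```python
import numpy as np

def kalman_filter_distance(measurements, dt=1.0, q=1e-2, r=0.5):
    """Filter a sequence of measured distances with a first-order constant
    velocity model. q and r are illustrative process/measurement noise variances."""
    A = np.array([[1.0, dt], [0.0, 1.0]])  # state transition matrix (constant velocity)
    H = np.array([[1.0, 0.0]])             # only the distance is measured
    Q = q * np.eye(2)                      # covariance of the process noise W(k)
    R = np.array([[r]])                    # measurement noise variance

    x = np.array([[measurements[0]], [0.0]])  # initial state: first distance, zero speed
    P = np.eye(2)
    filtered = []
    for z in measurements:
        # predict
        x = A @ x
        P = A @ P @ A.T + Q
        # update with the new measurement
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        filtered.append(float(x[0, 0]))
    return filtered
```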
- the time window affected by false detections and missed detections is about 1-15 frames, exhibiting a spike-noise-like property.
- sequentially performing median filtering and abnormal window detection processing on the distance between the preceding target vehicle and the own vehicle can remove false detection and missed detection noise, and improve the accuracy of screening the preceding target vehicle.
- Kalman filtering is then performed on the distance between the preceding target vehicle and the own vehicle to obtain the optimized distance, which ensures the measurement accuracy of the vehicle distance.
- FIG. 2 is an exemplary structural block diagram of a preceding vehicle distance detection system according to an embodiment of the present disclosure.
- the system 200 may include a processing circuit 201.
- the processing circuit 201 of the system 200 provides various functions of the system 200.
- the processing circuit 201 of the system 200 may be configured to perform the preceding vehicle distance detection method described above with reference to FIG. 1.
- the processing circuit 201 may refer to various implementations of digital circuitry, analog circuitry, or mixed-signal (combination of analog and digital) circuitry that performs functions in a computing system.
- the processing circuit may include, for example, an integrated circuit (IC), an application specific integrated circuit (ASIC), part or all of an individual processor core, an entire processor core, an individual processor, a programmable hardware device such as a field programmable gate array (FPGA), and/or a system including multiple processors.
- the processing circuit 201 may include an image acquisition unit 202, a target area frame setting unit 203, a size data extraction unit 204, a constraint frame determination unit 205, an overlap rate calculation unit 206, a distance calculation unit 207, and a preceding target vehicle determination unit 208.
- the image acquisition unit 202 is configured to acquire an RGB image and a depth image of the front view angle of the vehicle; the target area frame setting unit 203 is configured to preset the target area frame in the RGB image; the size data extraction unit 204 is configured to extract size data corresponding to multiple vehicles in the RGB image;
- the constraint frame determination unit 205 is configured to determine the constraint frame of the vehicle in the RGB image according to the size data of the vehicle for each vehicle in the plurality of vehicles;
- the overlap rate calculation unit 206 is configured to calculate the overlap rate of the constraint frame of the vehicle and the target area frame for each of the plurality of vehicles;
- the distance calculation unit 207 is configured to, for each vehicle of the plurality of vehicles, calculate the normalized distance between the vehicle and the target area frame based on the size data of the vehicle, the position of the constraint frame of the vehicle in the RGB image, and the position of the target area frame in the RGB image;
- the preceding target vehicle determining unit 208 is configured to determine the preceding target vehicle from the plurality of vehicles according to the overlap rate and the normalized distance corresponding to each vehicle, and to obtain the distance between the preceding target vehicle and the own vehicle according to the depth image.
- the system 200 may also include memory (not shown).
- the memory of the system 200 may store the information generated by the processing circuit 201 and the programs and data used for the operation of the system 200.
- the memory may be volatile memory and / or non-volatile memory.
- the memory may include, but is not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), read only memory (ROM), and flash memory.
- the system 200 may be implemented at the chip level, or may also be implemented at the device level by including other external components.
- the system 200 may include a motion filtering unit (not shown) configured to sequentially perform median filtering, abnormal window detection processing, and Kalman filtering on the distance between the preceding target vehicle and the own vehicle within a time window of predetermined length centered on the current time, to obtain an optimized distance between the preceding target vehicle and the own vehicle, wherein the abnormal window detection processing detects an abnormal window within the time window of predetermined length, and replaces the vehicle distance values in the abnormal window with a vehicle distance fitting result calculated from the vehicle distance values in the time windows before and after the abnormal window.
- the above units are only logical modules divided according to the specific functions they implement, and are not intended to limit specific implementations. In actual implementation, the above units may be implemented as independent physical entities, or may be implemented by a single entity (for example, a processor (CPU or DSP, etc.), an integrated circuit, etc.).
- the preceding vehicle distance detection system provided by the embodiments of the present disclosure and the preceding vehicle distance detection method provided by the embodiments of the present disclosure belong to the same inventive concept; the system can execute the preceding vehicle distance detection method provided by any embodiment of the present disclosure and has the functional modules and beneficial effects corresponding to the detection method.
- for technical details not described exhaustively in this embodiment, reference may be made to the preceding vehicle distance detection method provided in the embodiments of the present disclosure, and details are not repeated here.
- FIG. 3 is a block diagram of an exemplary configuration of a computing device that can implement embodiments of the present disclosure.
- the computing device 300 is an example of a hardware device to which the above-mentioned aspects of the present disclosure can be applied.
- the computing device 300 may be any machine configured to perform processing and / or calculations.
- the computing device 300 may be, but not limited to, a workstation, server, desktop computer, laptop computer, tablet computer, personal data assistant (PDA), smart phone, in-vehicle computer, or combination thereof.
- the computing device 300 may include one or more elements that may connect or communicate with the bus 302 via one or more interfaces.
- the bus 302 may include, but is not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, etc.
- the computing device 300 may include, for example, one or more processors 304, one or more input devices 306, and one or more output devices 308.
- the one or more processors 304 may be any kind of processors, and may include, but are not limited to, one or more general-purpose processors or special-purpose processors (such as special-purpose processing chips).
- the processor 304 may correspond to, for example, the processing circuit 201 in FIG. 2 and is configured to implement the functions of the units of the preceding vehicle distance detection system of the present disclosure.
- the input device 306 may be any type of input device capable of inputting information to a computing device, and may include, but is not limited to, a mouse, keyboard, touch screen, microphone, and / or remote controller.
- the output device 308 may be any type of device capable of presenting information, and may include, but is not limited to, a display, a speaker, a video / audio output terminal, a vibrator, and / or a printer.
- the computing device 300 may also include or be connected to a non-transitory storage device 314, which may be any storage device that can implement non-transitory data storage, and may include, but is not limited to, disk drives, optical storage devices, solid-state memory, floppy disks, flexible disks, hard disks, magnetic tape or any other magnetic media, compact disks or any other optical media, cache memory, and/or any other memory chips or modules, and/or any other medium from which a computer can read data, instructions, and/or code.
- the computing device 300 may also include random access memory (RAM) 310 and read-only memory (ROM) 312.
- the ROM 312 may store programs, utilities, or processes to be executed in a non-volatile manner.
- the RAM 310 may provide volatile data storage and store instructions related to the operation of the computing device 300.
- the computing device 300 may also include a network / bus interface 316 coupled to the data link 318.
- the network / bus interface 316 may be any kind of device or system capable of enabling communication with external devices and/or networks, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth(TM) device, an 802.11 device, a WiFi device, a WiMax device, a cellular communication facility, etc.).
- a preceding vehicle distance detection device, including:
- one or more processors;
- a memory on which computer-executable instructions are stored, which when executed by the one or more processors cause the one or more processors to:
- determining a preceding target vehicle from the plurality of vehicles according to the overlap rate and the normalized distance corresponding to each of the plurality of vehicles, and obtaining the distance between the preceding target vehicle and the own vehicle according to the depth image.
- Solution 2 In the preceding vehicle distance detection device of Solution 1, the computer-executable instructions, when executed by the one or more processors, cause the one or more processors to:
- the RGB image is collected using a 2D camera installed in the vehicle, and the depth image is collected using a millimeter wave radar / distance sensor installed in the vehicle.
- Solution 3 In the preceding vehicle distance detection device of solution 1, the computer-executable instructions, when executed by the one or more processors, cause the one or more processors to:
- Solution 4 In the preceding vehicle distance detection device of solution 1, the computer-executable instructions, when executed by the one or more processors, cause the one or more processors to:
- Car represents the bounding box area of the vehicle
- ROI represents the area of the target area box
- Solution 5 In the preceding vehicle distance detection device of solution 1, the computer-executable instructions, when executed by the one or more processors, cause the one or more processors to:
- Car.x represents the horizontal coordinate value of the center point of the constraint frame
- Car.y represents the vertical coordinate value of the center point of the constraint frame
- Target.x represents the abscissa value of a predetermined target point of the target area frame, and Target.y represents the ordinate value of the target point
- Car.width represents the width data of the vehicle
- Car.height represents the height data of the vehicle.
- the target area frame is a trapezoidal area frame directly in front of the vehicle selected from the RGB image, and the target point is a center point of the target area frame .
- Solution 7 In the preceding vehicle distance detection device of Solution 1, the computer-executable instructions, when executed by the one or more processors, cause the one or more processors to:
- Solution 8 In the preceding vehicle distance detection device of solution 1, the computer-executable instructions, when executed by the one or more processors, cause the one or more processors to:
- the abnormal window detection process detects an abnormal window within the time window of predetermined length, and replaces the vehicle distance values in the abnormal window with a vehicle distance fitting result calculated from the vehicle distance values in the time windows before and after the abnormal window.
- the aforementioned embodiments may be embodied as computer-readable codes on a computer-readable medium.
- the computer-readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of computer-readable media include read-only memory, random access memory, CD-ROM, DVD, magnetic tape, hard disk drives, solid-state drives, and optical data storage devices.
- the computer-readable medium can also be distributed among network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
- Hardware circuits may include combinational logic circuits, clocked storage devices (such as flip-flops, latches, etc.), finite state machines, memories such as static random access memory or embedded dynamic random access memory, custom-designed circuits, programmable logic arrays, etc., in any combination.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
Abstract
The present invention relates to a method and a system for measuring the distance to a preceding vehicle. The method for measuring the distance to a preceding vehicle comprises: acquiring an RGB image and a depth image at a front viewing angle of a vehicle (S101); setting a target area frame in the RGB image (S102); extracting size data corresponding to a plurality of vehicles in the RGB image (S103); for each of the plurality of vehicles, determining a constraint frame of the vehicle in the RGB image according to the size data of the vehicle (S104), calculating an overlap rate of the constraint frame of the vehicle with the target area frame (S105), and calculating a normalized distance between the vehicle and the target area frame on the basis of the size data of the vehicle, the position of the constraint frame of the vehicle in the RGB image, and the position of the target area frame in the RGB image (S106); and, according to the overlap rate and the normalized distance corresponding to each of the plurality of vehicles, determining a preceding target vehicle from among the plurality of vehicles, and obtaining a vehicle distance between the preceding target vehicle and the vehicle according to the depth image (S107).
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11202010955SA SG11202010955SA (en) | 2018-11-15 | 2019-07-15 | Method and system for detecting distance to front vehicle |
JP2019563448A JP6851505B2 (ja) | 2018-11-15 | 2019-07-15 | 先行車との距離の検出方法及びシステム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811359075.3 | 2018-11-15 | ||
CN201811359075.3A CN109541583B (zh) | 2018-11-15 | 2018-11-15 | 一种前车距离检测方法及系统 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020098297A1 true WO2020098297A1 (fr) | 2020-05-22 |
Family
ID=65847562
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/095980 WO2020098297A1 (fr) | 2018-11-15 | 2019-07-15 | Procédé et système de mesure de distance par rapport à un véhicule de tête |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP6851505B2 (fr) |
CN (1) | CN109541583B (fr) |
SG (1) | SG11202010955SA (fr) |
WO (1) | WO2020098297A1 (fr) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112567439A (zh) * | 2020-11-09 | 2021-03-26 | 驭势(上海)汽车科技有限公司 | 一种交通流信息的确定方法、装置、电子设备和存储介质 |
CN113421298A (zh) * | 2021-06-17 | 2021-09-21 | 深圳市高格通讯技术有限公司 | 车辆测距方法、车辆控制装置、车辆及可读存储介质 |
CN113781665A (zh) * | 2020-07-28 | 2021-12-10 | 北京沃东天骏信息技术有限公司 | 一种标注信息的审核方法和装置 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109541583B (zh) * | 2018-11-15 | 2020-05-01 | 众安信息技术服务有限公司 | 一种前车距离检测方法及系统 |
JP7291505B2 (ja) * | 2019-03-19 | 2023-06-15 | 株式会社Subaru | 車外環境検出装置 |
CN110401786A (zh) * | 2019-04-24 | 2019-11-01 | 解晗 | 数字设备开关控制机构 |
CN112580402B (zh) * | 2019-09-30 | 2024-08-20 | 广州汽车集团股份有限公司 | 一种单目视觉行人测距方法及其系统、车辆、介质 |
CN110794397B (zh) * | 2019-10-18 | 2022-05-24 | 北京全路通信信号研究设计院集团有限公司 | 一种基于相机和雷达的目标检测方法及系统 |
CN111009166B (zh) * | 2019-12-04 | 2021-06-01 | 上海市城市建设设计研究总院(集团)有限公司 | 基于bim和驾驶模拟器的道路三维视距验算方法 |
CN111369824B (zh) * | 2020-01-22 | 2020-12-15 | 星汉智能科技股份有限公司 | 一种基于图像识别定位的引导泊车方法及系统 |
CN111746545A (zh) * | 2020-06-29 | 2020-10-09 | 中国联合网络通信集团有限公司 | 车距检测方法及装置,和车距提醒方法及装置 |
CN111931864B (zh) * | 2020-09-17 | 2020-12-25 | 南京甄视智能科技有限公司 | 基于顶点距离与交并比多重优化目标检测器的方法与系统 |
CN112241717B (zh) * | 2020-10-23 | 2021-11-16 | 北京嘀嘀无限科技发展有限公司 | 前车检测方法、前车检测模型的训练获取方法及装置 |
CN112949544A (zh) * | 2021-03-17 | 2021-06-11 | 上海大学 | 一种基于3d卷积网络的动作时序检测方法 |
CN117471483A (zh) * | 2023-09-25 | 2024-01-30 | 中国科学院自动化研究所 | 多传感器融合的车距计算方法及系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150219758A1 (en) * | 2014-01-31 | 2015-08-06 | Applied Concepts, Inc. | Mobile radar and visual tracking coordinate transformation |
CN105469052A (zh) * | 2015-11-25 | 2016-04-06 | 东方网力科技股份有限公司 | 一种车辆检测跟踪方法和装置 |
CN107202983A (zh) * | 2017-05-19 | 2017-09-26 | 深圳佑驾创新科技有限公司 | 基于图像识别和毫米波雷达融合的自动刹车方法和系统 |
CN107272021A (zh) * | 2016-03-30 | 2017-10-20 | 德尔福技术有限公司 | 使用雷达和视觉定义的图像检测区域的对象检测 |
CN107463890A (zh) * | 2017-07-20 | 2017-12-12 | 浙江零跑科技有限公司 | 一种基于单目前视相机的前车检测与跟踪方法 |
CN108764108A (zh) * | 2018-05-22 | 2018-11-06 | 湖北省专用汽车研究院 | 一种基于贝叶斯统计决策的前车检测方法 |
CN109541583A (zh) * | 2018-11-15 | 2019-03-29 | 众安信息技术服务有限公司 | 一种前车距离检测方法及系统 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100588902C (zh) * | 2006-12-19 | 2010-02-10 | 北京中星微电子有限公司 | 一种车距检测方法及装置 |
JP4968369B2 (ja) * | 2010-06-25 | 2012-07-04 | アイシン・エィ・ダブリュ株式会社 | 車載装置及び車両認識方法 |
CN104837007B (zh) * | 2014-02-11 | 2018-06-05 | 阿里巴巴集团控股有限公司 | 一种数字图像质量分级的方法和装置 |
US20160132728A1 (en) * | 2014-11-12 | 2016-05-12 | Nec Laboratories America, Inc. | Near Online Multi-Target Tracking with Aggregated Local Flow Descriptor (ALFD) |
JP6591188B2 (ja) * | 2015-03-30 | 2019-10-16 | 株式会社Subaru | 車外環境認識装置 |
JP6236039B2 (ja) * | 2015-06-26 | 2017-11-22 | 株式会社Subaru | 車外環境認識装置 |
JP6427611B2 (ja) * | 2017-02-28 | 2018-11-21 | 株式会社東芝 | 車両画像処理装置、及び、車両画像処理システム |
-
2018
- 2018-11-15 CN CN201811359075.3A patent/CN109541583B/zh active Active
-
2019
- 2019-07-15 SG SG11202010955SA patent/SG11202010955SA/en unknown
- 2019-07-15 WO PCT/CN2019/095980 patent/WO2020098297A1/fr active Application Filing
- 2019-07-15 JP JP2019563448A patent/JP6851505B2/ja active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150219758A1 (en) * | 2014-01-31 | 2015-08-06 | Applied Concepts, Inc. | Mobile radar and visual tracking coordinate transformation |
CN105469052A (zh) * | 2015-11-25 | 2016-04-06 | 东方网力科技股份有限公司 | 一种车辆检测跟踪方法和装置 |
CN107272021A (zh) * | 2016-03-30 | 2017-10-20 | 德尔福技术有限公司 | 使用雷达和视觉定义的图像检测区域的对象检测 |
CN107202983A (zh) * | 2017-05-19 | 2017-09-26 | 深圳佑驾创新科技有限公司 | 基于图像识别和毫米波雷达融合的自动刹车方法和系统 |
CN107463890A (zh) * | 2017-07-20 | 2017-12-12 | 浙江零跑科技有限公司 | 一种基于单目前视相机的前车检测与跟踪方法 |
CN108764108A (zh) * | 2018-05-22 | 2018-11-06 | 湖北省专用汽车研究院 | 一种基于贝叶斯统计决策的前车检测方法 |
CN109541583A (zh) * | 2018-11-15 | 2019-03-29 | 众安信息技术服务有限公司 | 一种前车距离检测方法及系统 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113781665A (zh) * | 2020-07-28 | 2021-12-10 | 北京沃东天骏信息技术有限公司 | 一种标注信息的审核方法和装置 |
CN112567439A (zh) * | 2020-11-09 | 2021-03-26 | 驭势(上海)汽车科技有限公司 | 一种交通流信息的确定方法、装置、电子设备和存储介质 |
CN112567439B (zh) * | 2020-11-09 | 2022-11-29 | 驭势(上海)汽车科技有限公司 | 一种交通流信息的确定方法、装置、电子设备和存储介质 |
CN113421298A (zh) * | 2021-06-17 | 2021-09-21 | 深圳市高格通讯技术有限公司 | 车辆测距方法、车辆控制装置、车辆及可读存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN109541583A (zh) | 2019-03-29 |
CN109541583B (zh) | 2020-05-01 |
JP6851505B2 (ja) | 2021-03-31 |
JP2021508387A (ja) | 2021-03-04 |
SG11202010955SA (en) | 2020-12-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020098297A1 (fr) | Method and system for measuring distance to a preceding vehicle | |
EP4044117A1 (fr) | Procédé et appareil de suivi de cible, dispositif électronique, et support de stockage lisible par ordinateur | |
CN110045376B (zh) | 可行驶区域获取方法、计算机可读存储介质及终端设备 | |
CN112292711B (zh) | 关联lidar数据和图像数据 | |
CN107272021B (zh) | 使用雷达和视觉定义的图像检测区域的对象检测 | |
CN110663060B (zh) | 一种用于表示环境元素的方法、装置、系统、以及车辆/机器人 | |
US10255673B2 (en) | Apparatus and method for detecting object in image, and apparatus and method for computer-aided diagnosis | |
EP3379509A1 (fr) | Appareil, procédé et dispositif de traitement d'image pour la détection de fumée | |
JP5353455B2 (ja) | 周辺監視装置 | |
CN108446622A (zh) | 目标物体的检测跟踪方法及装置、终端 | |
CN107038723A (zh) | 棒状像素估计方法和系统 | |
WO2022217630A1 (fr) | Procédé et appareil de détermination de la vitesse d'un véhicule, dispositif et support | |
TWI595450B (zh) | 物件偵測系統 | |
JP2017068700A (ja) | 物体検出装置、物体検出方法、及びプログラム | |
Pyo et al. | Front collision warning based on vehicle detection using CNN | |
Gluhaković et al. | Vehicle detection in the autonomous vehicle environment for potential collision warning | |
CN112505652B (zh) | 目标检测方法、装置及存储介质 | |
Meshram et al. | Traffic surveillance by counting and classification of vehicles from video using image processing | |
CN112711034A (zh) | 物体检测方法、装置及设备 | |
JP2014235743A (ja) | 深度画像に基づく手の位置確定方法と設備 | |
Romero-Cano et al. | Stereo-based motion detection and tracking from a moving platform | |
JP2016189084A (ja) | 車両状態判定装置 | |
EP3598175A1 (fr) | Système de détection d'objets | |
CN107356916B (zh) | 车距检测方法及装置、电子设备、计算机可读存储介质 | |
CN107563333A (zh) | 一种基于测距辅助的双目视觉手势识别方法和装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2019563448 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19884244 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19884244 Country of ref document: EP Kind code of ref document: A1 |