CN113687383A - Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system - Google Patents

Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system

Info

Publication number
CN113687383A
Authority
CN
China
Prior art keywords
sensing unit
radar
camera
laser
intensity information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110966701.0A
Other languages
Chinese (zh)
Inventor
张庆舜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Apollo Zhilian Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apollo Zhilian Beijing Technology Co Ltd filed Critical Apollo Zhilian Beijing Technology Co Ltd
Priority to CN202110966701.0A priority Critical patent/CN113687383A/en
Publication of CN113687383A publication Critical patent/CN113687383A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4804Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/71Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors

Abstract

The present disclosure provides a detection device containing a radar and a camera, a roadside sensing device, and an intelligent traffic system, and relates to the technical field of sensing equipment, in particular to the technical field of roadside sensing equipment in vehicle-road cooperation. The detection device includes: an emitting module for emitting laser light; a receiving module including a radar sensing unit and a camera sensing unit, the radar sensing unit receiving the reflected light and generating first intensity information, and the camera sensing unit receiving the reflected light and generating second intensity information; and a control and data processing module for controlling the emitting module to emit laser light and fusing the first intensity information with the second intensity information to generate fused intensity information. With the disclosed technology, fused intensity information of higher accuracy can be output, a good supplementary lighting effect can be achieved under poor illumination conditions, and no separate supplementary lighting device needs to be provided for the camera sensing unit, which reduces the equipment cost.

Description

Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system
Technical Field
The present disclosure relates to the technical field of intelligent transportation, and in particular to the technical field of roadside sensing equipment in vehicle-road cooperation.
Background
In the related art, road intersections are monitored at the roadside by combining a lidar with a surveillance camera. When the surveillance camera is used in scenes of extremely low illuminance, such as at night, a separate high-power supplementary lighting unit or laser is usually required to provide supplementary illumination, which results in a high hardware cost.
Disclosure of Invention
The disclosure provides a detection device comprising a radar and a camera, a roadside sensing device and an intelligent transportation system.
According to an aspect of the present disclosure, there is provided a detection apparatus including a radar and a camera, including:
the transmitting module is used for transmitting laser, and the laser forms reflected light after meeting an obstacle;
the receiving module comprises a radar sensing unit and a camera sensing unit, the radar sensing unit receives the reflected light and generates first intensity information, and the camera sensing unit receives the reflected light and generates second intensity information;
and the control and data processing module is used for controlling the transmitting module to transmit laser, and carrying out fusion processing on the first intensity information and the second intensity information to generate fusion intensity information.
According to another aspect of the present disclosure, there is provided a roadside sensing device including:
the detection device comprising the radar and the camera according to the embodiment of the disclosure.
According to another aspect of the present disclosure, there is provided an intelligent transportation system including:
the roadside sensing device according to the above embodiment of the present disclosure;
and the road side calculating unit is used for receiving the laser point cloud data from the road side sensing equipment and performing data calculating processing on the laser point cloud data.
With the disclosed technology, fused intensity information of higher accuracy can be output. In addition, the emitting module is used to provide supplementary illumination for the camera sensing unit, so a good supplementary lighting effect can be achieved under poor illumination conditions without providing a separate supplementary lighting device for the camera sensing unit, thereby reducing the equipment cost.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 shows a schematic structural diagram of a detection apparatus including a radar and a camera according to an embodiment of the present disclosure;
fig. 2 shows a schematic structural diagram of a detection apparatus including a radar and a camera according to another embodiment of the present disclosure.
Description of reference numerals:
a detection device 1;
a transmitting module 10; a laser emitting unit 11; a laser adjusting unit 12;
a receiving module 20; a radar sensing unit 21; a camera sensing unit 22;
a control and data processing module 30.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
A detection apparatus 1 including a radar and a camera according to an embodiment of the present disclosure is described below with reference to fig. 1 and 2.
As shown in fig. 1, the detection device 1 comprises a transmitting module 10, a receiving module 20 and a control and data processing module 30.
Specifically, the emitting module 10 is used for emitting laser light, and the laser light forms reflected light after encountering an obstacle. The receiving module 20 includes a radar sensing unit 21 and a camera sensing unit 22, the radar sensing unit 21 receiving the reflected light and generating first intensity information, and the camera sensing unit 22 receiving the reflected light and generating second intensity information. The control and data processing module 30 is used for controlling the transmitting module 10 to transmit laser, and performing fusion processing on the first intensity information and the second intensity information to generate fusion intensity information.
For example, the detecting device 1 may be applied to a roadside sensing apparatus installed in a roadside environment, and the radar sensing unit 21 and the camera sensing unit 22 are used for detecting the same target region in the roadside environment. It can be understood that the detection field of view of the radar sensing unit 21 at least partially coincides with the monitoring field of view of the camera sensing unit 22, and a region corresponding to the coinciding portion of the detection field of view and the monitoring field of view is the same target region in the roadside environment.
The transmission module 10 may be a semiconductor laser, and the transmission module 10 may be integrally provided with the radar sensing unit 21 or integrally provided with the camera sensing unit 22.
In the embodiment of the present disclosure, the radar sensing unit 21 and the camera sensing unit 22 are both photoelectric sensors, and their photoelectric principles are substantially similar. Both the radar sensing unit 21 and the camera sensing unit 22 use light as the sensing medium. Unlike the millimeter-wave radar in the related art, the radar sensing unit 21 in the embodiment of the present disclosure can acquire pixel-level raw data, that is, the radar sensing unit 21 can output the first intensity information of each laser point, and the camera sensing unit 22 can output the second intensity information of each pixel point.
For example, the radar sensing unit 21 may adopt an avalanche photodiode (APD) architecture, a single-photon avalanche diode (SPAD) array, a silicon photomultiplier (SiPM), or the like.
Illustratively, the camera sensing unit 22 may employ a Charge-coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS), or the like.
The control and data processing module 30 is configured to control the transmitting module 10 to transmit laser light, perform information synchronization processing on the radar sensing unit 21 and the camera sensing unit 22, and perform fusion calculation processing according to the first intensity information and the second intensity information.
For example, before the fusion calculation is performed on the first intensity information and the second intensity information, intrinsic and extrinsic calibration may be performed on the radar sensing unit 21 and the camera sensing unit 22 to calibrate and align the detection field of view of the radar sensing unit 21 with the monitoring field of view of the camera sensing unit 22, so as to determine the correspondence between each laser point in the radar sensing data and each pixel point in the camera sensing data.
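The disclosure does not prescribe a particular projection implementation; the following is a minimal sketch of how the laser-point-to-pixel correspondence could be established, assuming a pinhole camera model with a hypothetical intrinsic matrix K and extrinsic parameters R and t obtained from such a calibration.

```python
import numpy as np

def project_laser_points(points_lidar, K, R, t):
    """Project 3-D laser points (N x 3, lidar frame) into camera pixel
    coordinates, using assumed extrinsics (R, t) and intrinsics (K)."""
    pts_cam = points_lidar @ R.T + t       # lidar frame -> camera frame
    in_front = pts_cam[:, 2] > 0           # keep only points in front of the camera
    uvw = pts_cam[in_front] @ K.T          # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]          # perspective division
    return np.round(uv).astype(int), in_front
```

Each projected laser point then maps to the camera pixel it falls on, so that the first intensity information and the second intensity information refer to the same scene point.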
The control and data processing module 30 time-synchronizes the received radar sensing data and camera sensing data. Based on the synchronized first intensity information and second intensity information, the first intensity information of each laser point and the second intensity information of the corresponding pixel point are weighted and superposed by using a predetermined weight coefficient to obtain intensity fusion information. The intensity fusion information I' can be calculated according to the following formula:
I' = a·I1 + b·I2
where I1 denotes the first intensity information, I2 denotes the second intensity information, a denotes the first weight coefficient corresponding to the first intensity information, and b denotes the second weight coefficient corresponding to the second intensity information. The first weight coefficient a and the second weight coefficient b can be determined through dynamic adjustment by a deep learning algorithm.
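As a concrete illustration of this weighted superposition, the sketch below fuses per-point intensities with fixed example weights; the weight values (and the use of fixed rather than dynamically learned weights) are assumptions made only for illustration.

```python
import numpy as np

def fuse_intensity(i1, i2, a=0.4, b=0.6):
    """Weighted superposition I' = a*I1 + b*I2 of the radar intensity I1 and
    the camera intensity I2 for corresponding points; a and b are example weights."""
    return a * np.asarray(i1, dtype=float) + b * np.asarray(i2, dtype=float)

# One radar intensity per laser point and the intensity of the camera pixel
# each point projects onto (values are illustrative, normalized to [0, 1]).
fused = fuse_intensity([0.30, 0.85, 0.12], [0.35, 0.80, 0.20])
print(fused)   # -> approximately [0.33, 0.82, 0.168]
```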
In one specific example, the transmitting module 10 transmits a laser pulse, the laser pulse forms reflected light after encountering an obstacle, and the radar sensing unit 21 and the camera sensing unit 22 respectively receive the reflected light. After receiving the reflected light, the radar sensing unit 21 generates distance information and intensity information; the camera sensing unit 22 may adopt a photoelectric accumulation type device, which converts the optical signal into an electrical signal through a photoelectric conversion element, so that more accurate intensity information can be obtained according to the strength of the electrical signal. However, since the radar sensing unit 21 adopts a pulse ranging method, the laser pulse may fluctuate during detection, and the error of its intensity information may reach 50%. The intensity fusion information is therefore obtained by combining the first intensity information and the second intensity information, and point cloud data with high accuracy can be output based on the intensity fusion information and the distance information.
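Purely for illustration, the sketch below assembles such point cloud records from the radar distance information and the fused intensity values; the spherical-to-Cartesian conversion and the field layout are assumptions rather than part of the disclosure.

```python
import numpy as np

def build_point_cloud(ranges, azimuths_deg, elevations_deg, fused_intensity):
    """Combine per-point range measurements from the radar sensing unit with
    fused intensity values into an (N x 4) point cloud [x, y, z, I']."""
    r = np.asarray(ranges, dtype=float)
    az = np.radians(azimuths_deg)
    el = np.radians(elevations_deg)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.column_stack([x, y, z, np.asarray(fused_intensity, dtype=float)])
```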
According to the detection apparatus 1 of the embodiment of the present disclosure, the first intensity information output by the radar sensing unit 21 and the second intensity information output by the camera sensing unit 22 are fused to obtain intensity fusion information, and the distance information obtained by the radar sensing unit 21 is output together with the intensity fusion information, so that point cloud data with high accuracy is obtained. Therefore, the detection apparatus 1 of the embodiment of the present disclosure has high detection accuracy and can accurately output the intensity value of each laser point, thereby improving the identification accuracy of target objects in the target area. In addition, the emitting module is used to provide supplementary illumination for the camera sensing unit, so a good supplementary lighting effect can be achieved under poor illumination conditions without providing a separate supplementary lighting device for the camera sensing unit, thereby reducing the equipment cost.
As shown in fig. 1, in one embodiment, the radar sensing unit 21 and the camera sensing unit 22 are separately provided.
In other words, the radar sensing unit 21 and the camera sensing unit 22 may be two separately disposed sensing elements, and the radar sensing unit 21 and the camera sensing unit 22 operate independently without interfering with each other. The radar sensing unit 21 and the camera sensing unit 22 respectively receive the reflected light through respective optical receiving elements.
The radar sensing unit 21 and the camera sensing unit 22 are respectively provided with a synchronous interface, and the radar sensing unit 21 and the camera sensing unit 22 realize communication through respective synchronous interfaces, so that hardware-level synchronization between the radar sensing unit 21 and the camera sensing unit 22 is realized. The synchronous interface may be an Input/Output (I/O) interface.
The control and data processing module 30 performs signal synchronization on the radar sensing unit 21 and the camera sensing unit 22 so that the time information of the radar sensing data is consistent with that of the camera sensing data. Then, fusion processing is performed on the radar sensing data and the camera sensing data, that is, the first intensity information output by the radar sensing unit 21 and the second intensity information output by the camera sensing unit 22 are weighted and superposed to obtain the intensity fusion information.
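One possible software-side counterpart of this synchronization, sketched under the assumption that each sensor delivers timestamped frames (the frame representation, field name, and tolerance below are illustrative), is to pair radar and camera frames by nearest timestamp before fusing their intensities.

```python
def pair_frames(radar_frames, camera_frames, max_dt=0.005):
    """Pair each radar frame with the camera frame closest in time, keeping
    only pairs whose timestamps differ by at most max_dt seconds.
    Each frame is assumed to be a dict with a 'timestamp' key (in seconds)."""
    if not camera_frames:
        return []
    pairs = []
    for rf in radar_frames:
        cf = min(camera_frames, key=lambda f: abs(f["timestamp"] - rf["timestamp"]))
        if abs(cf["timestamp"] - rf["timestamp"]) <= max_dt:
            pairs.append((rf, cf))
    return pairs
```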
In a specific example, the radar sensing unit 21 and the camera sensing unit 22 may be disposed in a roadside environment; for example, they may be mounted adjacent to each other on a support pole of a certain height at the roadside to detect and monitor the roadside environment. More specifically, the detection apparatus 1 may employ a laser radar, which includes the transmitting module 10 and the radar sensing unit 21, together with a camera device.
Through the above embodiment, the radar sensing unit 21 and the camera sensing unit 22 can work independently without interfering with each other, and optical crosstalk between the radar sensing unit 21 and the camera sensing unit 22 can be avoided.
In one embodiment, the radar sensing unit 21 comprises a single photon avalanche diode array and the camera sensing unit 22 comprises a photo accumulation type device.
It will be appreciated that a single-photon avalanche diode is an avalanche photodiode operating in Geiger mode. In Geiger mode, the bias voltage of the avalanche photodiode is higher than its breakdown voltage; after receiving a photon, the avalanche photodiode enters a reverse breakdown state and passes a large reverse current, thereby realizing photon detection.
Illustratively, the single-photon avalanche diode can be fabricated by a complementary metal-oxide-semiconductor (CMOS) process, which allows large-volume production of highly integrated single-photon avalanche diode arrays in which an individual single-photon avalanche diode measures only 15 μm × 15 μm. Therefore, while ensuring that the radar sensing unit 21 has high pixel-level detection capability, the size of the radar sensing unit 21 can be further reduced.
In other examples of the present disclosure, the radar sensing unit 21 may also be a silicon photomultiplier (SiPM). Specifically, a silicon photomultiplier is composed of a plurality of (several to several thousand) avalanche photodiode units; each unit is formed by connecting an avalanche photodiode in series with a high-resistance quenching resistor, and the avalanche photodiode operates in Geiger mode so as to form a single-photon avalanche diode. The avalanche photodiode units are connected in parallel to form a surface receiving array. Since the silicon photomultiplier has a large dynamic range, the detection accuracy of the radar sensing unit 21 can be improved. In addition, the radar sensing unit 21 may be packaged with a plurality of silicon photomultipliers to form a plurality of surface receiving arrays, so as to improve the reception of the reflected laser pulse signals and reduce the probability of missing laser pulse signals.
Illustratively, the photoelectric accumulation type device may be a charge-coupled device (CCD). A charge-coupled device is a detecting element that represents the signal magnitude by the amount of charge and transfers the signal by charge coupling. Specifically, the charge-coupled device includes photodiodes, a shift signal register, a parallel signal register, a signal amplifier, and an analog-to-digital converter. The shift signal register temporarily stores the charges generated after the photodiodes are exposed to light, the parallel signal register temporarily stores the analog signals transferred from the shift register, the signal amplifier amplifies the weak electrical signals, and the analog-to-digital converter converts the amplified electrical signals into digital signals. Thus, the second intensity information can be obtained from the digital signal output by the CCD.
According to the above embodiment, using a single-photon avalanche diode array as the radar sensing unit 21 improves the resolution and integration level of the radar sensing unit 21, meets the laser detection requirement of high beam density, reduces the probability of missing laser pulse signals, greatly improves the fill factor of the radar sensing unit 21, and enables detection of small, distant target objects. Secondly, by using a photoelectric accumulation type device as the camera sensing unit 22, the camera sensing unit 22 can output the second intensity information, and the output intensity values are more accurate. Furthermore, by providing an emitting module 10 shared by the radar sensing unit 21 and the camera sensing unit 22, the emitting module 10 can be used to provide supplementary illumination for the camera sensing unit 22, so that a good supplementary lighting effect can be achieved under poor illumination conditions without providing a separate supplementary lighting device for the camera sensing unit 22, thereby reducing the equipment cost.
As shown in fig. 2, in one embodiment, the radar sensing unit 21 and the camera sensing unit 22 are integrally provided.
Here, the radar sensing unit 21 and the camera sensing unit 22 share one optical receiving element, that is, the radar sensing unit 21 and the camera sensing unit 22 simultaneously receive the reflected light through a single optical receiving element.
Through the above embodiment, the integration level of the receiving module 20 is improved, and the overall size of the receiving module 20 is reduced, so that the overall size of the detecting device 1 is reduced, and the application range of the detecting device 1 is enlarged.
In one embodiment, the radar sensing unit 21 comprises an array of single-photon avalanche diodes and the camera sensing unit 22 comprises a plurality of PIN photodiodes, with a single single-photon avalanche diode packaged together with a single PIN photodiode to form a single pixel.
It is understood that a PIN photodiode, also called a PIN junction diode or PIN diode, is a photodetector in which an intrinsic (I-type) layer is formed between the P region and the N region of a PN junction between two semiconductors, or in the vicinity of a junction between a semiconductor and a metal, and which absorbs light radiation to generate a photocurrent. It has advantages such as small junction capacitance, short transit time, and high sensitivity. The magnitude of the electrical signal output by the PIN photodiode is proportional to the intensity of the optical signal; therefore, the intensity value can be obtained from the magnitude of the current output by the PIN photodiode.
For example, the radar sensing units 21 and the camera sensing units 22 may be regularly arrayed to form a sensing area array, and the sensing area array may include 128 × 128 pixel units, where each pixel unit is formed by packaging one single-photon avalanche diode together with one PIN photodiode.
In other examples of the present disclosure, the receiving module 20 may also adopt a combination of a silicon photomultiplier and a plurality of PIN photodiodes, wherein the silicon photomultiplier is disposed adjacent to the plurality of PIN photodiodes arranged in an array, and each respectively receives the reflected light.
According to the above embodiment, by integrally disposing the single-photon avalanche diodes and the PIN diodes to form a plurality of detection pixels, the integration level of the radar sensing unit 21 and the camera sensing unit 22 is improved and the external size of the detection apparatus 1 is further reduced; in addition, the detection sensitivity is improved and the response time is shortened.
In one embodiment, the emitting module 10 includes a laser emitting unit 11 and a laser adjusting unit 12, and the laser adjusting unit 12 is configured to optically shape or optically deflect the laser light emitted from the laser emitting unit 11 so that the laser light covers a preset area.
Illustratively, the laser emitting unit 11 may employ a vertical-cavity surface-emitting laser (VCSEL) or a semiconductor laser diode (LD). Preferably, the laser emitting unit 11 employs a vertical-cavity surface-emitting laser. The vertical-cavity surface-emitting laser has advantages such as high output power, high conversion efficiency, and high beam quality, and can improve the sensing accuracy of the radar sensing unit 21 or the camera sensing unit 22, reduce the operating power consumption, and improve the working reliability.
In one example, the laser adjusting unit 12 may employ a scanning galvanometer, which can reflect, refract, or diffract the laser light emitted by the laser emitting unit 11 so that the laser light covers a preset target detection area. It can be understood that a scanning galvanometer is a micro-actuated mirror based on micro-electro-mechanical systems (MEMS) technology, with a mirror diameter of usually only a few millimeters. By providing the scanning galvanometer, the emitting module 10 has high controllability, that is, the laser can be steered in any direction within an allowable angular range, and high-density scanning can be performed in a key area.
In the embodiment of the present disclosure, the scanning galvanometer may adopt various types of galvanometers known by those skilled in the art now or in the future, for example, resonant galvanometers and quasi-static galvanometers, as long as the laser emitted by the laser emitting unit 11 can be adjusted to cover the preset area.
In another example, the laser adjusting unit may employ an optical phased array (OPA). It can be understood that the basic principle of an optical phased array is similar to that of a microwave phased array. The optical phased array includes a plurality of optical beam splitters, and the laser emitted by the laser emitting unit 11 is divided into a plurality of optical signals by the beam splitters. When there is no phase difference between the optical signals, the light reaches the equiphase plane at the same time and propagates forward without mutual interference, so no beam deflection occurs. After a phase difference is applied to each optical signal, the equiphase plane is no longer perpendicular to the waveguide direction but is deflected by a certain angle; beams satisfying the equiphase relation reinforce each other, beams not satisfying it cancel each other out, and the direction of the resulting beam is always perpendicular to the equiphase plane. Thereby, the emission direction of the laser light can be adjusted by optically shaping the laser light emitted by the laser emitting unit 11.
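As a numerical aside that is not taken from the disclosure, the steering angle θ of a one-dimensional optical phased array with element pitch d and a uniform phase increment Δφ between adjacent elements follows the standard relation sin θ = λ·Δφ / (2π·d); the wavelength and geometry below are assumed example values.

```python
import numpy as np

wavelength = 905e-9      # assumed lidar wavelength, metres
pitch = 2e-6             # assumed emitter spacing d, metres
delta_phi = np.pi / 8    # assumed phase increment between adjacent emitters, radians

theta = np.arcsin(wavelength * delta_phi / (2 * np.pi * pitch))
print(f"beam steered by {np.degrees(theta):.2f} degrees")   # ~1.62 degrees
```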
Through the above embodiment, compared with a scanning structure that drives the laser emitting unit 11 to rotate in the related art, the emitting module 10 of the detection device 1 of the embodiment of the present disclosure has the advantages of simple structure and small size, and is beneficial to improving the integration level of the detection device 1 and reducing the hardware cost of the detection device 1.
In one embodiment, the laser emitting unit 11 employs a line laser or a surface laser.
It is understood that a line laser refers to a plurality of lasers arranged in a straight line; the line-shaped laser light emitted by the line laser is deflected in a predetermined direction by the scanning galvanometer, so that it forms a scanning surface in the preset area. A surface laser refers to a plurality of lasers arranged in an area array; the laser light emitted by the surface laser directly forms a scanning surface in the preset area, and the scanning galvanometer makes this scanning surface completely cover the preset area.
Therefore, the laser emission power and intensity of the emission module 10 can be improved, and the distance measurement capability of the sensing unit can be improved.
Alternatively, the laser emitting unit 11 includes a plurality of emitting areas, and the control and data processing module 30 controls each emitting area to emit laser light individually.
Illustratively, the detection field of view of the radar sensing unit 21 is formed by splicing together a plurality of detection areas, and the plurality of emitting areas correspond to the plurality of detection areas one to one. The laser emitting unit 11 includes a plurality of laser emitting subunits, each of which forms a corresponding emitting area, and the laser emitting subunits are spliced together to form surface laser light. The control and data processing module 30 may control one or more of the emitting areas to emit laser light, and the radar sensing unit 21 receives the reflected light with the detection area corresponding to that emitting area.
Through this embodiment, the detection field of view can be detected in sub-areas, and each detection area can be controlled independently, so that a specific target area within the detection field of view can be detected separately; by fusing the intensity values obtained by the radar sensing unit 21 and the camera sensing unit 22 for that specific target area, its detection accuracy can be improved in a targeted manner.
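The sketch below shows one way such per-region emission control could look in software; the class and method names are hypothetical and only illustrate addressing emitting areas that map one-to-one onto detection areas.

```python
from dataclasses import dataclass

@dataclass
class EmitterRegion:
    """One emitting area of the laser emitting unit, mapped one-to-one to a
    detection area of the radar sensing unit (indices are illustrative)."""
    region_id: int
    enabled: bool = False

class EmissionController:
    def __init__(self, num_regions=4):
        self.regions = [EmitterRegion(i) for i in range(num_regions)]

    def fire(self, region_ids):
        """Enable only the selected emitting areas, e.g. to scan a specific
        target area of the detection field of view at higher density."""
        wanted = set(region_ids)
        for region in self.regions:
            region.enabled = region.region_id in wanted
        return [region.region_id for region in self.regions if region.enabled]

# Example: emit only in regions 1 and 2 to concentrate on a key area.
print(EmissionController().fire([1, 2]))   # -> [1, 2]
```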
In one embodiment, the radar sensing unit 21 is further configured to receive the reflected light and generate first distance information, and the camera sensing unit 22 is further configured to receive the reflected light and generate second distance information; the control and data processing module 30 performs a fusion process based on the first distance information and the second distance information and generates distance fusion information.
Illustratively, the camera sensing unit 22 may further include an infrared receiver, and the transmitting module 10 may include an infrared transmitter. The control and data processing module 30 controls the infrared transmitter to transmit infrared light, and the infrared receiver receives the infrared light and generates a corresponding electrical signal and outputs second distance information.
The control and data processing module 30 obtains the distance fusion information by weighted superposition of the first distance information output by the radar sensing unit 21 and the second distance information output by the camera sensing unit 22. The weight coefficients corresponding to the first distance information and the second distance information can be determined through dynamic adjustment by a deep learning algorithm.
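Analogously to the intensity fusion above, this distance fusion can be sketched as a weighted combination; the fixed weights below merely stand in for the dynamically adjusted coefficients mentioned in the disclosure.

```python
def fuse_distance(d1, d2, w1=0.7, w2=0.3):
    """Weighted fusion of the first distance information (radar) and the
    second distance information (camera/infrared); w1 and w2 are example weights."""
    return w1 * d1 + w2 * d2

print(fuse_distance(25.40, 25.90))   # -> approximately 25.55 (metres, illustrative)
```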
Through the above embodiment, the detection apparatus 1 according to the embodiment of the present disclosure can output not only the fused intensity information but also distance fusion information obtained from the distance information output by each of the radar sensing unit 21 and the camera sensing unit 22. Therefore, the detection accuracy of the distance value is improved, and the detection apparatus 1 can accurately detect the target area.
As another aspect of the disclosed embodiments, a roadside sensing device is also provided. The roadside sensing device includes the detection apparatus 1 including the radar and the camera according to the above-described embodiment of the present disclosure.
Illustratively, the roadside sensing device includes a base body, and the detection apparatus 1 is arranged on the base body. A control unit and a power supply module are integrated inside the base body. The control unit is used for sending synchronization signals to the radar sensing unit 21 and the camera sensing unit 22 respectively, so as to realize signal synchronization between the radar sensing unit 21 and the camera sensing unit 22, and the power supply module is used for supplying power to the emitting module 10 and the receiving module 20 respectively.
According to the roadside sensing device of the embodiment of the present disclosure, using the detection apparatus 1 according to the embodiment of the present disclosure helps improve the detection accuracy of the roadside environment; in particular, in scenes with poor illumination conditions, intensity values with higher accuracy can be output, while the roadside sensing device has higher integration and lower equipment cost.
For other configurations of the roadside sensing device of the above embodiments, various technical solutions known to those skilled in the art now or in the future may be adopted, and they are not described in detail here.
As another aspect of the disclosed embodiment, an intelligent transportation system is also provided.
The intelligent transportation system comprises the road side sensing equipment and a road side calculating unit according to the embodiment of the disclosure, wherein the road side calculating unit is used for receiving laser point cloud data from the road side sensing equipment and performing data calculating processing on the laser point cloud data.
Illustratively, the roadside computing unit may be an edge computing unit configured to receive the fused intensity information, the fused distance information, and the point cloud information sent by the roadside sensing device and execute corresponding decision processing to obtain relevant information about target objects in the target environment, so as to implement functions such as predictive perception, path planning, and early warning for the target objects.
The intelligent transportation system can further comprise a cloud server and a vehicle-end server, and any two of the roadside computing unit, the cloud server and the vehicle-end server can perform information interaction.
In the description of the present specification, it is to be understood that the terms "center," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "radial," "circumferential," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the present disclosure and to simplify the description, but are not intended to indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and are therefore not to be considered limiting of the present disclosure.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present disclosure, "a plurality" means two or more unless specifically limited otherwise.
In the present disclosure, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integral; the connection can be mechanical connection, electrical connection or communication; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meaning of the above terms in the present disclosure can be understood by those of ordinary skill in the art as appropriate.
In the present disclosure, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that the first and second features are in contact not directly but via another feature between them. Moreover, the first feature being "on," "above," or "over" the second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature. The first feature being "under," "below," or "beneath" the second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The above disclosure provides many different embodiments or examples for implementing different features of the disclosure. In order to simplify the disclosure of the present disclosure, specific example components and arrangements are described above. Of course, they are merely examples and are not intended to limit the present disclosure. Moreover, the present disclosure may repeat reference numerals and/or reference letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements discussed.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (11)

1. A detection device including a radar and a camera, comprising:
the transmitting module is used for transmitting laser, and the laser forms reflected light after meeting an obstacle;
the receiving module comprises a radar sensing unit and a camera sensing unit, the radar sensing unit receives the reflected light and generates first intensity information, and the camera sensing unit receives the reflected light and generates second intensity information;
and the control and data processing module is used for controlling the transmitting module to transmit laser, performing fusion processing on the first intensity information and the second intensity information and generating fusion intensity information.
2. The detection device including a radar and a camera of claim 1, wherein the radar sensing unit and the camera sensing unit are integrally disposed.
3. The radar-and-camera detection apparatus of claim 2, wherein the radar sensing unit comprises an array of single photon avalanche photodiodes and the camera sensing unit comprises a plurality of PIN photodiodes, a single said avalanche photodiode being packaged with a single said PIN photodiode and forming a single pixel.
4. The detection device including a radar and a camera of claim 1, wherein the radar sensing unit and the camera sensing unit are provided separately.
5. The radar-and-camera detection apparatus of claim 4, wherein the radar sensing unit comprises a single photon avalanche diode array and the camera sensing unit comprises a photo-accumulation-type device.
6. The radar and camera based detection apparatus of claim 1, wherein the emitting module comprises a laser emitting unit and a laser adjusting unit, and the laser adjusting unit is configured to optically shape or optically deflect the laser emitted by the laser emitting unit so that the laser covers a predetermined area.
7. The radar and camera based detection apparatus of claim 6, wherein the laser emitting unit is a line laser or a surface laser.
8. The radar and camera based detection apparatus of claim 6, wherein the laser emitting unit comprises a plurality of emitting areas, and the control and data processing module controls each emitting area to emit laser light individually.
9. The radar-and-camera detection apparatus of any one of claims 1 to 8, wherein the radar sensing unit is further configured to receive the reflected light and generate first range information, and the camera sensing unit is further configured to receive the reflected light and generate second range information; and the control and data processing module performs fusion processing based on the first distance information and the second distance information and generates distance fusion information.
10. A roadside sensing device characterized by comprising:
the detection device including a radar and a camera as claimed in any one of claims 1 to 9.
11. An intelligent transportation system, comprising:
the roadside sensing apparatus of claim 10;
and the road side calculating unit is used for receiving the laser point cloud data from the road side sensing equipment and performing data calculating processing on the laser point cloud data.
CN202110966701.0A 2021-08-23 2021-08-23 Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system Pending CN113687383A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110966701.0A CN113687383A (en) 2021-08-23 2021-08-23 Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110966701.0A CN113687383A (en) 2021-08-23 2021-08-23 Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system

Publications (1)

Publication Number Publication Date
CN113687383A true CN113687383A (en) 2021-11-23

Family

ID=78581353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110966701.0A Pending CN113687383A (en) 2021-08-23 2021-08-23 Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system

Country Status (1)

Country Link
CN (1) CN113687383A (en)

Citations (7)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1695155A (en) * 2002-08-28 2005-11-09 BAE Systems Aircraft Controls Inc. Image fusion system and method
CN105730427A (en) * 2014-12-26 2016-07-06 丰田自动车株式会社 Vehicle braking control apparatus
CN107219533A (en) * 2017-08-04 2017-09-29 清华大学 Laser radar point cloud and image co-registration formula detection system
EP3726247A1 (en) * 2019-04-18 2020-10-21 Veoneer Sweden AB Lidar imaging apparatus for a motor vehicle
CN111458714A (en) * 2020-04-13 2020-07-28 山东工商学院 Medium-and-far-infrared thermal imaging laser illumination system and method
CN112912766A (en) * 2021-02-02 2021-06-04 华为技术有限公司 Detection device, control method, fusion detection system and terminal
CN112731443A (en) * 2021-02-08 2021-04-30 山东大学 Three-dimensional imaging system and method for fusing single photon laser radar and short wave infrared image

Similar Documents

Publication Publication Date Title
US9638520B2 (en) Measuring apparatus and measuring device for measuring a target object in a multidimensional manner
US20190257924A1 (en) Receive path for lidar system
US8946637B2 (en) Compact fiber-based scanning laser detection and ranging system
WO2021196194A1 (en) Laser emitting-and-receiving system, laser radar and automatic driving apparatus
CN103443648A (en) Measurement device for measuring distance between the measurement device and target object using optical measurement beam
CN214795207U (en) Solid state lidar
US20230022688A1 (en) Laser distance measuring device, laser distance measuring method, and movable platform
CN112805595B (en) Laser radar system
CN109444850A (en) Phased-array laser radar
CN112068150A (en) Laser radar and ranging method
WO2022083198A1 (en) Multi-line scanning distance measurement system
CN106772426B (en) System for realizing remote laser high-sensitivity single photon imaging
US20210382147A1 (en) Lidar and detection apparatus thereof
CN114488173A (en) Distance detection method and system based on flight time
US20240069162A1 (en) Solid-state lidar and method for detection using same
CN209590262U (en) Phased-array laser radar
CN113687383A (en) Detection device containing radar and camera, roadside sensing equipment and intelligent traffic system
CN215264040U (en) Roadside sensing equipment and intelligent transportation system
CN110333500B (en) Multi-beam laser radar
CN110346779B (en) Measuring method for time channel multiplexing of multi-beam laser radar
CN115201844A (en) Solid-state laser radar and detection method using same
CN110726983A (en) Laser radar
CN111308498A (en) Three-dimensional imaging laser radar device
CN114930191A (en) Laser measuring device and movable platform
CN215678765U (en) Hybrid TOF sensor system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination