CN117201728A - Radar video all-in-one - Google Patents

Radar video all-in-one

Info

Publication number
CN117201728A
CN117201728A (application CN202310927510.2A)
Authority
CN
China
Prior art keywords
radar
traffic
camera
module
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310927510.2A
Other languages
Chinese (zh)
Inventor
隋永吉
杜子建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunkong Zhixing Technology Co Ltd
Original Assignee
Yunkong Zhixing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunkong Zhixing Technology Co Ltd filed Critical Yunkong Zhixing Technology Co Ltd
Priority to CN202310927510.2A priority Critical patent/CN117201728A/en
Publication of CN117201728A publication Critical patent/CN117201728A/en
Pending legal-status Critical Current

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The embodiments of this specification disclose a radar video all-in-one machine, comprising: a camera module, a radar module, a core processor, a cover plate, and a bottom box, where the bottom box comprises a base and peripheral side faces. The camera module, the radar module, and the core processor are located in the space formed by the cover plate and the bottom box. The camera module comprises a plurality of cameras with different shooting distances and is used for collecting traffic video stream data within those different shooting distances; the radar module is used for collecting traffic point cloud data; and the core processor is used for fusing the traffic video stream data with the traffic point cloud data.

Description

Radar video all-in-one
Technical Field
The application relates to the technical field of monitoring equipment, and in particular to a radar video all-in-one machine.
Background
A radar video all-in-one machine integrates radar speed measurement and video snapshot into a single device, so that targets can be accurately captured and measured during video monitoring. The sensor of a traditional radar video all-in-one machine consists of one millimeter-wave radar and one camera, which leaves a large perception blind area at the near end of the sensing direction.
In the prior art, the radar video all-in-one machine at the previous point position is generally required to cover the perception blind area of the machine at the current point position, so the camera angle of that machine must be adjusted. However, the adjusted angle may still leave the blind area insufficiently covered, which degrades the overall perception effect.
Therefore, how to avoid a near-end perception blind area has become an urgent technical problem.
Disclosure of Invention
The embodiments of this specification provide a radar video all-in-one machine to solve the problem of the large near-end perception blind area in existing devices.
To solve the above technical problem, the embodiments of this specification are implemented as follows:
The embodiments of this specification provide a radar video all-in-one machine, comprising: a camera module, a radar module, a core processor, a cover plate, and a bottom box, where the bottom box comprises a base and peripheral side faces;
the camera module, the radar module and the core processor are positioned in a space formed by the cover plate and the bottom box;
the camera module comprises a plurality of cameras with different shooting distances and is used for collecting traffic video stream data in the different shooting distances;
the radar module is used for collecting traffic point cloud data;
and the core processor is used for carrying out fusion processing on the traffic video stream data and the traffic point cloud data.
Optionally, the cameras with different shooting distances comprise a near-view camera, a far-view camera and a fisheye camera.
Optionally, the fisheye camera is located on the base, and the near view camera and the far view camera are located on the same side of the peripheral side; the fish-eye camera is matched with the first through hole on the base, the near-view camera is matched with the first through hole on the peripheral side face, and the far-view camera is matched with the second through hole on the peripheral side face.
Optionally, the radar module and the near view camera and the far view camera are located on the same side face of the peripheral side face.
Optionally, the device further comprises an external connector, and the external connector is electrically connected with the core processor and used for outputting the data after fusion processing.
Optionally, the system further comprises a GPS timing module, wherein the GPS timing module is electrically connected with the camera module and the radar module, and is used for synchronizing the traffic video stream data with the time stamp of the traffic point cloud data to obtain synchronized traffic data.
Optionally, the core processor is connected with the cover plate, and is located in different horizontal planes with the camera module and the radar module.
Optionally, heat dissipation teeth are arranged on the outer surface of the cover plate.
The embodiment of the specification also provides a traffic data processing method, which is applied to the radar video all-in-one machine and comprises the following steps: acquiring traffic video stream data and traffic point cloud data in different shooting distances; and carrying out fusion processing on the traffic video stream data and the traffic point cloud data, and outputting the fused data.
One embodiment of the present specification achieves the following advantageous effects: according to the embodiment, the plurality of cameras with different shooting distances are arranged, so that traffic data in different sensing ranges can be collected, a near-end sensing blind area is avoided, the sensing precision is improved, and the sensing range is increased.
Drawings
In order to more clearly illustrate the embodiments of the present description or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments described in the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a radar video all-in-one machine according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a camera module and a radar module according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of an application of a radar video all-in-one machine according to an embodiment of the present disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of one or more embodiments of the present specification more clear, the technical solutions of one or more embodiments of the present specification will be clearly and completely described below in connection with specific embodiments of the present specification and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present specification. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without undue burden, are intended to be within the scope of one or more embodiments herein.
The following describes in detail the technical solutions provided by the embodiments of the present specification with reference to the accompanying drawings.
The road side perception system is deployed on both sides of a road and detects vehicles, non-motor vehicles, pedestrians, and other objects passing through its coverage area by means of perception sensors such as cameras and millimeter-wave radars. The radar video all-in-one machine is a new generation of intelligent sensor designed specifically for vehicle-road cooperation, integrating cameras, radar, and a high-performance processor into a single traffic sensor. The sensor of a traditional radar video all-in-one machine consists of one radar and one camera, with a perception range of 25-150 m, leaving a large perception blind area at the near end of the sensing direction.
In the prior art, the radar video all-in-one machine at the previous point position is generally required to cover the perception blind area of the machine at the current point position, which may leave the blind area insufficiently covered and degrade the overall perception effect.
In order to solve the drawbacks of the prior art, the present solution provides the following embodiments:
fig. 1 is a schematic structural diagram of a radar video all-in-one machine according to an embodiment of the present disclosure.
In this embodiment of the present disclosure, as shown in fig. 1, the radar video all-in-one machine may include: the camera module, the radar module, the core processor 2, the cover plate 1 and the bottom box 6, wherein the bottom box 6 comprises a base and peripheral side surfaces;
the camera module, the radar module and the core processor 2 are positioned in a space formed by the cover plate 1 and the bottom box 6;
the camera module comprises a plurality of cameras with different shooting distances and is used for collecting traffic video stream data in the different shooting distances;
the radar module is used for collecting traffic point cloud data;
the core processor 2 is configured to perform fusion processing on the traffic video stream data and the traffic point cloud data.
In the embodiment of the present disclosure, the cover plate 1 may be formed by a panel and a frame connected to the periphery of the panel, and the bottom box 6 may be formed by a base and peripheral side faces connected to the periphery of the base, where the cover plate 1 and the bottom box 6 cooperate to form a closed space. The cover plate 1 and the bottom box 6 may be made of aluminum alloy, which facilitates heat dissipation of the radar video all-in-one machine.
The camera module can acquire video data of the target object on the traffic road, and the radar module can sense the position, the speed, the course angle and the like of the target object on the traffic road.
Fusion processing can be understood as follows: the traffic video stream data collected by the camera module and the point cloud data stream collected by the radar module are fed into the core processor of the radar video all-in-one machine through a Mobile Industry Processor Interface (MIPI) and a Serial Peripheral Interface (SPI), respectively. The core processor extracts artificial intelligence (AI) targets directly from the traffic video stream, then projects the video targets into the radar coordinate system through a built-in coordinate mapping system, and finally performs fusion tracking of the video targets and the radar targets, achieving real-time vectorization and tracking of all targets in view.
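The projection-then-association step described above can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes a precomputed 3x3 ground-plane homography for the coordinate mapping and uses a simple gated nearest-neighbour association; the function names and the gate value are hypothetical.

```python
import numpy as np

def project_to_radar(pixel_uv, homography):
    """Map an image-plane point into the radar ground plane using a
    precomputed 3x3 homography (assumed calibrated offline)."""
    u, v = pixel_uv
    x, y, w = homography @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])

def associate(video_targets_uv, radar_targets_xy, homography, gate=2.5):
    """Greedily pair each projected video target with the closest radar
    target, accepting only pairs within `gate` metres."""
    pairs = []
    used = set()
    for i, uv in enumerate(video_targets_uv):
        xy = project_to_radar(uv, homography)
        dists = [np.linalg.norm(xy - r) if j not in used else np.inf
                 for j, r in enumerate(radar_targets_xy)]
        j = int(np.argmin(dists))
        if dists[j] <= gate:
            pairs.append((i, j))
            used.add(j)
    return pairs
```

A production fusion tracker would typically replace the greedy matching with global assignment (e.g. Hungarian matching) and feed the paired detections into per-target filters, but the projection-and-gating structure is the same.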
In practical application, to prevent dust, moisture, and the like from entering the closed space, a sealing strip can be arranged between the cover plate 1 and the bottom box 6, and the two can be fastened and sealed with screws.
According to the embodiment, the plurality of cameras with different shooting distances are arranged, so that traffic data in different sensing ranges can be acquired, and a near-end sensing blind area is avoided.
Fig. 2 is a schematic structural diagram of a camera module and a radar module in the embodiment of the present disclosure.
In this embodiment of the present disclosure, as shown in fig. 2, the device may be provided with cameras of three different shooting distances, namely a near-view camera 3, a far-view camera 7, and a fisheye camera 5, together with a radar 8. The near-view camera 3 and the far-view camera 7 are located on the same side of the peripheral side faces and fit the first and second through holes on that side face, respectively; the fisheye camera 5 is located on the base and fits the first through hole on the base. There may be more than one of each camera, and more than one radar; this is not limited here.
Each camera can be divided into a lens and a main body. Specifically, the lenses of the near-view camera 3 and the far-view camera 7 can be located on the same side of the peripheral side faces, while their main bodies can be located on the base; the lens of the fisheye camera 5 can pass through the through hole in the base and point straight down, perpendicular to the ground, so as to sense target objects below the point position, with its main body likewise on the base.
The shooting range of the fisheye camera can extend from a first distance to a second distance, that of the near-view camera from a third distance to a fourth distance, and that of the far-view camera from a fifth distance to a sixth distance, where the second distance is greater than the first, the fourth greater than the third, and the sixth greater than the fifth.
To avoid gaps in the photographing range, the shooting ranges of the fisheye camera and the near-view camera may overlap, as may those of the near-view camera and the far-view camera; that is, the third distance may be less than or equal to the second distance, and the fifth distance less than or equal to the fourth distance.
In practical application, the radar video all-in-one machine can be mounted on a roadside pole. The radar can be a millimeter-wave radar and mainly senses the position, speed, heading angle, and so on of target objects. The near-view camera mainly senses targets at short range from the point position, with an acquisition range of about 20-150 m; the far-view camera mainly senses targets at long range, with an acquisition range of about 100-250 m; the fisheye camera mainly senses targets below the point-position pole, with an acquisition range of about 35 m on each side below the device. Taking the point position of the machine as the origin, with negative values on the left and positive values on the right, the camera perception range of the machine can span -35 m to 250 m.
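The gap-free-coverage condition (third distance not exceeding the second, fifth not exceeding the fourth) amounts to checking that the three sensing intervals merge into one continuous range. A small sketch, using the nominal ranges quoted above as hypothetical exact values:

```python
def covered_without_gaps(intervals):
    """Merge sorted sensing intervals (metres from the pole, negative =
    behind the pole). Return the overall (start, end) range if there is
    no gap between consecutive intervals, else None."""
    intervals = sorted(intervals)
    start, end = intervals[0]
    for lo, hi in intervals[1:]:
        if lo > end:          # gap between consecutive coverage zones
            return None
        end = max(end, hi)
    return (start, end)

# Fisheye about -35..35 m, near camera 20..150 m, far camera 100..250 m.
coverage = covered_without_gaps([(-35, 35), (20, 150), (100, 250)])
# → (-35, 250), matching the -35 m to 250 m span stated in the description.
```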
By providing cameras of three shooting distances, traffic data within different sensing ranges can be collected and a near-end perception blind area avoided; moreover, because the near-view and far-view cameras collect short-range and long-range traffic video streams respectively, the perception precision can be improved and the perception range enlarged.
Optionally, to facilitate fusion of the traffic video stream data and the point cloud data stream, the near-view camera 3, the far-view camera 7, and the radar 8 are located on the same side of the peripheral side faces. So that they do not occlude one another during acquisition, the near-view camera 3, the radar 8, and the far-view camera 7 can be installed in sequence at certain intervals, the installation order not being limited here. The near-view camera 3, far-view camera 7, fisheye camera 5, and radar 8 can all be fixed to the bottom box with screws.
Optionally, to improve the accuracy of fusing the video data with the radar data, the radar module is located on the same side face of the peripheral side faces as the near-view camera 3 and the far-view camera 7.
Optionally, the radar video all-in-one machine may further include an external connector 4, electrically connected to the core processor 2 and used for outputting the fused data. To avoid interfering with data acquisition by the cameras and radar, the external connector should not be located on the same side face of the peripheral side faces as the camera module and the radar module.
Optionally, the radar video all-in-one machine may further include a GPS timing module, electrically connected to the camera module and the radar module and used for synchronizing the timestamps of the traffic video stream data and the traffic point cloud data to obtain synchronized traffic data. The GPS timing module may be installed on the cover plate or the bottom box as required.
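Once both sensors share a GPS-disciplined clock, synchronization reduces to pairing each radar frame with the closest-in-time video frame. A minimal sketch of nearest-timestamp matching; the function name and the 50 ms tolerance are illustrative assumptions, not from the patent:

```python
import bisect

def sync_frames(video_ts, radar_ts, tol=0.05):
    """Pair each radar point-cloud timestamp with the closest video-frame
    timestamp (both in seconds on the shared GPS clock), keeping only
    pairs within `tol` seconds. video_ts must be sorted ascending."""
    if not video_ts:
        return []
    pairs = []
    for rt in radar_ts:
        i = bisect.bisect_left(video_ts, rt)
        # The nearest frame is either just before or just after rt.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(video_ts)]
        j = min(candidates, key=lambda k: abs(video_ts[k] - rt))
        if abs(video_ts[j] - rt) <= tol:
            pairs.append((j, rt))
    return pairs
```

For example, with 25 fps video frames at 0.00, 0.04, 0.08 s, a radar frame at 0.05 s pairs with the frame at 0.04 s, while a radar frame at 0.20 s (beyond the tolerance of every frame) is dropped.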
Optionally, for ease of assembly and to save installation space, the core processor 2 can be connected to the cover plate 1 and located on a different horizontal plane from the camera module and the radar module. The core processor 2 sits on a core processing board, which can be fixed to the inner side of the cover plate with screws. The core processor 2 may perform fusion processing on the synchronized traffic data.
Optionally, to improve the heat dissipation of the radar video all-in-one machine, heat dissipation teeth can be arranged on the outer surface of the cover plate; the teeth can be a plurality of upward-protruding tooth-shaped elements that increase the heat dissipation area of the device surface.
The embodiments of this specification also provide a traffic data processing method, applied to the radar video all-in-one machine, comprising the following steps: acquiring traffic video stream data and traffic point cloud data within different shooting distances; and fusing the traffic video stream data with the traffic point cloud data and outputting the fused data.
Fig. 3 is a schematic diagram of an application of a radar video all-in-one machine according to an embodiment of the present disclosure.
In this embodiment of the present disclosure, as shown in fig. 3, the fused data output by the radar video all-in-one machine can be received by an edge computing unit through the external connector, and the edge computing unit can analyze all traffic data on the road based on the fused data. The traffic data may be transmitted to a cloud control platform to assist it in the automatic driving control of connected vehicles, and may also be transmitted to a road side unit (Road Side Unit, RSU), which interacts with the on-board unit (On Board Unit, OBU) based on the traffic data.
Radar video all-in-one machines can be deployed on both sides of a road to detect passing vehicles, non-motor vehicles, pedestrians, and other objects within their coverage area. Through deep fusion and analysis by the edge computing unit, the passing state of traffic participants and the road traffic environment are accurately identified, effectively supporting the safe driving of automated vehicles. The machine can further combine vehicle states and other traffic facility information collected by RSU equipment with cloud information obtained over the network to achieve cooperative perception among the vehicle, the road side, and the cloud, improving detection and recognition precision.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.

Claims (9)

1. A radar video all-in-one machine, comprising: a camera module, a radar module, a core processor, a cover plate, and a bottom box, wherein the bottom box comprises a base and peripheral side faces;
the camera module, the radar module and the core processor are positioned in a space formed by the cover plate and the bottom box;
the camera module comprises a plurality of cameras with different shooting distances and is used for collecting traffic video stream data in the different shooting distances;
the radar module is used for collecting traffic point cloud data;
and the core processor is used for carrying out fusion processing on the traffic video stream data and the traffic point cloud data.
2. The radar video all-in-one machine of claim 1, wherein the plurality of cameras of different shooting distances includes a near-view camera, a far-view camera, and a fisheye camera.
3. The radar video all-in-one machine of claim 2, wherein the fisheye camera is located on the base, and the near view camera and the far view camera are located on a same side of the peripheral side; the fish-eye camera is matched with the first through hole on the base, the near-view camera is matched with the first through hole on the peripheral side face, and the far-view camera is matched with the second through hole on the peripheral side face.
4. A radar video all-in-one machine as in claim 3 wherein the radar module is located on the same side of the peripheral side as the near and far cameras.
5. The radar video all-in-one machine of claim 1, further comprising an external connector electrically connected to the core processor for outputting fused data.
6. The radar video all-in-one machine according to claim 1, further comprising a GPS timing module electrically connected to the camera module and the radar module for synchronizing the time stamps of the traffic video stream data and the traffic point cloud data to obtain synchronized traffic data.
7. The radar video all-in-one machine of claim 1, wherein the core processor is coupled to the cover plate at a different level than the camera module and the radar module.
8. The radar video all-in-one machine of claim 1, wherein the outer surface of the cover plate is provided with heat dissipating teeth.
9. A traffic data processing method applied to the radar video all-in-one machine of claim 1, comprising:
acquiring traffic video stream data and traffic point cloud data in different shooting distances;
and carrying out fusion processing on the traffic video stream data and the traffic point cloud data, and outputting the fused data.
CN202310927510.2A 2023-07-26 2023-07-26 Radar video all-in-one Pending CN117201728A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310927510.2A CN117201728A (en) 2023-07-26 2023-07-26 Radar video all-in-one

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310927510.2A CN117201728A (en) 2023-07-26 2023-07-26 Radar video all-in-one

Publications (1)

Publication Number Publication Date
CN117201728A true CN117201728A (en) 2023-12-08

Family

ID=89004218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310927510.2A Pending CN117201728A (en) 2023-07-26 2023-07-26 Radar video all-in-one

Country Status (1)

Country Link
CN (1) CN117201728A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117420546A (en) * 2023-12-14 2024-01-19 杭州海康威视数字技术股份有限公司 Radar video all-in-one machine
CN117420546B (en) * 2023-12-14 2024-03-29 杭州海康威视数字技术股份有限公司 Radar video all-in-one machine

Similar Documents

Publication Publication Date Title
CN111368706B (en) Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision
US11915470B2 (en) Target detection method based on fusion of vision, lidar, and millimeter wave radar
EP3792660B1 (en) Method, apparatus and system for measuring distance
CN112558023B (en) Calibration method and device of sensor
Semertzidis et al. Video sensor network for real-time traffic monitoring and surveillance
CN111199578B (en) Unmanned aerial vehicle three-dimensional environment modeling method based on vision-assisted laser radar
CN112017431A (en) Active vehicle continuous tracking and positioning system and method based on multi-data fusion
US20180293450A1 (en) Object detection apparatus
CN109747530A (en) A kind of dual camera and millimeter wave merge automobile sensory perceptual system
CN117201728A (en) Radar video all-in-one
CN113850102B (en) Vehicle-mounted vision detection method and system based on millimeter wave radar assistance
CN112740225B (en) Method and device for determining road surface elements
WO2020113358A1 (en) Systems and methods for synchronizing vehicle sensors and devices
CN112687113A (en) Roadside information perception equipment
CN114002669A (en) Road target detection system based on radar and video fusion perception
CN112179362A (en) High-precision map data acquisition system and acquisition method
CN115690713A (en) Binocular camera-based radar-vision fusion event detection method
CN210518410U (en) Automobile sensor system based on time synchronization and automatic driving vehicle
CN209928281U (en) Automatic pilot
CN111612833A (en) Real-time detection method for height of running vehicle
CN115116034A (en) Method, device and system for detecting pedestrians at night
CN115950416A (en) High-altitude platform multi-view laser vision inertial fusion positioning and mapping device and method
CN114998436A (en) Object labeling method and device, electronic equipment and storage medium
CN212301884U (en) Peripheral environment sensing device of vehicle
CN215182441U (en) Roadside information perception equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination